The Ethics of Metrics: An Interview with Deborah Kennedy


In this interview, Deborah Kennedy talks about her upcoming presentation at STARWEST 2014, the importance of using data wisely, why executive messaging matters more than people give it credit for, and the psychology behind the small things when it comes to metrics.

Cameron Philipp-Edmonds: Today, we have Deborah Kennedy. She'll be speaking at STARWEST 2014, which runs October 12 through October 17, and she is giving a presentation titled "Testers, Use Metrics Wisely or Don't Use Them At All." Thank you for joining us today, Deborah.

Deborah Kennedy: Thank you for having me.

Cameron Philipp-Edmonds: To start things off, can you tell us a bit about yourself and your role at Wipro Technologies?

Deborah Kennedy: I've been in the QA industry for almost twenty years. I started out originally as a developer, but got involved in some QA work and enjoyed it so much that I moved over to that department. In the last eight to ten years, I've been specializing in process improvement and executive messaging.

Up until just recently I was working with Wipro Technologies. I've just changed over to a position with Aditi Technologies and the roles are fairly similar. I am a client engagement manager and my job is to coordinate between the client and the vendor in delivery of software testing services.

This is a strategic, long-term planning position: to help partner with the client and to bring innovative thought leadership to the project.

Cameron Philipp-Edmonds: Now, you're a firm believer that communication and metrics go hand-in-hand and for the most part, people tend to really separate numbers and language. Kind of the left-brain, right-brain deal. So, what about metrics or language causes you to feel like they're a cohesive team?

Deborah Kennedy: To me, metrics is basic communication, whether you use language or numbers. Communication boils down to a target trying to communicate an idea. I'm sorry, I mean a source communicating an idea to a target. And whichever channel you use to communicate, there are always layers of complexity to get through to get that message across. Whether your message is between you and someone on the phone, you're a teacher and you're talking with a student, or you're an artist and you put your piece in a gallery.

All of these ways you are trying to communicate an idea. And with metrics these are graphical ideas; concepts that you're trying to project to a target audience.

To me the whole crux of properly used metrics is understanding not only the message that the client needs to hear, but how you've put together the message and how that's perceived. So, you can't just throw stuff out there and say, "Okay. There's data out there." You have to be able to craft it in a way that you're responsible for that message. I think a lot of people see metrics as a deliverable. It's just something I produce. But it's not. In its purest form, it's the art of getting your idea properly translated.

Cameron Philipp-Edmonds: Okay. So, it's kind of like an extension of yourself in that project.

Deborah Kennedy: Right. It's your voice. For QA it's really important to be able to get your voice out there. I hear a lot of people frustrated. They feel like QA is sort of relegated to this back closet and treated as second-class citizens. I have heard that over and over. In fact, I was at a conference not too long ago and I overheard several different conversations all around this frustrated, limited view of what QA is, and people not having respect for that role.

I think a lot of that is that we as QA individuals don't feel like our voices are being respected, heard, or understood. I think there's a lot that we can do about that if we understand how to craft that message properly. I think we have a lot more control over that than we think we do.

Cameron Philipp-Edmonds: Now, kind of under the umbrella of the ethics of metrics, you're kind of a proponent of the idea that if you cannot use data wisely or properly, you really shouldn't use it at all. Really why is that? Because it seems like something would be better than nothing.

Deborah Kennedy: Well, it reminds me of something that our mothers told us when we were children. If you can't say something nice, don't say anything at all.

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: It's not that we only want to give good news, but we want to be careful. I think it's important that we deliver our message ethically. And when I say ethically, I don't mean in a Sunday school sort of way. I mean there are certain values that we as individuals have but also that companies have, right?

Company ethics don't necessarily line up with personal ethics. Company ethics are: we are beholden to our stakeholders. We have to make a profit. We need to minimize expense. These are the ethics that they have to live by. And if your message transgresses the points that are important to you as an individual and to your company as a group, then your behavior will be perceived as unethical.

In worst case scenario, deceptive.

Once you start behaving in a way that is perceived as deceptive or unethical, whether or not you intend to be, you seriously undermine your partnering abilities with your co-workers, with your clients, with your vendors. So, you have to build a reputation of unimpeachable ethics in order for your voice to be trusted. And I think if we don't do metrics thoughtfully, then we end up shooting ourselves in the foot as far as trying to get ourselves heard.

So, really the message that I have for this is not just creating metrics as a document but really thinking about how they're being perceived.

Cameron Philipp-Edmonds: You talked about how, when metrics are perceived, sometimes they can be deceptive. So, what are some common ways that metrics are skewed? When they give a false sense of what's really going on.

Deborah Kennedy: Well, in my presentation, I talked about three different types of deception and deception isn't necessarily a bad word. It's not intentional. The first way is through ignorance because you may not understand how it's being perceived. The second one is through bias. You may highlight something or suppress something slightly in order to make your point seem more valuable. And the third way is straight through malice when people alter the graphs.

I talk in the presentation about different ways of detecting that, but I would say across the board, in my experience, the biggest way that metrics are skewed to be presented in what could be perceived as unethical is that the wrong metric is provided. So, for example, if I'm a program manager and you're a tester and I'm saying, "How's the project going?" And you tell me, "Well, we have twelve defects." Well, that's great. That's useful information somewhere, but that doesn't answer my question.

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: So, not answering the question is the biggest skewing that I come across.

Cameron Philipp-Edmonds: Okay. Now, you've described not answering the question as a kind of skewing. Is there an easy way to tell if the metrics are skewed? I mean, is there an easy way to tell if the question is not being answered?

Deborah Kennedy: Well, I would say first off I'm always by default slightly suspicious of any graph that I didn't know was coming.

Cameron Philipp-Edmonds: Okay. All right.

Deborah Kennedy: So, what I do with my clients is we talk. For example, I have one graph that I've used recently. We call it Feature Health. Okay. So, there are different areas within a program, and we were trying to decide how to compare between the features.

How good and solid is the code? Which ones are sort of danger areas so if we know that we're getting a release in that area there's going to be a problem. So, we had several conversations with the client that said, "Okay. How do you define health? I know how I would, but how do you define health?"

And we had many conversations about this and we said, "Okay. Well, we can use pass rate. We can use customer feedback. We can use defect notes." And eventually, we got down to an algorithm that said, "This is what we think represents health."

So, we created this graph, and with this bar graph we say, here's the representation of health. And then we had a decision about, "Okay. Why do we care about the health? Is there something that we're going to do if it's not healthy?"

So, we created a trigger lane: if the health dropped below that line, some trigger happened. It could be that we scheduled more time in the release for testing if those areas were impacted, or we got more resources because we predicted we were going to find more defects in these low-health areas.
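The idea described here, an agreed-upon health score per feature plus a trigger line that prompts action, can be sketched in a few lines of Python. The inputs, weights, and threshold below are purely illustrative assumptions, not the actual algorithm Kennedy's team negotiated with their client:

```python
# Hypothetical sketch of a "Feature Health" metric with a trigger lane.
# The signals (pass rate, customer score, defect density), the weights,
# and the TRIGGER threshold are illustrative assumptions only.

def feature_health(pass_rate, customer_score, defect_density,
                   weights=(0.5, 0.3, 0.2)):
    """Combine agreed-upon signals into a single 0-100 health score.

    pass_rate and customer_score are on a 0-100 scale; defect_density
    (defects per KLOC) is inverted so fewer defects means healthier.
    """
    w_pass, w_cust, w_def = weights
    defect_component = max(0.0, 100.0 - 10.0 * defect_density)
    return (w_pass * pass_rate
            + w_cust * customer_score
            + w_def * defect_component)

TRIGGER = 70.0  # the "trigger lane": below this, some action happens

def features_below_trigger(features):
    """Return the names of features whose health falls below the line."""
    return [name for name, metrics in features.items()
            if feature_health(*metrics) < TRIGGER]

features = {
    "checkout": (95.0, 90.0, 1.0),   # solid area
    "search":   (60.0, 50.0, 8.0),   # danger area
}
print(features_below_trigger(features))  # → ['search']
```

The point of the sketch is the process, not the formula: the weights and threshold only have meaning because both sides agreed on them in advance, which is what keeps the resulting graph from being a surprise.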

So, at the end of the release, when that report got handed to the client, they were not at all surprised. They knew what it was because they had been part of building it; it was an ongoing, interactive communication between us. Whereas if I had just walked in and said, "Here is the feature health," then the questions would have been, "Where are you getting this data? What's your algorithm?" So, I'm a big proponent of collaboration on metrics. Never surprising someone.

Now, I will say there's one caveat. Every now and again I'll do some data analysis, sort of go through there and see if anything interesting pops up, and I'll go to my clients and say, "This is what I was doing. This interesting thing popped up. I wonder if that means something."

And we'll talk about how I pulled it, what it means, and whether it's something we need to track. But for the most part, if somebody just walks up and says, "Oh. Look what I have."

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: If it shows up in the report, I'm...

Cameron Philipp-Edmonds: Suspicious.

Deborah Kennedy: And the suspicion starts.

Cameron Philipp-Edmonds: Okay. So, once that kind of suspicion creeps into your mind and once that kind of theory or thought of deception is happening, is there a way to salvage that? Is there a way to fix that or is there just...

Deborah Kennedy: Yeah. A lot of times, what I've seen is cases where someone put together a graph just because they think it needs to be there. My default example is: here are all the defects by priority and by severity. Okay, that's good. But my first question on all of these is, "What question are you answering?"

I challenge my staff on that all the time. What question are you answering? And if your answer is, "Well, I don't really know." Then that's not a useful metric. If you're not answering someone's question, you're wasting people's time.

Cameron Philipp-Edmonds: Okay.

Deborah Kennedy: So, you take that and you step back and you say, "What questions are being answered and how can I present this data in a way that answers that question?"

So, I had a really interesting conversation with someone who's trying to do some executive messaging. And when I say executive messaging, I mean the communications that pull the metrics up to a slightly higher level, so that they can be passed off to executives and they get the gist of what's going on.

And again, it's really about answering the right questions. So, I started this session with my clients: what questions are your executives asking? When they said, "Well, I don't really know," I said, "Okay. What do they talk about the most?"

Right? Are they talking about how we need to be faster, or we need to make more money, or we need to add a few more people? What is it that they're constantly upset about? That's the question they're really asking me. It's not really in question form, but that's the question.

So, if their concern is about speed, then we need to be building metrics and graphs to explain to them what our speed is, whether it's getting better or worse, and what builds into that speed. Those are the questions they really want answers to.

Cameron Philipp-Edmonds: Okay. Now, moving away from your presentation here a little bit. In the last twenty years, you've spent time consulting and streamlining QA departments, but you've also worked to bring awareness to IT's role in executive messaging. Can you tell us exactly what executive messaging is, and why does it so often go unnoticed?

Deborah Kennedy: Well, it's interesting how many opportunities we really do get to get our voices heard and you have to have solid data. Your voice has to have some meaning and you have to build a reputation of useful data.

So, I work with my development counterparts, my program management counterparts, my business analysis counterparts, and in sort of an agile, whole-team approach, we look at what it is that the executives want and how we as a group can bubble up all of this amazing, overwhelming information into something that's useful for them.

My rule of thumb is, when I hand something, a piece of paper or a slide, to an executive like a C-level.

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: They should be able to look at it for thirty seconds. And again, keep in mind, this is not something that they're surprised about. We've talked about it, and when we pitched, we said, this is what we're going to present to you. Right. We explained it before they actually saw the data.

So, they can look at it for thirty seconds and they know right off: good, bad, or maybe somewhere in the middle. Now, if they want to take more time to dig down, they can, but the key is they don't have to.

Cameron Philipp-Edmonds: Have to. Right. Okay.

Deborah Kennedy: Now, the thing is, we try to be as consistent as possible. So, I had two managers with completely different projects, but we revamped the status reports for them and they were very similar in style.

And one manager could be walking past the other manager's desk and see it lying there, not taking the time to read it or anything, but just glancing at it would know, "Oh. They're doing pretty good." Or, "There's a problem over there."

Just because there's so much consistency, understanding, and partnership in how we're building that message.

Cameron Philipp-Edmonds: Okay. And you've also collected a wealth of information about the pay-off of understanding the impact of the small details. What exactly are the small details?

Deborah Kennedy: This is really interesting, because years ago I read an article by Stephen Kosslyn. At the time, it was 1989, he was the Director of Social Sciences at Harvard University.

He wrote a paper called Understanding Charts and Graphs, and it is one of the most detailed papers on details that I have ever read. In it he talks about graphs. Just think about a standard bar chart. He breaks graphs down into three areas: the syntactic, the semantic, and the pragmatic. Or what I call the see, know, and understand.

Cameron Philipp-Edmonds: Okay.

Deborah Kennedy: So, he says all right. When you pick up a piece of paper that has a chart on it and you look at it and you look away, you saw it on the see level, on the syntactic level. You read it's got colors. It's the same thing that happens to you if you look at your watch to see what time it is and you look away and you think, "Okay. I still don't know what time it is. How did that happen?" Right?

Your brain recognizes the watch hands and you know your numbers, but it didn't get any further in. So, that's it. That's the syntactic level.

Cameron Philipp-Edmonds: Okay.

Deborah Kennedy: Semantic level is you understand what that means. It's three o'clock. You got it. Right? And the pragmatic level is it's three o'clock and I am half an hour late to go to this meeting. You understand that. So, he breaks graphs down into these three different areas.

The graphs themselves are made up of four different functions, and it's really super nerdy and geeky to get into. I could talk about it forever. In fact, one of the original presentations I was thinking about doing was The Cognitive Psychology of Metrics Presentation, where I talk about his work in neuroscience and cognition.

But he breaks this down further, so it's more of a matrix. So, we're looking at the pragmatic level of the label section of the graph, and there's a series of questions that he walks through. The point is that you really are doing yourself a disservice if you don't understand how your audience is taking in your information and all of the barriers between you and your target. So, here's a simple explanation.

Say you and I are having a conversation and there are barriers between us. The most obvious one is that we're on Skype.

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: We're not face-to-face. So, you don't get to see all of my wonderful hand gestures, but there could be others between us. My accent might be distracting to you or I may mispronounce a word that is a particular sore point for you.

And there may be ways that I could know what some of those barriers are, but there are some I will never know. Right? Maybe I look like some ex-girlfriend and you've got that in the back of your head, or maybe you completely disagree with what I'm talking about and that's in your head. So, as a speaker, my job is to remove as many of those barriers as humanly possible. And understanding cognition, understanding how human beings take in information, helps me craft my metrics in a way that removes some of those initial barriers.

So, quick example. A staff member handed me a graph one time. They wanted to turn it over to the client, and I immediately handed it back because they had just randomly picked the colors: green and red were on there, and they didn't mean good and bad; they were just random.

Cameron Philipp-Edmonds: Right.

Deborah Kennedy: And I said, "You can't do that." Because human cognition says bad and good. They assign meaning to those colors so, you have to go back again. So, that's one simple example, but it's a fascinating article if you ever want to look into it. There's a PDF on the Internet and he really gets into the details of items like that.

Cameron Philipp-Edmonds: Okay. So, those are the small details, and there's really so much more beneath what's actually being presented?

Deborah Kennedy: An amazing amount. Once you get into it, you understand just how the simple little things can throw off your message. It's amazing we ever even manage to talk to each other.

Cameron Philipp-Edmonds: All right. And to wrap things up here, we'll get back to your presentation. Is there one thing you'd like attendees to take away from your presentation?

Deborah Kennedy: I would say something outside of metrics, because a lot of what I do is metrics and process improvement. I would say, in general, we as a QA industry have control over where we spend our time. I, as an individual, have control over where I spend my time. And there is always, always a better way to do anything that we're doing. I mean that in personal life or in work life.

So, I would encourage people to find that and not to be overwhelmed with the vast amount of information that is out there but find one thing, just one part of your job that you find somewhat interesting and learn more about it. Because at the end of the day, I have never found anyone who said, "You know, looking back, I wish I had learned less."

Cameron Philipp-Edmonds: That's great advice. And now, to wrap us up with a final question: is there anything you'd like to say to the delegates of STARWEST before they come to the conference and, of course, before they attend your wonderful presentation?

Deborah Kennedy: Well, I appreciate very much the opportunity to be part of this. I'm really quite excited. Hopefully, there's enough feedback that I can perfect this message and move on.

This is an area that I'm intensely interested in and I look forward to all of the conversations that come after with some sort of nerdy enjoyment of this topic. So, I'm looking forward to it. I'm looking forward to meeting a lot of new people and I hope that my presentation is helpful.

Cameron Philipp-Edmonds: Okay. Well, that wraps up our interview. Once again, this is Deborah Kennedy and she is speaking at STARWEST 2014 and her presentation is titled, "Testers, Use Metrics Wisely or Don't Use Them At All."

Thank you so much for taking the time to talk with us today, Deborah.

Deborah Kennedy: Thank you for having me.



A Client Engagement Manager with Aditi Technologies, Deborah Kennedy leads onshore delivery for a large financial services client. With almost twenty years of experience in core development and testing, Deborah spent much of that time working as a process improvement consultant, specializing in revitalizing and streamlining QA departments as well as bringing more awareness to IT’s role in executive messaging. Deborah has collected a wealth of information about how team members perceive information, the power of metrics in the right hands, and the payoff of understanding the impact of “the small details.”
