#069: The Biases of the Analyst

August 15, 2017

Are you biased? Either you answered, “Yes,” or you’re in denial. Or you’re an AI, in which case you should just go and start your own podcast instead of listening to this one. UNLESS your prediction algorithms told you that this would be the episode where we would finally announce the addition of a third co-host, and you need to collect that data point (and, damn, you’re good, BTW). On this episode, though, our THREE (count ‘em!) co-hosts dive into different types of biases that analysts (should) grapple with, how they spot them, and what they do to take advantage of them (or mitigate them, as appropriate).

Schtuff Referenced in this Episode

Episode Transcript

[music]

00:04 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe, and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour and their website, analyticshour.io. And now, the Digital Analytics Power Hour.

[music]

00:27 Michael Helbling: Hi, everyone. Welcome to the Digital Analytics Power Hour. This is episode 69. It is said there are thousands of ways to be wrong, and only one way to be right, and we love to talk about those thousand ways. One of the biggest pitfalls for any person using data, including the noble analyst, is falling victim to bias, and I don’t mean cutting cloth diagonally. And I already know how this episode will go, and what happens from here will just prove me right. That’s right, we’re talking about bias. Of course, joining me for this ride are my co-hosts, Tim Wilson, Senior Partner at Analytics Demystified…

01:11 Tim Wilson: I am biased, and I admit it, and I acknowledge that.

01:14 MH: And I am super pleased to say, our newest co-host is also here with us: Moe Kiss, Product Analyst at The Iconic. Welcome, Moe, officially.

01:28 Moe Kiss: Hey, guys. Thanks.

01:30 TW: She was just a guest co-host last time.

01:32 MH: She was just a guest host before, but now…

01:34 TW: She got a better microphone.

01:37 MK: They were testing me.

01:38 TW: Until she upgraded her microphone, guest only.

[chuckle]

01:41 MH: Yeah, I mean…

01:42 TW: We have a bias towards the quality of our audio.

01:45 MH: Ooh, I don’t think we should say things like that.

[laughter]

01:49 MH: As always, I am Michael Helbling. I lead the analytics practice at Search Discovery. Let’s talk about bias. My personal favorite, because I was recently at a casino playing blackjack, was the gambler’s fallacy. What’s your favorite bias?

[laughter]

02:09 TW: I’ll let our new co-host go first.

02:12 MH: Oh, well. Look at all this hospitality going on here.

02:14 MK: Oh God, look at the pressure.

02:17 TW: Then I’ll talk all over her, and won’t shut up, and won’t make any sense whatsoever.

02:21 MH: All right, so we’re back to normal.

02:22 MK: Yeah, pretty much, pretty much. [laughter] So for me, the one that I probably find the most interesting is… Well, I’m not gonna lie, they’re all interesting in different ways, but the concept of anchoring to me, just…

02:37 TW: That was gonna be mine!

02:38 MK: Aww, too late. I beat you to it.

[laughter]

02:41 TW: I’ll never let you go first again.

[laughter]

02:45 MK: And for me, I think the reason that I like this one so much is because it also gets at that notion of bartering, where the first price that’s said is how you determine the value of an item. But of course, it’s really relevant for digital analytics as well. And I’m sure Tim wants to chime in to talk about KPIs once again.

[laughter]

03:08 TW: Well, [chuckle] you’re just gonna let me… Well, to be fair, Moe, since you’ve actually spoken about this, we get the benefit of kind of looking at one of your decks. And that definitely, on the anchoring… I realized that whole thing: when it comes to setting a KPI target, somebody has to throw out the first number. And I don’t think I’d ever really thought about that as being… I mean, I’m familiar with anchoring, but I don’t think I’d ever really thought about, yeah, when you throw out that first number, you’re turning on the anchoring bias immediately. And that means that it’s not just throw out any number, it’s like, you better decide: are you throwing out a conservative number, or are you throwing out an aggressive number? Because you’re going to influence the final target if you’re the first person to throw it out. I think. I mean, is that…

04:03 MK: Yeah, I’d agree with that. Yeah, absolutely.

04:05 MH: Yeah, what’s so crazy about it, is it’s so natural. In fact, bias just generally seems to be just part of how humans perceive the world, that it’s very hard to step outside of it. So even that one, anchoring bias, is sort of like, “Yeah, we just start biased, and then where do we go from there?”

04:27 MK: Yeah, but I think the thing is… I once said to someone like, “Oh, well this is gonna get rid of your bias.” And then it was like, “Whoa! That’s never gonna happen.” I was very quickly corrected. And it’s true, you’re never gonna get rid of it, but being aware of bias is how you help mitigate for it. So I think it’s about starting with that initial point of like, “Okay, yeah. I do have these biases, and how do I try and reduce the impact of them as much as possible on my analysis, on my KPI settings?” Whatever the discussion is that you’re having?

05:03 TW: And biases are there partly because we have heuristics, because we’re just inclined to take shortcuts. I think that was one of the other points you had made in your presentation, that we have them because we can’t process everything starting from zero. And so therefore we wind up with biases, is that right?

05:23 MK: Yeah. So basically the concept of heuristics is that your brain tries to help you; it doesn’t wanna overload things. So it has these shortcuts, which many biases fall under, to give you a shortcut to kind of take the load off your brain. It’s to help you make decisions fast with less information. So it’s something that you’re naturally wired to do, but like I said, it’s about awareness, right? The more you talk about it, the more that you call it out in meetings, the more your team is like… Yeah, another bias, like last data point: this one can make you or break you, right? Because if you have a colleague that leans towards this particular heuristic, which is that they tend to prefer the last piece of information that they heard, if you’re the last person to have a meeting with them, that’s great, because they’re gonna listen to you. But then if someone goes and chats to them in the kitchen, there goes your strategy that you thought you had locked in. So they can really make or break what you’re working on.

06:26 MH: We’re working our way through that in terms of who’s our president right now, so it’s a…

[laughter]

06:34 MH: We’ll let you know how that all works out.

06:37 TW: Wow. [laughter] So does the last data point… See, you’re gonna wind up having me saying “dah-ta” instead of “day-ta.” This is…

06:47 MK: Yeah, for the win!

06:48 TW: These are the things that are gonna happen. It is a cool word.

06:51 MH: I’m not gonna say, “For the win,” though. I’m just putting that out there.

06:54 MK: I am.

06:55 TW: But does…

06:56 MH: FTW.

06:57 TW: Does the last data point… [laughter] I don’t know if this is part of it, but I definitely… Or I think maybe it is. That when we look at a trend, like we look at, “Show me the weekly trend for the last 53 weeks.” And then we look at the, “Did I go up or down last week?” And we have a hard time saying, “Did I go up or down enough for it to matter?” We instantly look at, “Did it go up or down?” And it’s really hard to look back and say, “Well yeah, it went up that same amount 10 weeks ago.” But then it regressed to the mean the following week, I don’t necessarily need to freak out. Does that count as last data point bias? Or not so much?

07:40 MK: The way I think about it is more when it comes to the data, or the explanation for why something happened. And I’m not an expert on this, but this is something that I’m really passionate about and interested in. But I guess the way I would think about it is, so let’s say we’re talking about last month’s results and why they went down, right? And someone comes up with a hypothesis, like, “Oh, I think it’s such and such.” And then they come up with some data points, and then later on in the week someone else is like, “Actually, I’ve got this hypothesis that it went down because of this reason.” And then people go, “Oh yeah, it must have been that one.” Because that’s the last thing that they’ve heard. So it’s more about up-weighting the last piece of information that you hear than about… Yeah, does that make sense?

08:21 TW: Yeah, yeah. I mean I think I was overly… I understood it that way, and then it was just occurring to me that it was…

08:29 MK: The difference though to anchoring, is that with anchoring, of course, it’s about the first data point, right? But the difference is that with anchoring, that first piece of information, that first piece of data that you see will actually color how you interpret every subsequent data point, right? So it’s like you were saying, if you set the first target number that you throw out, you’re going to perceive everything after that as good or bad in relation to that first anchor point. Which is quite different to last data point, I think.

09:02 TW: Yeah.

09:02 MH: Right.

09:02 TW: I mean anchoring seems so… Where else does that come into play besides… I guess if you were in a campaign and you have… Or if you’re going to run a campaign, and you’re gonna say what is good, it totally matters which previous campaign you’re comparing it to, because that’s gonna be the one… If you had one campaign that had a 2% conversion rate, and one campaign that had a 4% conversion rate, whichever one of those you decide is your reference campaign. When you come in with one with a 3% conversion rate, that’s either gonna look bad or it’s gonna drag you to thinking that’s worse or better, because you’re coloring everything in the future.

09:50 MK: Yeah. I guess for me it’s about talking, because obviously, we’ve just finished the first half of the year and are wrapping things up, and when you do that analysis of the first half of the year, you’re all like, “Did we hit our goals? Did we not? Where do we line up against all of our different targets?” For me, I tend to focus more on… Obviously, it’s important if you hit your targets. But I also like to focus on the growth that you see, as opposed to like, “Did we hit this target? But overall, the growth that we saw was this.” And I don’t know, for me, I find that’s a way to kind of be like, this number, 4.5, whatever, is good or bad. But for me, when I talk to the team about how we’ve done, it’s more about, “We went from 2 to 4.5, and so we more than doubled, and that’s good.” Rather than… Yeah.

10:51 TW: Well, you could have worked for WebTrends back in the day, when they always said, “Don’t look at the absolute numbers, it’s all about the trends.”

10:56 MK: Well, it’s about both, right? It’s about both.

10:56 MH: Yeah, don’t focus on… It’s directional.

11:01 TW: So outside of recognizing anchoring, what do you do? Or is that it? You just have to recognize that whenever you’re judging something, saying, “What was I anchoring to?” What do you do to not overly…

11:16 MK: It can be an analyst tool for good or evil, that’s the way I like to think about it, right?

[laughter]

11:23 MK: So part of our toolbox, right, is how you convince people of things, how you bring people onboard to your recommendations. This is actually a really useful tool for an analyst, to help be like, “Okay, well, I think that this target is probably legit. The team maybe are a little bit cautious, or I think it’s too high or too low.” You can actually use this to help people set targets if you use it the right way. It can be really dangerous, too. But I think that’s true of any analyst tool.

11:51 MH: Yeah. And the idea is these things help us right up until they don’t. And that was… I mean, there’s all the books that Michael Lewis has written in the last however many years, Moneyball, and last year, The Undoing Project, where Tversky and Kahneman basically explored this, right? With psychology, they were finding out that this isn’t always gonna make you better. Sometimes, your model is wrong, and it will make you wrong. And it’s about exploring the model more holistically. And that’s kind of where my thinking on this is: Okay, so as analysts, how do we possibly guard all the potential bias entry points? And it’s so difficult. So it’s like, do you create a checklist? So, like, “Hey, I’ve done my analysis, now let me go through my bias checklist. Did I do this bias? Did I do that one?” I don’t know if that’s reasonable. I think, for me…

12:50 TW: Why not, though?

12:52 MH: Well, I mean, up to a point it’s just so many of them, and…

12:55 TW: Well, but surely if you hit four or five, do you not… You’re like, “Yeah, I’m fairly covered.”

13:00 MH: Yeah, but then some of them are hard to just check, though, how do you know you’re having anchoring bias?

13:08 MK: That’s the whole point of biases, right? Like, you don’t know that they’re happening. So that’s a part of the process of being an analyst and getting someone to QA your work, and run your assumptions past them, is to try and mitigate for things like this.

13:22 MH: Well, yeah. And I look at it as look for ways to expand your model. Or, look for ways to look at it from a different angle. And actually, other people entering the process is one of the ways I’ve seen work really well, is have somebody else come poke holes in it so they’re not looking at it from your same point of view. And that certainly has helped analysis that I’ve done, because people bring another point of view to it. So that will work. So, I don’t know, I’m just curious what you guys have used to kind of weed bias out of analysis as you’ve gone through the process of doing it?

13:57 TW: Well, I am convinced I am doomed and I have done horrible disservices, ’cause I haven’t been fully conscious about this. So, awkward of show, Moe, [14:06] ____.

[laughter]

14:07 MH: Tim Wilson announces his resignation from the analytics industry.

[laughter]

14:16 MH: Phew! Good thing you joined as a co-host, Moe!

[laughter]

14:19 TW: Because I’m not gonna have much to contribute as I’m basket weaving my…

14:28 MH: That’s right, he’s gonna find himself.

14:29 TW: My relevance is gonna be…

14:31 MH: No, but Tim, it could be that you’ve just been getting it right this entire time, with your biases perfectly lined up to the right answer, and so you’ve never stumbled across it. So that’s more likely what’s going on for you.

14:44 TW: Well, no. I mean, I think, on the confirmation bias, when you’re heading in… And this was as we were thinking about this episode: I have preached the whole, “Frame a clear hypothesis. I believe some idea, and if I’m right, this is what I’m gonna do. And here’s how I’m gonna figure it out.” That actually is sort of totally playing to a confirmation bias, because it’s forcing you to state something in a way that, if it holds true, you’d know what action you will take. And as analysts, we want to be recommending action. So, it had me thinking that it’s a danger. Now, I think you can be careful, that you’re like, “Well, no. I’m gonna articulate what I would do, and I’m still gonna be diligent about how I’m approaching it.” I’m not gonna say, “Well, I’m gonna find eight ways that that’s not the case, and I’m gonna keep fishing. I’m on a fishing expedition. Torture the data long enough, it’ll say whatever you want.” But I think that is, everything we’re saying: you need to have an idea first, and then go use the data to validate it? Well, that is sort of setting up confirmation bias as being a risk, right?

15:58 MK: Well, one of the ways around that is… And I know that I’m always harping on about analysis of competing hypotheses, but the truth is that same technique can be used to the method that you advocate for, Tim, which is that instead of focusing on evidence to support what you think, you focus on evidence that disproves. And that’s really… Like, those structured analytical techniques are really about reducing the influence of bias on our analysis, and using a really scientific method, which is based on disprove, not proving. So, there are definitely ways, like, confirmation bias, we’ll go, “Okay, here.”

16:35 TW: No, no, no, we’re disproving the null hypothesis, as opposed to… Right? I mean, that’s the language of statistics, which is… It’s infuriating to a marketer asking, “Is this true or not?” And it’s like, “Well, I failed to disprove the null hypothesis.” And they’re like, “What the fuck? Could you actually cut out 27 syllables and tell me whether this is true or not?”

16:55 MH: Yeah, “So where do I spend the money?”

16:57 TW: Yeah.

[laughter]

17:00 TW: Well, yeah. But I think… I mean, that actually… A little light bulb just went on. That is like at the core of statistics: the null hypothesis is, “There is no relationship.” Or, “There is nothing going on here. What you’re looking for does not exist.” And what you’re looking to do is disprove that. So if you come in with a hypothesis that there’s something really interesting going on, but you say, “The way that I’m gonna approach it is actually with a statistical method I’ve figured out,” really, what I’m doing is trying to see if I can show… My null hypothesis is that there isn’t a relationship, there isn’t anything going on there. And let me see if I can disprove the null hypothesis, right? I think.
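
Tim’s “disprove the null hypothesis” framing can be sketched in code. This is a hypothetical example, not from the episode: the campaign numbers are invented, and a permutation test stands in for whatever method an analyst might actually use. The idea is to ask how often pure chance, with the group labels shuffled, would produce a difference at least as large as the one observed.

```python
import random

def permutation_test(a, b, n_iter=10_000, seed=42):
    """Test the null hypothesis that samples a and b come from the
    same distribution by repeatedly shuffling the group labels."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b  # under the null, the labels are arbitrary
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        diff = abs(sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b))
        if diff >= observed:
            extreme += 1
    # p-value: how often chance alone matches the observed difference.
    # A large p means we "failed to disprove the null hypothesis."
    return extreme / n_iter

# Hypothetical daily conversion rates (%) for two campaigns
campaign_a = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.3]
campaign_b = [2.2, 2.5, 2.1, 2.4, 2.6, 2.3, 2.0]
p = permutation_test(campaign_a, campaign_b)
print(f"p-value: {p:.3f}")
```

With numbers this close, the p-value comes out large, which is exactly the 27-syllable answer the marketer hates: we failed to disprove that the two campaigns perform the same.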

17:56 MK: Yeah. Yeah, well I mean ultimately, it’s that old adage of like, you can have a thousand pieces of evidence to support something, but if you have one piece of evidence that disproves it, then you can get rid of that hypothesis, because it can’t possibly be true, right? So, and it’s funny we were talking…

18:13 TW: Wait, so you’re recommending that every analyst does one analysis in their career, because they have to do 1,001 different tests? No. [chuckle]

18:23 MK: I was about to go into the fact that there needs to be some level of moderation as well; you can’t look at everything under the sun to disprove something, and it depends, obviously, on the size of the decision that you’re making based on this piece of work. But ultimately, when we sit down and we go, “I wanna prove this hypothesis, and here are the five data points that will support me,” yeah, you are doing a bit of confirmation bias, right? ‘Cause you’re setting out to find those five data points that will prove what you already think. Whereas if you sit down and go, “Okay, if these five data points show this, then I know that my hypothesis can’t be true,” then, like, I personally would feel more comfortable with those results. But there is a level of, you can’t take it to a crazy “I’m gonna keep searching for everything.” It’s about tentatively accepting what might be true once you get to that stage of “I can’t find something to disprove my hypothesis.”

19:24 MH: Well, and this is sort of where that dividing line between academia and business starts to feel real to me, because we don’t have lab-like environments to do analysis in. If you’re not working in the rigors of academia, you’re not actually doing research for research’s sake, and adding to the overall body of knowledge, as opposed to trying to figure out what the right next thing to do with this marketing campaign is, or whatever. And I feel like that’s where you kind of have to put your hard hat on and recognize, “Hey, yeah, I feel like there’s some error, or bias, in how we’re driving this, but how do we become more aware, or trim those off as we go?” But I also think… What do you guys think about the fact that inherently there’s bias in all of this, because the mindset of how we even operate businesses in the digital realm is filled to the brim with pre-conceived notions, and what worked over here will work over there, and all these kinds of things?

20:36 TW: Yes, I think… A little… Another thought I was having as you were… A way to combat, and I think it applies both in combating confirmation bias as well as the, “Hey, we’re just rife with biases in how we think about digital.” Is that piece of… I always think about, when I’m doing an analysis, I never wanna find one… I don’t wanna say, “If X, then Y.” I generally wanna say, “Well, if X, then Y. And if A, then Y, and if B, then Y.” Like a triangulating… That’s a poor way of putting it. If somebody says, “Which channel’s working best for us?” I don’t just say, “Well, I’m gonna pick one metric and look at one time period and then make that case.” I’m gonna try to step back and say, “If you’re asking which channel’s working best for us, well, you know what? I’m gonna look at ROAS, and I’m gonna look at total traffic, and maybe I’m gonna look at cost per acquisition. And I’m gonna look at a couple of different time periods to see how it looks in different seasons.” And I’m gonna try to come up with… This is the set of things that, chances are, all of them are not gonna point to the same channel, and the same answer. But because we’re in that messy world, I say, “I’m gonna give a qualified answer.” But what I’m gonna try to do is never say, “I think paid search is my best channel, let me go find the first data point. I’m gonna look at one data point and see if it confirms that, and then be done.”
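
Tim’s triangulation approach, several metrics each allowed to “vote” for a winner rather than one metric making the case, can be sketched like this. The channel names and numbers are invented for illustration; nothing here comes from the episode.

```python
# Hypothetical channel metrics; names and values are made up for illustration.
channels = {
    "paid_search": {"roas": 4.2, "sessions": 120_000, "cpa": 18.0},
    "email":       {"roas": 6.1, "sessions": 45_000,  "cpa": 9.5},
    "display":     {"roas": 1.8, "sessions": 200_000, "cpa": 31.0},
}

def best_by(metric, higher_is_better=True):
    """Pick the winning channel on a single metric."""
    pick = max if higher_is_better else min
    return pick(channels, key=lambda name: channels[name][metric])

# Triangulate: each metric casts its own vote for "best channel."
votes = {
    "roas": best_by("roas"),
    "sessions": best_by("sessions"),
    "cpa": best_by("cpa", higher_is_better=False),  # lower cost per acquisition wins
}
print(votes)
```

When the votes disagree, as they do here, that disagreement is the point: it’s the cue to give a qualified answer instead of crowning a single “best channel” off one metric and one time period.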

22:12 TW: I’m actually thinking back to somebody who had done that. So a client, one of the business stakeholders… I was getting kind of the cryptic questions, where she wasn’t my main contact, she was copying a couple of other people, and she was like, “Can we get this piece of information out of Adobe?” And I’m like, “Yeah, sure.” It was like new versus returning visitors, or something. And then it was like, “Okay, can you look at this spreadsheet I made and tell me that the conclusion I’m drawing is correct?” And what I wound up having to do was say, “Yes, the data you were looking at is accurate; however, these are two or three other things that you would wanna look at, and when I look at those, it’s a little bit murkier. You want a black and white answer.” And it’s the challenge of the analyst: every time I ask the analyst for… Or, every time I ask the statistician for the truth, they give me a confidence interval.

[chuckle]

23:06 MH: Which actually goes to… We’ll put in our first Matt Gershoff reference, where that’s where he will kind of rail about a flaw in the way the digital analysts want to behave, they want to answer the questions like marketers ask them, which is yes or no. And instead, you’re actually, you’re living in the realm of uncertainty, and there’s a cost to reducing uncertainty, you’re never gonna eliminate uncertainty. And when can you go forward? And bias kind of plays into that. Where do I say, “I’ve done enough looking at the data that I have offset my biases, and have enough confidence.” There’re still risks that I’m wrong, but there’s really no absolute right and wrong.
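
The “ask for the truth, get a confidence interval” quip can be made concrete with a minimal sketch. The numbers are invented, and this uses the textbook normal approximation to the binomial, not anything specific from the episode: instead of a yes-or-no answer, the analyst hands back a range.

```python
import math

def conversion_ci(conversions, visitors, z=1.96):
    """95% confidence interval for a conversion rate, using the normal
    approximation to the binomial (reasonable for large samples)."""
    p = conversions / visitors
    se = math.sqrt(p * (1 - p) / visitors)  # standard error of the proportion
    return p - z * se, p + z * se

# Hypothetical campaign: 300 conversions from 10,000 visitors
low, high = conversion_ci(300, 10_000)
print(f"conversion rate: 3.00%, 95% CI: {low:.2%} to {high:.2%}")
```

The marketer asked, “Is it 3%?” The honest answer is “somewhere around 2.7% to 3.3%, with 95% confidence,” which is exactly the kind of answer Matt Gershoff’s point about living with uncertainty implies.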

23:52 MK: This is…

23:52 TW: And I will step down from my soap box, and I’ll depart. I’m done, drop the mic.

23:58 MK: That was a really interesting conversation.

[laughter]

24:05 TW: Conversation implies two people talking, that was just me. Just like a rambling monologue.

24:11 MH: I think Moe recognized you were arguing with yourself.

[laughter]

24:17 TW: Does that count as conversation? [chuckle] “That was a really passionate argument you had with yourself.”

24:21 MH: I believe that’s called cognitive dissonance, right? Which is when things go against what our bias would seem to indicate.

24:32 MK: Oh God, we’re full of all the bias today. I was just gonna say, the way that you describe those two approaches is very different, and I think that’s really important when you start talking about bias. There is a big difference between starting out to be like, “I believe that paid search is performing the best,” and then going about to prove that, versus, “I wanna understand which channel is performing the best.” So even just that starting point of the question, and how you frame the problem, will help determine how much bias is involved in your analysis. So I just wanted to draw attention to those two really big differences in just starting points.

25:19 TW: Yeah. Yeah, the which channel, that is the million dollar question, “What’s my good channel? Where should I be shifting money?” And like, “It’s easy, shift money into direct.” And they’re like, “Oh. Wait, I can’t.” Like, “Well, yeah. Yeah.”

25:34 MH: But that’s what I was referring to before, which is that the bias is inherent in the system in a certain sense: this idea of there being a best channel, or even a best group of channels, or a best combination or sequence of channels being the right answer for all marketing dollars. Where’d that idea even come from, right? And what do you do to get around that? Anyways, well, that’s a whole other episode or two about attribution that we probably did with Kim Novo, so…

26:06 TW: Well, but doesn’t that get you to… This whole discussion is that there’s two hurdles, if the analyst becomes better at recognizing biases, and therefore, introduces some more murkiness and difficulty into the process of analysis, the fact is they still have a stakeholder, a Project Manager or a Marketer who isn’t necessarily saying, “Yes, let’s talk about our biases. Let me engage in that.” So it does make the analyst job harder at two levels, one, we need to recognize it, two, we need to figure out how to continue to have a positive relationship with our stakeholders. And they don’t necessarily wanna hear that they’ve got the confirmation bias problem they’re running into.

26:57 MK: Yeah. I was actually talking about this with the Head of the Finance Team at The Iconic recently, and it was a really interesting thing, because most listeners know by now, I’m much newer to the industry than my two co-hosts, and so sometimes there are problems that I haven’t encountered yet. And this is one that I was kind of… It was a Friday night, and as per usual, all the analysts from The Iconic are sitting in a corner, talking about nerdy stuff. And I couldn’t get around the fact that I was like, “Other analysts seem to just say, ‘This is right, or this isn’t right.’ And for me, I’m always like, there’s always a level of uncertainty, there are all these things in play. When do you call it and be like, ‘I’m confident of my assessment’? Or where do you put caveats on what you’re sharing?” And the way that he explained it to me, that kind of summarized it and really just clicked, was like, “Moe, you’re an analyst. You’re never gonna be 100% sure, you’re always gonna have biases, you’re always gonna have incomplete data, or there are gonna be things that are outside of your control. But do you feel comfortable making decisions off this? If you do, share that data point and then just put some footnotes.” Because yes, Tim, we still use memos to share our results.

[laughter]

28:16 MK: But basically he’s just like, “Caveat it, and say, ‘These are the concerns I have.’ But the truth is, most people are never gonna go down to read that footnote about, like, ‘Oh, there might be evidence of confirmation bias,’ or, ‘this bit of data is incomplete.’ But it gets you to a point where you can share your data comfortably and be like, ‘Yep, I stand behind this, 80% certainty.’ And then, if you don’t feel that comfort level, then you don’t share it; you say, ‘I’m not comfortable with the numbers.’” And that kind of, “Take this approach or that approach”… I don’t know, it just clicked for me that that was a way to… I don’t know, it was a new lesson for me that I thought I would share.

28:53 MH: This is sort of the way I’ve thought about it. I’ve observed it more in the world of science: I’ve seen people with a lot of certainty about how they see how this aspect of the world works. But what I’ve also observed is the most notable scientists, and the biggest brains, and the people who’ve accomplished the most in the world have this amazing sense of awe and curiosity that allows them to be open to the idea that this could all be different. And I often describe it as humility, but it’s actually like… I don’t wanna say it this way because it sounds unsophisticated, but almost a childlikeness when they get to talking about the things that they’ve discovered, and the world that they have understood. And it’s like, nobody knows more than they do about this, and yet they’re just blown away by the potential of what else there is to learn.

29:49 MH: And I think that’s been sort of my guiding light around bias, which is: how do we adopt that kind of a mentality? So no matter what I know about analytics or analysts, or what should be true about how a website works, or how to do analysis, how do I leave myself open to the idea that maybe I haven’t even explored far enough to see that next big horizon? And so that’s kinda been my way to try to look for a chance to sort of step back and be like, “Hey, I don’t need to approach this from a dogmatic standpoint of, ‘Yeah, of course, I know X, Y, and Z is always true, so all my analysis drives straight down the middle of that.’” And instead step back and be like, “What if I’m totally wrong about X, Y, and Z? What does it mean now?” And that being kind of a principle that allows me to look with fresh eyes at stuff. And I don’t know, that’s sort of just been my way of trying to address that. And also because I like science.

[chuckle]

30:47 TW: Is it fair to… And I don’t know if this is another side of the same coin, but it’s just recognizing that there is no truth. So this is good enough for me to make a decision and then make a recommendation, however, I would love to make this decision, but also test it. Or make this decision, but also pursue this other thing. Or I’d recommend this decision, unless anybody in the room has a counter explanation that means I should go do some more analysis. I mean there’s always the risk of analysis paralysis, right? I mean it’s cliche, but I almost never… Actually, I probably would say I have never drawn a conclusion without… If I’m ever presenting to a group when I have done the analysis, I don’t think I ever say, “And this is the way it is.” I’ll be like, “Hey, this is really surprising. I came at it four or five different ways, and I have yet to be able to find an explanation. Maybe we don’t need it because I’ve just found something amazing and this is just rocking our world. Or more likely, I’m still missing something. Any of you guys have any ideas?” And eight times out of 10 somebody says, “Well, what about this or that?”

32:04 TW: And there are times where they have a really good… It sounds like a really good story, but you’re like, “That’s total bullshit.” Like, “There’s no way that’s it, but let me go look at it.” I mean we had a case where basically a paid channel had been split into two channels, but looking at year over year growth, the one channel was doing terribly and they’re like, “Oh, yeah, but if you combine the two channels, we don’t have year over year comparisons.” And I was like, “Well, back of the napkin, let me just go do the math.” Like, “Nope, if you combine them back together, it’s still not doing what you expected it was doing.” But it was a totally valid theory. And I wonder how much that is… Different people have different biases, they have different shortcuts. Definitely on the confirmation bias you’ve got competing… Not agendas, it’s a little too harsh of a word, but competing shortcuts. So, it’s great, solicit from them, “Well, what do you think might be going on here?” And kind of slowly circle the drain until you say, “We’ve kind of come out at it four or five different ways.”

33:04 MH: So, do the analysis five times, and use memos.

33:09 TW: Oh, I thought you said the other. That is like the…

33:12 MH: I’m writing down the tips.

33:13 TW: The debate between whether to put footnotes on slides or not. And I actually fall in the camp of “Yes”: put the data source in a footnote or in the footer. Like, data source, where did it come from? What was the date range? And also, if there is a caveat, put it there, ’cause you don’t know who’s gonna be looking at it. And nobody’s gonna read it, but it’s gonna be a reminder for you when you’re looking back at it.

33:42 MH: Right.

33:43 TW: And somebody who really thinks, “Wait, what’s really going on here?” They’re gonna look at the footnotes as well. So, maybe not in these memo things you talk of. Is a memo like… Is that like a meme, but from the 1980s?

34:00 MK: Oh, geez. Oh, geez.

34:01 MH: An animated memo.

34:02 MK: Yeah, I still have mixed feelings on the old memo. It has a time and a place. But yeah, Tim, I agree. If you have, I guess, caveats or things that you need to clarify about the data, I would 100%, and even before I joined the memo squad, I would put them at the bottom of the slide, so…

34:22 MH: Yeah, I mean I’ll just jump out there and say I’m not convinced that PowerPoint’s the best way to share recommendations or insights from analysis anyway.

34:30 MK: Ooh.

34:31 TW: As a… Really?

34:33 MH: Yeah. I think it’s a standard packaging because we… I mean, Tim, you and I are consultants, so we just sort of default…

34:41 TW: I’ve actually been using memes for the last two and a half years, so that’s…

[laughter]

34:45 MH: Those are actually good, too.

34:47 TW: I find myself…

34:49 MH: Hey, you’re in great company, ’cause I mean Christopher Berry, I think, has introduced many in our industry to memes, even if they didn’t realize it.

35:01 MK: Oh, gosh.

[chuckle]

35:03 MH: Well, so what do you think is a… Is there a best means of delivery? Or is it just totally situational?

35:11 TW: Hey, I’m very biased, so…

35:14 MK: I think some people just don’t do it well, and so it gives presentations a bad name.

35:20 TW: Yeah.

35:21 MH: That’s it, I think; you’ve just said it. Because if you go back… Are you guys familiar with the old Procter & Gamble one-page memo?

35:29 TW: Yeah, yeah.

35:30 MH: So, there’s a file.

[overlapping conversation]

35:33 MH: Yeah, this is before your time.

35:35 TW: I was too young and too American.

35:36 MK: I’ve done lots of memo research, rest assured.

35:39 MH: Okay. Okay, but what that does… And actually, I think Amazon uses something like this, I don’t know if it’s the same thing…

35:46 MK: They 100% do use memos.

35:49 MH: Yeah, I read that somewhere as well. It forces you to organize yourself around your ideas in a better way, I think, to kind of put your best argument forward. And actually, that’s the problem with most PowerPoint delivery: it just lets your story meander everywhere, unless you’re really focused about what you’re trying to say. And maybe that’s my biggest problem with it.

36:12 TW: Yeah, there are people who are like, “I just don’t do presentations with PowerPoint.” And there was a guy like that, and I was like, “Well, you just threw out the baby with the bathwater on that one.” But…

36:23 MH: Well, and to be fair, Tim, I’ve never seen you present an analysis as PowerPoint. So I guess I should open my mind to the possibility that there’s a whole other horizon out there, it sounds like something I…

[laughter]

36:36 TW: Actually, I would say when I worked at Clearhead, that was one of the things that from day one… And the founder was very, very… And we’re diverging way off the path. It was almost a memo-like structure to “This is how we’re going to present test results.” And it was very concise, and it was very structured, and it could be hard at times, but he said, “In the abstract, this is the right way to do it. So we are going to apply that framework rigorously with how we… ” And it works to this day. So that was in Keynote, I guess, technically. So it wasn’t PowerPoint.

37:13 MH: Yeah, I like to use Prezi, where you flip it around a bunch of times.

37:16 TW: Oh, God, no. I’m so fucking done with Prezi.

37:20 MH: You used to like Prezi!

37:22 TW: I did. I did Prezi at The Iconic, I did it well. There were many pictures of Moe that were…

37:28 MK: Was that Prezi? I thought that was Keynote or something.

37:33 TW: No, that was totally Prezi. And you know what? I’ll put the link in the show notes, if you’d like to see pictures of Moe from her childhood that her sister gave me. I’m pretty sure that was in Prezi.

37:45 MK: Oh, dear.

37:46 TW: Yeah, so…

37:48 MK: What I would like to talk about, though, is I feel like I sold myself short, because I started with my favorite type of bias, which was anchoring. But the truth is, the one that I actually love is a heuristic called escalation of commitment. Economists basically refer to it as the sunk cost fallacy, which is where you continue to justify spending money, time, and effort based on your cumulative prior investment, when the truth is, if you started from scratch again, it would show that you shouldn’t continue down that path. And the reason that I think this one is worth us having a chat about is because it’s really relevant for product, and obviously I’m a product analyst. But even when you’re part way through a marketing campaign, or a strategy, and you see that something’s not working, you’re not hitting your targets, you kind of just keep going down that road.
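Moe’s description of the sunk cost fallacy can be made concrete with a toy calculation: the rational go/no-go decision depends only on forward-looking costs and benefits, and the prior spend drops out entirely. This is a hypothetical sketch (the function and numbers are not from the episode):

```python
def should_continue(expected_future_benefit, remaining_cost, sunk_cost=0):
    """Decide whether to keep funding a project.

    A rational decision compares only what is still to be gained
    against what is still to be spent; `sunk_cost` is accepted as an
    argument purely to show that it never enters the comparison.
    """
    return expected_future_benefit > remaining_cost

# Two analysts look at the same struggling campaign:
# $1M already spent, $200K left to spend, $150K of benefit expected.
print(should_continue(150_000, 200_000, sunk_cost=1_000_000))  # False: stop
# The answer is identical if nothing had been spent yet.
print(should_continue(150_000, 200_000, sunk_cost=0))          # False: stop
```

The “we’ve already put so much in” feeling is real, but it never changes the forward-looking comparison.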

38:43 MH: Yeah, that fallacy has lost me money playing poker.

[laughter]

38:46 MH: So, I mean that bias. It’s like, “Well, surely I can’t get out of this hand now, because I’ve already put so much money into it.”

38:58 TW: Well, yeah, and I think there’s a piece of it that’s admitting it didn’t work. If there has been significant budget invested in it, then the person who owns that budget is having to… In the abstract, and actually in reality, they would say, “We really thought this was gonna work, we had good rationale for it, we had good reason. It’s not working. If there’s something working better, we should shift over to that.” And that is just so not human nature, to say, “This thing that I spent a million dollars on is an absolute dog.” And then you start using anchoring and saying, “Oh, but if we can make it two basis points better, it’s improving.” So it started as shit, you tried to ignore the initial target, reset a new anchor of what it’s done for the last six months, and then just continue. It drives me berserk.

39:57 MH: Yeah.

39:58 MK: I think with product strategy, that happens a lot, right? Because you’re building a new feature, and you think this new feature is gonna be amazing. And I think developer resources are so in demand, so it’s not even the money, it’s like, “We’ve already spent a hundred hours of developer time building this, so we don’t wanna go back.”

40:20 TW: Yeah, and I don’t know if you’d really count that, but that hundred hours does equate to money. I mean I know you’re newer…

[laughter]

40:29 MH: Oh, boy.

40:29 MK: Oh, shit.

40:31 TW: No, no, that was totally a fair point. If your rockstar developer just spent a hundred hours on something, yeah. Carry on.

40:40 MK: Yeah, yeah, yeah. And I mean we do look at the economic cost of their hourly rate, and blah, blah, blah. But the point is you get attached to it, and I think one of the ways to really overcome this one, because it happens all the time, is… And I hate to give Tim some accolades here, but if you come up with what action you’re gonna take…

41:03 MH: I understand that feeling.

41:04 MK: Before you start, like, “If we are successful, we’re gonna do this. If we fail, we’re gonna do this.” And you have to stick to it, because as the data starts coming in, everyone wants the feature, or the campaign, or whatever it is, to do well, and you start being like, “Oh well, we said 5%, but 4% is almost there. So, we’ll just run it for another month and see.” And people want it to succeed, which I think is why it’s so important to decide before you start off, like, “What are we gonna do if we don’t hit 5%? What are we gonna do if we do hit 5%? What action are we gonna take?”
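The pre-commitment Moe describes, writing the decision rule down before the data comes in, can be sketched as a trivial lookup, so that hitting or missing the target maps to an action that was fixed up front. The plan names and the 5% threshold below are hypothetical, chosen only to mirror her example:

```python
# Decision rules agreed on BEFORE the feature launches, so the team
# can't quietly move the goalposts ("4% is almost 5%") once results arrive.
PREREGISTERED_PLAN = {
    "hit_target": "roll out to all users",
    "missed_target": "roll back and reallocate developer time",
}

TARGET_UPLIFT = 0.05  # the 5% agreed up front

def decide(observed_uplift: float) -> str:
    """Return the action fixed in the pre-registered plan."""
    outcome = "hit_target" if observed_uplift >= TARGET_UPLIFT else "missed_target"
    return PREREGISTERED_PLAN[outcome]

print(decide(0.04))  # roll back and reallocate developer time
print(decide(0.06))  # roll out to all users
```

The point is not the code but the sequencing: the mapping from outcome to action exists before anyone is emotionally invested in the result.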

41:39 TW: Yeah, and there’s a lot of optimism when you’re starting off, right? We think everything’s gonna work out perfectly, we think we perfectly understand the consumer, and that if we add these features, this is exactly what will happen. And we don’t wanna talk about, “Well, what if we’re wrong?” So it’s hard to have that discussion.

41:57 MH: Well, and I hate to say this now, because we’re about to have to wrap up, but there’s actually a whole aspect of bias that we didn’t even talk about, which is how analysts break through the bias of the organization that they’re in. So, status quo bias, “Well, any new ideas you bring are gonna be worse than what we’re already doing, so we’re just gonna reject them out of hand.” Or those kinds of things. So, next time on the Power Hour…

[laughter]

42:23 MH: No. [laughter]

42:25 TW: This is gonna be the only theme, this is just the next topic for the next 12 months. We haven’t hit Simpson’s Paradox, we haven’t talked about confounding variables.

42:35 MH: Well, it’s a shift, but actually probably a pretty important one, because I think we all face bias, too. So we have it in ourselves, and we also face it in the organizations we’re trying to impact and change. So it’s just occurring to me now that we didn’t really talk about that aspect of it.

[laughter]

42:55 MH: But, maybe in the future episode?

42:57 MK: So one thing I’d like to share with you, though: this guy Didier, I heard him speak recently. He’s the CEO of a company called Culture Amp. And he was doing this presentation, and he said… So he has four co-founders for his start-up, and basically they sat down one day and they were, I don’t know, drinking red wine, some typical middle-class white male thing, chatting about how they were going as founders. And at some point, one of them was like, “Oh my God, we are all white middle-class guys, and we’ve all been to really good universities. We are biased in our thinking about how we approach stuff, right?”

43:35 MK: And he talked about this in his presentation, and someone asked him, “Well, what are you gonna do about it?” And I really liked his response, which was, “I don’t know. But the first step to acknowledging your biases is to call them out, and I’m calling out that’s the way that we view the world, because that’s who we are, and that’s okay.” But the first way to overcome it is to be aware of, I guess, whatever the bias is, whether it’s escalation of commitment or anchoring; it’s all about being aware of it.

44:05 TW: That’s the male, pale and stale bias, I believe you were discussing before we started recording.

[laughter]

44:11 MK: That was a direct quote, I didn’t say that myself.

[laughter]

44:18 MH: Yeah, the temperature’s getting really warm right now.

[laughter]

44:23 MH: And let’s not just go calling out biases that we might personally have at this point.

[laughter]

44:31 MH: Actually, it would be great if we could keep talking about this, Moe and Tim, but unfortunately, we’ve gotta start to wrap up.

[chuckle]

44:41 TW: That’s your time keeping bias.

44:44 MH: The time keeping, exactly.

44:46 TW: The name of the podcast has hour in it bias.

44:50 MH: So, [chuckle] anyway. Before we go into last calls, Tim, you had mentioned something; do you wanna talk about eMetrics New York really quick?

45:01 TW: Yeah. So it is mid-August, and coming up at the end of October is eMetrics New York. And a couple of things… Well, Michael and I are both speaking there, and not together. Our counselor said that, based on restraining orders and other things, we’re not allowed to actually be within 50 feet of each other, and they couldn’t make the AV work. But despite that, we did a little exercise before we recorded to see, of the speakers at eMetrics New York this year, how many of them are past guests on the Digital Analytics Power Hour. And Jim Sterne, of course. John Lovett was in the early episode about data governance. And David McBride talked about the Internet of Things on the podcast; he is now with IBM doing Watson-type stuff. Brett Hurt from data.world, former founder of Coremetrics and Bazaarvoice, and we had an awesome chat with him about data.world. And then the perennial, already mentioned on this episode once, Mr. Matt Gershoff from Conductrics. So, all of those people, if you’ve listened to any of those episodes and thought, “Man, those are some thought-provoking people,” they will all be at eMetrics New York, as well as Michael and me. And…

46:17 MH: And Moe!

[laughter]

46:18 MH: I’m just kidding.

46:19 TW: Of course there will be Moe as the new… But there will be appearances in the future where we’ll all be there, we’re confident. But on top of that, and this is not a sponsorship, we do actually have a discount for any listener. If you’re going to eMetrics New York City, contact us through any of the channels, listen to the outro, listen to Michael’s wrap-up, reach out to any of us; we’ve got a discount code if you’re interested in going, we’d love to see you guys there, and we’re looking forward to seeing some of our past guests. So, that was a very wordy pre-last call, but I’m looking forward to it; looks like some good sessions.

46:56 MH: Yeah. And so, we’ve officially arrived at the last call. It’s the time of the show we like to go around the horn, talk about something we’ve found that’s interesting. Moe, you’re our newest co-host, so we’ll give you the first last call.

47:10 MK: I get all the firsts tonight.

47:12 MH: What do you got?

47:13 MK: So I wanna do a shout out to Catherine Hackney…

47:16 TW: That was gonna be mine! Oh, wait. I guess I had to wait till you said it… [chuckle]

47:20 MK: Every time, every time! [laughter]

47:23 MH: Every time. It never gets old. It never gets old. Tim has same joke bias.

[laughter]

47:31 MK: Yeah, he does. He does.

[laughter]

47:33 TW: It’s “is this horse dead yet” bias.

[laughter]

47:38 MK: Oh, God.

47:38 TW: Carry on, though.

[overlapping conversation]

47:42 MK: So, Catherine from the DAA has been doing a really amazing job. She’s been supported by lots of other women in the community to set up the Women in Analytics group in the DAA. And they do also have a mentoring program, but unfortunately the pilot’s closed. If you’re interested in participating in the future, definitely reach out to her. And on a personal note, I think this movement that’s going on is absolutely incredible, and I’m really proud of it. And big thanks to Catherine, the DAA, and all the other women that have been involved so far.

48:13 TW: Absolutely.

48:14 MH: Awesome.

48:14 TW: Hear, hear. She’s awesome.

48:16 MH: All right, well I’ll go, Tim, so you can have the last word bias.

[laughter]

48:23 MH: No. [chuckle] Okay, so kind of a twofer, but really quick, because it’s on the topic of bias. I was listening to the Hidden Brain podcast recently, it was back an episode or two ago, and they were talking about robots. And they had this…

[overlapping conversation]

48:40 MH: Amazing kind of observation, which is that all of the genius robots are voiced by guys, like Watson and HAL, and all of the helper ones are voiced by women’s voices, like Alexa and Siri. And it was just sort of like, “Whoa! What?” And you’re kind of like, “That’s pretty crazy, but it kind of shows that there’s maybe something there.” Anyway, a really great episode; there’s much more to it in the Hidden Brain podcast.

49:12 TW: That’s the one where it was, even a robot, you won’t smash it with a hammer if you give it a name?

49:17 MH: Yes, yes. The guest was a woman from the MIT Labs, or whatever.

49:23 TW: The Media Lab, MIT Media Lab?

49:25 MH: Yeah. And just fascinating, really great. A really great podcast episode. Okay, but my actual one is I recently came across a Coursera specialization that I think a lot of our listeners would be interested in. As you’re aware, Google has their cloud platform, which covers a lot of data manipulation and data engineering tools, and they have launched through their team a Google Cloud Platform specialization on Coursera. So if you’re working with a lot of data, there are a lot of really cool tools on Google Cloud Platform. I know my team has been digging into it and really enjoying working with it. So check that out as well. I’m not being paid by Google to talk about it, and will certainly give a shoutout to Adobe when they launch their cloud platform specialization on Coursera.

[laughter]

50:17 MH: That cover all the bases there?

50:18 TW: Nicely done.

50:18 MH: Okay. Over to you, Tim.

50:21 TW: Oh, crap. So I’m gonna do a twofer. So my original one that I was going to do was a podcast I heard several months ago, and it’s not one that I listen to regularly. And I feel like I do need to call out that I do not listen to podcasts while I’m working. There are people like, “I just can’t listen while I’m working.” I’m like, “Well, no I can’t either.” But apparently I do that in lieu of actually interacting with my wife or children.

[laughter]

50:42 TW: But this is one of those that just popped up. So the Tech Tonic podcast, which is from the Financial Times, had an episode… I wanna say it was back in June, called “The Graphics Chip Powering AI Technology,” with Jensen Huang, the CEO of NVIDIA. And what was interesting was he was basically talking about GPUs. So if you are looking at the machine learning world and deep learning and AI, it’s all, “It’s not CPUs, it’s GPUs.” And it had struck me as weird: I get that graphics are complicated, but why is graphics processing what’s being applied, what’s needed? It’s not that long, it’s a 23-minute episode, but he does a really good job of talking through why GPUs, and that technology, are what’s really powering all of the AI stuff. So that was my original last call.

51:45 TW: I feel like I have to throw in now, because we’ve talked science and studies, a newer podcast, which is Live from the Poundstone Institute. Have you heard this one? If you are a listener to Wait Wait… Don’t Tell Me!, Paula Poundstone is the clear rockstar when she is a guest on that, because she’s always like, “Where did that study come from?” And somebody at NPR was like, “We’re gonna take… ” I think it’s Adam Felber, with Paula Poundstone, and they are just going to do basically a live-audience show, Live from the Poundstone Institute. And it is genius, because it’s Paula Poundstone. So I’m sneaking in a twofer there as well. That’s all I’ve got.

52:30 MH: Yeah, these are all big names in Australia too, right, Moe?

52:32 MK: Oh, completely.

52:33 TW: Paula Poundstone? Is she not big in Australia?

52:36 MH: No, I vaguely know that name. I just think it’s funny because it’s like, these NPR podcasts, do they have an international audience?

52:43 MK: Yes, yes. I do listen to NPR.

52:44 MH: Oh, good.

52:46 MK: Yeah.

[laughter]

52:50 MH: Anyway, guess who listens to this show? All of you. And you’ve been listening, and you probably have a thing or two to say to us about bias, because we probably didn’t do it the justice it deserves, even though we try. But that’s just our bias. We’d love to hear from you. Reach out to us through Measure Slack, through our Facebook page, Twitter, all those channels. We’d love to get into a conversation about it as you go through your analyst lives. Moe, welcome to the show, it’s great to have you as our co-host, and I’m looking forward to all the cool stuff we will do together on the show going forward. And, Tim, as always, it’s a pleasure. For my two co-hosts, I am Michael Helbling saying keep analyzing.

[music]

53:46 Announcer: Thanks for listening, and don’t forget to join the conversation on Facebook, Twitter or Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, facebook.com/analyticshour, or @AnalyticsHour on Twitter.

[music]

54:05 Charles Barkley: So smart guys want to fit in, so they’ve made up a term called analytics. Analytics don’t work.

[music]

54:14 MH: Oh, Christ.

[laughter]

54:17 MK: Oh, shit.

[laughter]

54:23 TW: I don’t care, I don’t care if we quadrupled our revenue, it sucked.

54:28 MK: Aww, someone said this to me tonight, “Male, pale and stale.”

54:34 TW: K Willy?

54:35 MH: Yeah, K Willy.

54:37 MK: That’s pretty funny.

54:39 TW: I’m like, “Ooh! Ooh! Ooh!”

54:43 MK: I just spoke to a few people that were just starting out in analytics, and they’re learning R, they’re learning Python, and they’re like, “Oh, should I pick up SQL as well?” And I’m like, “Maybe focus on one.”

54:54 MH: I just… My team does random shit, and I’m like, “Yay, keep going. Great job.”

[laughter]

55:00 TW: We can edit that out, if need be.

55:01 MH: Okay.

[laughter]

55:07 TW: Rock, flag and biases.

[music]
