#076: Insights, Please. Actionable Ones! With Rod Jacka.

When you find a true insight, it can make your head spin. But will your head spin in a different direction if the insight is found in Australia than if it is found in the United States? On this episode, Rod Jacka from Panalysis joins the crew for a balanced discussion (northern AND southern hemispheres) about how the phrase "actionable insights" should turn the stomach of any right-thinking analyst. More importantly, the gang discusses the need for clarity around insights, both definitionally and expectations-wise, and shares their favorite techniques for getting that clarity.

Show Links

Episode Transcript

[music]

00:00 Michael Helbling: Welcome to the Digital Analytics Power Hour. This is Episode 76. A lot of us think that the first thing that happens on a movie set is the director yelling, “Action.” But actually, there are a plethora of activities that have to happen before anyone can yell that word. In our industry, there’s a similar misconception, where all we need to do is get the right tools and amazing insights just start flowing out of them. Well, cameras up, roll sound, Power Hour, scene one, take one, set Actionable Insights. We are ready for another close-up on what makes the analytics world go around, insights that make businesses decide and/or do things to improve. It’s the Holy Grail and yet so few of us drink from its life-giving waters. How’s that for some mixing of metaphors? As always, I’m joined on my quest by my co-hosts, Tim Wilson and Moe Kiss. How you going, you two?

01:05 Moe Kiss: Great.

01:06 Tim Wilson: Where’s up?

[laughter]

01:10 MK: Making stuff up now.

01:11 MH: Yeah. Wait, that’s not an official… That’s not an…

01:14 TW: That’s not Australian? Am I looking at the wrong website?

01:16 MH: I felt like that was pretty close. [laughter] Well, I’m Michael Helbling, just trying to get my Aussie greetings in order and apparently failing. But we’ve thought about who could make this conversation more scintillating and scrumptious. Well, our guest is none other than Rod Jacka. Rod is the Managing Director of Panalysis. In his spare time, he’s an active member of the analytics community in Sydney. He’s also the longest running sponsor of their Web Analytics Wednesday. He’s someone who’s been doing this for a very long time and we are excited to welcome you to the show, Rod.

01:53 Rod Jacka: Thank you very much for having me, Michael.

01:55 MK: At least now we've got an even playing field, country to country. I've got some moral support today.

02:02 MH: Yeah, 2V2. Whether we’re playing… Are we playing Aussie Rules? Or are we playing American football?

[laughter]

02:08 RJ: Let’s mix it up and see where we go to.

02:11 MH: Yeah, that’s right. This little scrum that we call a podcast. Alright, let’s get things kicked off and maybe just start out with, what are we even talking about when we’re talking about an insight or an actionable insight? And who wants to take that and get us going?

02:30 RJ: Well, I can possibly jump in.

02:33 MH: Well, I should hope so.

[laughter]

02:38 TW: That was a test.

02:38 MH: That’s right, he gets it. He gets why he’s here. Good. Yeah, Rod. [laughter] Get it, go for it, sure.

02:45 RJ: It's a very difficult thing to nail down because it has different meanings to different people. From what I've seen, in what we say as analysts it can mean anything from an interesting story to some collection of facts and figures. But from my perspective, I quite like the definition that Gary Klein put forward in his book, Seeing What Others Don't: The Remarkable Ways We Gain Insights, which is that an insight is an unexpected shift in the way that we understand things. And he talks about this as the notion of discontinuous discoveries. A true insight, to me, is something that opens up a different world. It changes our understanding of something in a very deep way. And that's a difficult thing to do, because it requires a number of moving parts, a number of thought processes. I don't think, as analysts, we do enough of it, because A, it's hard and B, it's very time-consuming. But I'd be interested in your thoughts.

03:44 TW: I like that definition because it defines an insight but doesn't necessarily say that you're gonna take action from it. So it actually isn't one where "actionable insight" would be redundant, although I do tend to have a gag reflex when I hear "actionable insight," because I feel like it gets tossed around so casually. But I wonder, when you have that fundamental shift, does that mean that a typical analyst at a typical company doing a typical job should expect to have one to three true insights a year? When it's defined that fundamentally, it's literally, "It shifted my understanding of how something works," as opposed to confirming how I suspected it worked and giving me more confidence. It seems like trying to cast back and ask, "How many insights have I found that really clear that bar?" I think it's a great definition. It just means we need to stop using that word a lot. We need to put a dollar in the swear jar for saying "insight." But it also means marketers can't be asking for their weekly insights either.

05:03 RJ: Yeah.

05:04 MK: I was about to say, when it comes to weekly or monthly reporting and it’s like, “What are your monthly insights?” And you’re like, “Well, I can tell you about our monthly performance.” That doesn’t necessarily mean there’s gonna be anything groundbreaking in there.

05:17 MH: I do consider it one of my greatest achievements in my agency career to remove the word “insights” from the dashboards we delivered and say, “Here’s the dashboard. There are no insights in this dashboard. However, our analysis may result in insights that will come in a different document.” Because anything you’re putting in the dashboard is just gonna be observations on performance and questions perhaps, or ideas, but not necessarily full-blown insights.

05:47 TW: I like to say you really can't schedule insights. You can go looking for them, but there's no guarantee that you're gonna find them on any given day. I feel like this is violent agreement that's gonna be really boring.

06:01 RJ: We could make an analogy: if you're in some form of enforcement career, say police or parking inspectors, perhaps you have a quota you have to meet on the fines you need to issue. Well, if you have a quota like that, you then have to find the necessary things to meet your quota. That means you may take a very harsh line on what might be minor indiscretions. If we take the notion of having a KPI around insights, that we need to generate them on demand every month, then that very well may lead to fairly poor thinking. The structures we put in place can cause people to produce things that would be called insights but are really nothing more than a couple of observations.

06:48 MK: Do you think it’s just a fundamental… I don’t know, people misuse the term. Is that what it boils down to? ‘Cause particularly, and I agree with Tim on this. Yes, we actually agree. But actionable insight just makes me go a little bit nuts and I cringe every time I hear it. But, I mean, is it that we fundamentally misuse the word? I’ve had people say to me before, like, “Oh, we don’t want you to give us recommendations. We just want you to give us the insights.” And I’m like, “Huh? How do you have an insight without some kind of recommendation?” And it’s like, “No, that’s actually the job for the people that own this space. You just need to give them those nuggets of gold.” And I’m like, “Okay, I’m gonna go to lunch now.”

07:36 MH: Oh, nuggets.

07:37 TW: Well, I think what's happened is that it's a little bit self-fulfilling. You hear about some grand insight that was uncovered, then it hits a blog or a trade publication or something. And the insight came from the data, and that leads the author to talk about how everybody has this data, and this needs to be what we're looking for. All of which is totally true. But that then shifts to, "Yes, we'd like some of that too, please," from the marketer, and sometimes the analysts as well. Insight driven by data; I have data; therefore I want insights. It's not necessarily a two-way thing. If data leads to insight, that doesn't mean all data, every analysis, leads to an insight.

08:29 TW: I mean, I think we can trace the path of how it happened, and I think Rod's point is basically that enough people were given, whether a hard quota or a soft quota, a mandate to deliver insights. Well, you either are trying to do the impossible, or you gradually shift the definition and lower the bar way down and then say, "Yay, I've hit it." And in that way, you've got the label on the report, and you've got stuff under it that is information the recipient didn't have, so nobody says the emperor has no clothes, and we go on our merry way, and we're not really helping things. This is depressing. I need a drink.

[chuckle]

09:11 RJ: I really don't like the association of the words "actionable" and "insight." I think the insight is the precursor to action; it's not the action itself. You put out ideas, you put out hypotheses, you put out plausible explanations, and then somebody may say, "That is something I can take and do something with." Look at, going right back to Einstein, his thought experiment about traveling on a light beam. That's what you would call a pretty profound insight in terms of how do I explain the theory of relativity, but in terms of actionable, it took a long time before we could turn that into something that was actually useful.

09:52 RJ: I think that's a parable for how we generate insights: we've often gotta allow the mental freedom to experiment with ideas that don't necessarily make sense at the time, but that, in discussion with others, lead into ideas we can progress into an experiment, with the results of that experiment taken away as learnings that we can apply to a business context.

10:15 MH: Right.

10:16 MK: I actually was thinking about that the other day, because we had an experiment that was basically completely inconclusive. It was, "Meh." And then everyone's like, "Oh, but there was an A/B test. What are the results? What should we do? What are we going to implement?" And it's like, actually, the learning is we're not gonna do anything. They performed exactly the same. There's nothing to do here. And it was this really hard thing to communicate, because people are like, "Oh, but we've done something and there should be something." And to me, I go, "Okay, well, there's a learning here; it's just that our action is to do nothing." But I don't know, it just seemed this thing that people couldn't get their heads around.

10:55 RJ: Yeah, I think that we're novices as a profession when it comes to experimental design, and I think that when you look at tools, I won't name names because I think that's not appropriate, but a lot of the tools out there have very low bars for what they consider to be a statistically significant result. And even that phrase is very, very misunderstood in our community. Effectively, what we are saying is that we are reducing the probability of making a wrong decision. I love the XKCD jelly bean cartoon; it's quite an illustrative way to show it. Therein lies the problem: it's hard to explain without seeing the visuals, but if you Google "XKCD hypothesis testing", you'll find the cartoon, and perhaps we can put that in the show links?

11:38 TW: Definitely.

11:38 RJ: And really what it's saying is that if you run 20 tests at a 95% confidence level, you are likely to get one positive result from nothing more than the fact that you ran 20 tests. If you set the confidence level to 80%, or lower, you are going to get a positive test every five tests. Now, it doesn't necessarily follow that that's a true behavior. And it's also quite a significant risk when we start to use machine learning models, for instance, on training data sets, because we may very well find patterns that are spurious. It depends on what we do next. We can use these sorts of tools as workflow tools, but how do we take that and turn it into something that is actually meaningful for the business? In that case, we've gotta design experiments that we can then execute, and then observe the results of the experiment to see if it did or did not actually make a difference. Experimental design, I think, is a topic that really needs to be much better explored in our industry.
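To make Rod's numbers concrete, here is a minimal simulation sketch (Python with NumPy and SciPy; this is not from the episode, and the 5% conversion rate, visitor counts, and function name are illustrative assumptions). It runs batches of A/A tests, where there is no true difference between the variants, and counts how often a test still comes back "significant" at 95% versus 80% confidence:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def false_positives_per_20(alpha, n_tests=20_000, n_visitors=2_000, p=0.05):
    """Run many A/A tests (identical 5% conversion rate in both arms) and
    return how many 'significant' results show up per 20 tests at this alpha."""
    positives = 0
    for _ in range(n_tests):
        a = rng.binomial(n_visitors, p)  # conversions in variant A
        b = rng.binomial(n_visitors, p)  # conversions in variant B (same true rate)
        table = [[a, n_visitors - a], [b, n_visitors - b]]
        # Chi-squared test on the 2x2 table; correction=False keeps the
        # observed false-positive rate close to the nominal alpha.
        _, p_value, _, _ = stats.chi2_contingency(table, correction=False)
        positives += p_value < alpha
    return 20 * positives / n_tests

for confidence in (0.95, 0.80):
    rate = false_positives_per_20(alpha=1 - confidence)
    print(f"{confidence:.0%} confidence: ~{rate:.1f} false positives per 20 A/A tests")
```

With no real effect anywhere, the output lands near one false positive per 20 tests at 95% confidence and near four per 20 at 80%, which is exactly the trap Rod describes.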

12:35 MH: I agree with that. And before we get too much further along, Tim, I also need to go on the record as also not liking "actionable insights." [laughter] There we go, all four of us now don't like it. If you say that in our presence, you're gonna get smacked in the head by Moe.

12:55 MK: I’ve got a question, Rod, which is about… And obviously, the other guys will have some thoughts on this as well, when it comes to working with clients, I’m just really interested to hear how you manage that. Because I’ve seen written into contracts, “This agency: We’re gonna deliver eight to 10 actionable insights every quarter.” And it’s like, “Well, how?” You might be able to provide analysis. Yeah, it’s this… I’ve seen it in contracts, I’ve seen it in my own performance plans about how many actionable insights I’m gonna deliver each month or quarter or whatever. How do you manage that? Do you just go along with what the business says or do you try and educate them? Or how exactly do you approach that?

13:39 MH: I’ll just say, we don’t write that into our contracts, I don’t know about you, Rod.

13:44 RJ: No, there's no way in the world I'd wanna put in any form of KPI or obligation for us to do that, because I think that'd be setting ourselves up for failure. My job really is to be there as a facilitator, and my team's as well, in terms of how they deal with their clients. Our job is to pose questions and, if anything, the value that we bring to the table is to help our clients ask better questions, because it would be very arrogant of me to say that we understood a client's business better than they did themselves. You would know, when people started bandying this around a few years ago, the notion of the HIPPO, the Highest Paid Person's Opinion, always winning. It was a great story, but often what you do find is that there's a lot of implicit knowledge in people who have been professionals in their industry, and we need to respect that. Our job is to take those implicit learnings, those gut feelings, and turn them into a structure that we can then use to help them realize whether something is true or false on a scale of probabilities. It's a facilitation role that I see. I don't know the business; I can't come in with a magic hat, wave a wand across the top of it, pull a rabbit out and say, "There you go, there's a really nice shiny rabbit for you to go and take away and play with." It doesn't work.

15:01 TW: I wanna make sure we're… We ultimately do want to help whoever our stakeholders are make decisions and take action. The way I feel like we're framing it is: when they're asking questions based on what they know from their experience, based on whatever data, whatever we've brought to the table, I feel like it's totally fine to say, "Okay, let me try to play out the different ways that the answer to that question might turn out, and make sure that there are different paths you would take; that we can frame a couple of different potential answers to that question that would lead to different behavior." Otherwise, what are we doing? If no matter what the answer is, you're not making any change.

15:51 TW: That's where we're driving to the actionable. Let's have some focus around what we're asking and what we're delivering. Answering those questions sometimes may deliver an insight, sometimes may not. I'm getting a little squeamish on the… I don't wanna go too hard down the path of, "Yeah, yeah, we don't wanna be on the hook for delivering insights." I think that's okay if we have a high bar for the definition of insights, but we don't want that to get interpreted as, "We don't wanna be on the hook for helping you actually make decisions that move your business forward." It's just that the decisions that move the business forward aren't necessarily 100% driven by true, pure definitions of insight. Does that make sense? I'm trying to wrap my head around it.

16:43 MK: But so how do you handle that?

16:45 MH: I think, Tim, this might correlate, because I always look at it as the pressure has to come from both sides. Moe, you said something earlier in the show where you were like, "I'm creating these insights and they tell me, 'You can't make a recommendation. That's the job of this team over here.'" And the reality is, in a perfect world, that team is coming to you and saying, "Here are all the things we're trying to figure out or wanna learn. Can you help us understand what those are?" And then they're trying to do a bunch of actions and you're helping them figure out which are the best ones to take. And insights are happening along that path, so both sides are pushing equally towards that. In your example of trusting the valuable intuition of that HIPPO, in the same context, they're bringing that domain expertise, the knowledge of all those years of experience, and we're bringing our analytics data and experience. That equal pressure then creates the construct that allows an insight to be generated, or a capability for recurring insights to happen.

17:52 MH: And actually, what I'm liking about the Gary Klein definition you stated earlier, Rod, is that unexpected shift in the way we understand things, because that's the thing that happens so often: a reframing of how the question is even being asked. A marketing executive comes and says, "Hey, how do we get more money out of our email channel?" But actually the insight might be to reframe that and say, "We might not want more money out of email, because that'll lead to unsubscribes. We may have an opportunity to increase our paid search to accomplish the same goal, but outside the channel." Those are examples of how reframing the question leads to an alternate view, which allows for a different problem set or solution. I think that's neat to think about as well.

18:46 RJ: I think it's a very important point that you're making there, Michael, particularly to do with the idea of framing and the relationships between the moving parts of any given business. When we look at a problem, we have a mental frame that we're bringing to the table. If you're thinking about it as a conversion problem, for argument's sake, you may very well be seeing it as a landing page test or something similar. If you're looking at it from a systems level, let's take the email campaign: if we decide that email works for the business, we might say, "Look, let's push that lever all the way up to max and send an email every 30 minutes." Obviously, in the long term, that will be a detrimental thing to do. Putting it into a systems frame, where we start to think about the relationships between things, changes that perspective and what we then take back to the client. Now, we may approach that from, "Oh, yes, we know email works," in essence using what would be called abductive reasoning.

19:47 RJ: I don't know if you know the difference between inductive, abductive and deductive reasoning. Deductive reasoning is where A plus B is always gonna equal C. Inductive reasoning is where we design a hypothesis and look for evidence that supports it; we then execute on that to become probabilistically certain about something. And abductive reasoning, which is what most of us as analysts are doing, is finding the most plausible explanation given the facts at hand. It does, I think, help to understand what framework we're using when we start to approach the problem of developing insights. Now, that then leads us into how we change frames. For argument's sake, you might use de Bono's Six Thinking Hats as a possible framework for idea creation, then exploration of those ideas, and then perhaps filtering those ideas. If you're looking for more of a critical thinking approach, Socratic reasoning and questioning is a very, very valuable tool that can be used to get to the crux of an argument and to identify whether things are purely speculative and false, or whether there is merit in the argument.

20:51 RJ: So it depends on your perspective, of course, but changing the frame changes the problem fundamentally. And that's something: when we look at the data, often we are looking at it from the perspective of, "I have data in front of me that is presenting a fact." But we really need to say, "Okay, given that, what do I then know, and how do I reframe that to better understand the problem before I make a decision about what I present back to the client?"

21:15 MK: How do you bring people on this… I’m thinking about how do you get people to be involved in this process, rather than that mindset of “The analytics team are over there. They dole out insights and then we go and do stuff with them.” How do you get people involved in that?

21:35 RJ: I'm in a privileged position because I've been a consultant for a very long time. I've been in this industry for a very long time. I think it's 20-plus years now, which is crazy when I think about it. In that journey, we have built up a perspective that some clients accept, and we can go in there and say, "This is the way that we prefer to do things." It would be difficult, I think, for somebody starting out, because there's that classic 10,000 hours of experience; well, I've got my 10,000-plus. During that journey, you've done a lot of things that worked and more things that didn't, and you learn from 'em. My challenge right now, as the owner of a business that does analytics for other people, is to bring a team up on that journey. And the only way we can do that is through nurturing and mentoring, and providing the opportunity for people to explore as they progress through their careers, from the very basics of "How do I pull data and how do I visualize that data?" through to "How do I actually interpret and apply that data in a business context?" It's a real challenge to build a high-level analytics team. We can build a team that can produce numbers, and we can build a team that can produce PowerPoint charts. But it's a very, very difficult thing to build a team that can do, in essence, what a good analyst will do.

22:50 RJ: I don't know if that answers your question, but from my perspective, it's an easy thing to ask: how do I take a team and grow it? And certainly in your situation, Moe, at THE ICONIC there, where you've got a team that's already built around the notion of specialties, the question is how your specialty impacts somebody else's. Very tricky.

23:08 MH: This is the part of the show where I spend the next 30 minutes talking to Rod about that specific topic. [laughter] Because literally, this is what I do. It’s like, “Yeah. How do you build a team of amazing analysts?” But I will also say, Moe, you would think of yourself even in your current context as something of a consultant to your organization.

23:29 MK: Oh, for sure.

23:30 TW: Yeah. I think the two managers at agencies building teams took Moe's question and ran with the "How do I build the analyst mindset?" angle. And I think, Moe, your question was more around…

23:44 MH: Yeah. How do you…

23:45 TW: Well, Michael, the way you framed it, it's two parties coming together. And Moe's question is basically, what happens when the other party is not receptive? When they already feel like they know exactly what you're supposed to do, what your output's supposed to be. You were heading that way?

24:01 MH: I was getting there, but if you got something to add, Tim, go for it. [laughter] I’ve been around a long time.

24:07 TW: Well, to me, there is that level of needing to build trust. You have to build that base level of, "I know what I'm doing, I know the data, I can spit out the numbers, I can spit out the PowerPoints." But not let that go so far that… I've always felt like I have to express a little resistance, not saying it explicitly, but, "You don't know me well enough to know that that's not really the most productive way for us to go." So I'll gently say, "Well, I can do that, but while I'm doing that, can we also talk about this other thing over here?" Or, "It sounds like you are saying X and Y." Which, if done well, and again this gets back to, I think it would be hard for me to do when I was 23, 24, 25, just looks like an active listener who is engaged with their problems. They just think I'm really engaged, trying to have a discussion, when what I'm really doing is trying to bring them along with me to where the data I can go look at actually aligns.

25:19 TW: It's literally this weird balancing act, walking along. My favorite anecdote from when I was at a creative agency was when the account team went out to kick off with a huge retailer, and they came back and told me, "You're gonna love these guys, they are all about the data. Look, we even brought back this binder of reports that they had, and we said that you would go through them and give them some insights. And we're also getting you a login to their Adobe Analytics, and we told them that you would go through it and see what recommendations you can make." I'd never had a meeting with the client, didn't know what the issues were with their implementation, and that was the situation where they were like, "The client's super excited about data." And that almost terrifies me, because it generally means they're excited, but they're not necessarily approaching it the right way. They're like, "I can produce reams of deliverables internally and produce charts," but they're not necessarily approaching it with good, deep rigor. And I'll finish those 17 points and pass the baton to somebody else. [laughter]

26:31 RJ: [26:31] ____ may jump in. The model I use when I try to work out what's effective is about constraints. What I mean by that is, if I'm sitting with a client and they're saying, "Here's what I'd like to achieve," let's say, for argument's sake, it's the optimization of a form or some other process. Let's take insurance; insurance is a good one. An insurance form for, say, life insurance may be 10 pages long. Now, one of the constraints we have is that we can't really optimize the form, because there are certain underwriting questions that need to be asked in order for the business to make an effective underwriting decision. We may have the latitude to change the layout and order of the fields in the form, but we can't change the contents of those fields. That's a constraint we need to be aware of, so we narrow the scope of the discussion to: what are the things you can actually change, and which of those are likely to get the best possible outcome for you?

27:29 RJ: Now, in some cases, the constraints are so tight that there is limited room to move, in which case my default answer is, "Perhaps we'll park that one and move on to something else where we don't have those constraints." Part of, I suppose, the wisdom that I've learned, if there is such a thing in my industry, is getting to know the client and what their capacity to act is. Often you're dealing with people down at the lower end of the organization, and they don't have the ability to, for instance, propose and execute a major project. In that case, one of the constraints is a decision-making constraint. That may very well be solved by getting executive buy-in, where you can perhaps go up to more senior decision makers and start to convince them of the need to invest in this particular problem. With a major retailer, for instance, changing checkout close to Christmas would be suicidal [laughter], so you wouldn't do that. You'd better know your constraints before you can start to identify the things you can actually do to make a difference for that particular company at that particular time.

28:34 MK: Sorry, and Rod, how do you go about… 'Cause I'm just thinking, and it was something that I actually wanted to touch on: when you're the analyst and people keep saying, "I want actionable insights," and you keep doling them out and they keep doing absolute squat with them. It's really fascinating how you frame that around constraints. When you meet with people, how do you go about really uncovering those constraints? Obviously, active listening and… The analyst doesn't always have that luxury of getting to sit with the decision makers. I've been asked to do analyses before when you haven't even met the client, you haven't even had a conversation with them or know their email address, and it's like, "Here, give me some actionable insights." How would you tackle that if you were the analyst?

29:25 RJ: General rule of thumb is, you got two ears and one mouth and you use them in that proportion. I think there’s a red button there…

[laughter]

29:32 MH: Good advice.

[laughter]

29:35 RJ: I'd start with the listening. Active listening is important, but even if you can't get to meet the client, you can find things out, and LinkedIn is a wonderful thing. LinkedIn is fantastic, and the amount of information that can be found out about a particular person and their role, from what they've published on LinkedIn, blog posts, Twitter, etcetera, can be absolute gold when it comes to understanding, firstly, what's likely to be in their interest, or their KPIs, and secondly, what their capacities to act are. If, for argument's sake, you're dealing with an email marketing professional, obviously their constraints are going to be limited to what they can send out in an email, the volume of emails, the segmentation of the list. Already we've narrowed the domain down to something like that, so our observations are going to be very much based around the things that are likely to be interesting to them to start with. Now, this is classic sales: if you've ever picked up a phone or had anybody give you a cold call and [30:31] ____, "I've got this wonderful thing, tell me about your company." Well, you've failed, because you haven't done your homework; what you need to do is understand, before you have that conversation, what the likely interest areas are going to be. Do the math before you have that conversation and, mind you, that genuinely makes for a much more successful outcome.

30:51 MK: Well, the good news is I am picking up some fantastic tips if I ever decide to set up my own agency, because they're just getting doled out right now. This is great.

31:00 MH: No, I concur, these are great tips, Rod, and I would say the same thing happens within an organization too; all the same things apply. You go talk to the merchandising team about something, you want them to buy into what you wanna give them, and you want them to tell you their problems and the problems they wanna solve. It's very much the same activity.

31:22 MK: Okay, so what about the action point then, right? I really feel like this is a couch therapy session for me, so this is great. But you go to all these meetings, you get the relevant stakeholders on board, you allocate the time, you do the analysis, you come up with some great gems, and then they do nothing. And then, two months later, they ask you for something different. I've actually been talking about this with lots of analysts lately, and I'm like, "Okay, so do you go back a week later and ask them what they did with the analysis you gave them?" Or… People can get really excited about stuff, and then, by the time the work's done, they've moved on to some other shiny object over there. I am interested to hear how you guys would handle that.

32:11 RJ: I could jump in again. I just see it as: if we haven't communicated well, then they may not have understood; that's the first thing. If they understood, then it's a matter of whether they accept or reject, or some gradient thereof. A client has to accept it and say, "Yes, I do accept that that is valid," and then, following that, "I have the capacity to act on it." Once we've got through that, it's a matter of commitment. Maybe they don't have the capacity to act now; maybe it's something to be used at a later date. Or, if they do have the capacity to act now but they're not acting, what are the barriers preventing them? That could be [32:47] ____; there may very well be inertia. Keeping a record of these, it's really a matter of saying, do they accept or reject the argument that you are presenting, first off? Often, in my experience, when these sorts of things drift, it's a polite form of "we don't accept." "We don't want that recommendation," and people don't want you to start questioning. "No, that's a piece of poo [chuckle], don't want anything more to do with that."

33:13 MK: So you think it’s more the “I don’t accept the recommendation” more so than the capacity?

33:19 RJ: It's in that communication. We can't be analysts without being good communicators, and that is a challenge for most people in our industry, because as an analyst you tend to be more introverted, you tend to be more detail-focused, you tend to like the working-out. But from the business person's perspective, they don't wanna see your working-out. If you're good at what you do, they trust you, and if they need to see the working-out, they trust that you can give it to them when necessary. But if you haven't persuaded them of the commercial imperative or the benefits, then they are likely to say, "Oh, okay. We'll just park that. That's in the too-hard bin for the moment."

33:54 MH: I do think there is the inertia; it's baked a little bit into human nature. I think the solution is still the same, regardless of whether it's that they don't accept it, or whether it's too hard and they don't wanna put the effort in to actually make the change. Any change they make carries a risk. If you do the same thing, you rarely get punished. If you make a decision and it goes well, you get rewarded, so there is upside, but if it goes poorly, you get punished, and you can't make a guarantee. As Rod was talking earlier, I heard the voice of Matt Gershoff in my head saying that where analysts misunderstand their role is that we are operating under conditions of uncertainty. That uncertainty can't be eliminated, and marketers and project managers don't embrace that either. So you've got that, and you're not comfortable living with, "Yes, I'm making a recommendation, and I, the stakeholder, need to invest effort in order to make a change, and there is no guarantee."

35:03 MH: The analyst won't promise me or guarantee this, so it's sometimes easier to not make the change. But back to the point of probing and asking and listening and putting it on yourself to say, "What can I do? Was it that I didn't convince you?" I mean, not saying it that way, but, "Hey, I understand we didn't make this change, and my assumption is there's a reason we didn't take that action, but for me to be better at my job, can you help me understand why?" And if it's, "Hey, we plan to, we're just heading into holiday, we have to do it in February," great, I'm gonna make a note on my calendar and say, "I'm gonna come back in mid-January," 'cause they don't really have any reason to necessarily remember it. If they say, "Well, we sort of thought about it, not sure we really bought it," then I'm gonna say, "What can I do to validate it further?" Then again, to Rod's point, it comes down to the communication and the listening. People aren't trying to do the wrong thing. They're not trying to undermine the analyst. But it just takes a tiny little disconnect and then nothing happens, and people get frustrated and are left scratching their heads.

36:17 TW: Yeah. It was a very big turning point in my analytics career when I realized that most decision making is emotional, not rational. And your data, as great as it is, is nothing against the story, the narrative, that's gonna do the persuasion or the communication of the idea. So having that communication skill, like Rod just said, is so undervalued in actually driving change. Then there's one other thing, which is, we're all going in under the assumption that the people on the other side actually wanna see good things happen. 'Cause there are organizations that are just stuck, and they are pretty okay with nothing happening. That's not a "you" thing, that's a "them" thing. And you have to learn how to identify those as well. But otherwise, yeah, good stuff.

37:14 RJ: I think your point about risk is really important too, because often I've found, in my experience, that the more senior the person you're dealing with, the more comfortable they are with ambiguity, risk, and uncertainty. When you have people that are more at the coal face, and this is very heavily impacted by the culture of the organization, the fear of failure is often a bigger driver than the rewards of gain. I think of Daniel Kahneman and his work in behavioral economics: loss aversion, where a loss is worth twice a gain. For every one unit of gain, if you lose something, it's worth two units. It hurts more to lose.
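For reference, the loss aversion Rod is paraphrasing comes from prospect theory. Here is a minimal sketch of the Tversky and Kahneman (1992) value function, using their published median parameter estimates (alpha of about 0.88, lambda of about 2.25); the function name and exact figures are illustrative, not from the episode:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Tversky & Kahneman (1992) value function: gains are damped by
    diminishing sensitivity (alpha), and losses are scaled by the
    loss-aversion coefficient lam (~2), so losses loom larger."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# One unit gained vs. one unit lost: the loss weighs more than twice as much.
print(prospect_value(1))   #  1.0
print(prospect_value(-1))  # -2.25
```

That lambda of roughly 2 is the "one unit of gain versus two units of loss" trade-off Rod describes, and it is a plausible reason a stakeholder weighing a risky change against the status quo so often parks the recommendation.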

37:57 MH: Wow, Kahneman. I think it was [38:00] ____ who was our last Kahneman referencer, possibly.

38:05 RJ: We could throw Thaler in now. Richard Thaler, he won the Nobel Prize.

38:08 MH: That’s right.

38:09 RJ: And I think we can learn a lot about what we do from the concepts in his book Nudge. I don't know if anybody's read that, but I would put a shout-out to that one. It's worth reading.

38:20 MK: I feel a little deflated after all of this talk and…

[chuckle]

38:29 MH: Well, we can fix that, Moe. Come on. Let’s do the analytics cheer [chuckle]

38:34 RJ: Yay.

[chuckle]

38:36 MK: The reason that I am feeling a little down after all of that is it sounds like, basically, to really nail things, I need another 15 to 20 years of experience and I need to go do some really intensive active listening courses. And it still doesn't help me tomorrow in my fundamental quest as an analyst. It probably helps me get a little bit of perspective the next time I'm talking to a stakeholder. But ultimately, when it comes to this term "insight," do I try and educate people on what an insight really is, or do I let them keep their own definition and persevere with my job? I'm having an existential crisis over here.

[laughter]

39:29 RJ: I don't think it's a warranted problem; I definitely don't think it's warranted. Knowing the factors that we're up against is half the challenge. Going in there with your eyes closed, thinking that if you just keep pushing harder it's gonna make a difference, is not a recipe for success. I think that even the simple concept of reframing, and starting to think through "What are the frames that I'm operating in?", is very empowering to people once they get it. Because if we look at this in a way that is all about "This is what I know and therefore I must apply it," we're not too dissimilar to that old proverb: when the only tool in your toolkit is a hammer, everything looks like a nail.

40:12 RJ: Suddenly realizing there are other tools in the toolkit, the spanner, the screwdriver, the drill, opens up a whole world of possibilities. At least, that's the way I saw it. This happened to me, I'm not sure how many years ago, when I just realized that what I was doing wasn't working with our clients: the arrogance of thinking I knew better than our clients because I had the keys to the kingdom with Google Analytics and Adobe and the rest of them. When that fell away, it was actually quite empowering to say, "Oh, my job actually is to interpret the data and to help our clients ask better questions." And that, I think, was a real turning point in my journey as an analyst. It took me a long time to get there.

40:49 MH: Yeah, but I don't think, Moe, you should feel bad at all, 'cause actually the funny thing is you're probably already doing a lot of these things. Maybe not at the scale you want to do them or the impact level you want to have. But neither are any of us. Am I hitting the scale I wanna achieve? Not by a long shot.

41:10 MK: Oh, great. So we’re all failures together [chuckle]

41:13 TW: No. No, no, no.

41:13 MH: No, not remotely. No, the idea is to keep building on these things. Every single thing I learn, I incorporate into the next time. The next time I step up to bat, I'm gonna have these three, four, or five more things that I'm gonna be able to leverage and use, kind of like Rod's toolkit, all the different tools I can pull out. And it works even on a small scale, when you have somebody walk up to your desk and say, "Hey, can you show me all the visitors who visited 15 times last month?", which is just a really weird question for someone to ask. Just by asking them, "That's an interesting question. Why are you trying to figure that out?", you're already driving one level deeper and helping them reframe, and then giving them access to everything you might know that would help drive their decision-making capability.

42:04 MH: Again, delivering an insight… And this is one of the things that can be frustrating: insights and actions are not all in the hands of the analyst, typically. In my career, we actually had a moment where our analytics team was suddenly doing most of the implementation or follow-through on the insights we generated, and we had to step back and ask, "How many projects can we effectively manage to drive the business forward and keep doing our regular jobs?" But what that did was create an organizational conversation about, "Okay, let's loop you into the IT prioritization process, the time frames, the windows the rest of the business is operating on, so you guys are in that." And we grew our ability to bring contextually relevant insights with all that information that we didn't have before. Again, it's a building process. You're probably just fine and, let's be honest, you're way smarter than, well, at least Tim and I. By the time you get to the point where we are, you're just gonna be unstoppable. It's gonna be ridiculous.

43:19 MK: Oh. This is a confidence boost.

43:23 TW: I completely agree with all that, but let me tie it together another way. Take just the insight. What Rod gave is a good definition, and it means an insight is a big thing that you're not gonna stumble across weekly or monthly. Okay, take that definition; that means you're gonna be infuriated by all sorts of people who are asking you for insights. Take what Michael shared as an example early on. He was like, "I just fucking deleted the word insight from the reports." And did he go have the battle about it? I don't think he went and announced it and circled it in red and said, "Notice: you guys are using this word wrong, and I'm right, and I'm taking it away." He just subtly…

44:06 MH: I was ready… But I’m also…

44:10 TW: You had the highlighter.

44:11 MH: But I'm also the Man of La Mancha when it comes to, or whoever, the Don Quixote… I'll tilt at those windmills, and it's to my own detriment.

44:20 TW: That's good, the novel and the musical. But that's my point: the stronger your framework, or frameworks, or your approach, that's the ideal. I've never had a client where I've gone in and said, "This is the ideal process; if we follow this, I'm confident everything will go awesome," and they've said, "You're the expert, I'm gonna do exactly what you're saying; you've completely changed my thinking." I've had exactly zero clients ever say that. That's never going to happen. But having that as my own sort of lodestar… Lodestone? Guiding star… Whatever, I'm mixing stuff up.

44:56 MH: North star?

44:56 MK: North star?

44:58 TW: North star? Lodestone? What does a lodestone do? Who knows. [chuckle] But having that ideal that I'm heading for means that, as I'm having interactions, no matter how frustrating, nudging somebody a little bit in that direction still helps. If I just get frustrated with them, we're gonna keep turning in the same spot and I'm not moving anywhere. And along the way, to Rod's point, you learn that some things work and some things don't when you're trying to steer people along, and different personalities, and sometimes there are lost causes. I can name the very specific people and say, "You know what? I've tried every fucking technique I can come up with. My patience is completely exhausted, and they are just not gonna get on board, and I'm gonna find a way to not spend time or energy trying to help them." But the good news is that they're kinda frustrated too, so they're not asking me for as much help.

45:56 MK: Okay. Basically, I need to be uplifted by incremental progress towards bringing people on an analytics journey. That's my summation of what the three of you have said.

46:09 TW: And keep drinking.

[laughter]

46:10 RJ: I would say, think of it as you always wanna grow. It's a profession. It's not something where you've just done a uni degree, walked out, and you're an analyst, that's it. You just get better and better and better at it, and I think that if you're constantly challenging yourself and asking, "What is my thinking on this? Am I right? How do I know that?", that self-questioning is part of it. And yes, that means a lot of introspection and, possibly, you get into a bit of the doldrums every so often. But it does take you along the journey, at least, toward the greater good.

46:41 MK: And ultimately that’s what Web Analytics Wednesday is for. To go and have a drink and talk about this stuff, instead of crying yourself to sleep.

[vocalization]

46:51 MH: Or any networking events or conference or podcast.

46:54 MK: Yeah, podcast for sure.

46:57 MH: Just to save all our listeners from having to write in on a specific topic, I’m just gonna come back to lodestone for just a second.

47:05 TW: Oh, dear.

47:05 MH: So, Tim, a lodestone is a naturally occurring magnetic rock, used in the first magnetic compasses. And I think what you were trying to get to was using a lodestone to create a compass to guide you.

[laughter]

47:21 TW: You've referenced that. 'Cause if I can only look for the North Star, then I'm only working at night, and frankly, some of my stakeholders work during the day, so a lodestone helps me make my little compass.

47:31 MH: There you go.

47:31 TW: Yes.

47:32 MH: Sorry for that non sequitur, but there we go. I feel like we haven't talked about the North American definition of insight from Christopher Berry as much, but that's also worth a read, so I'm just gonna give it a little shout-out. Alright, this is maybe one of my favorite shows we've ever done so far. And Rod, I certainly like the cut of your jib, sir. [laughter] No, I think… Some really profound things that you shared from your experience, and I think our listeners will get a lot out of it. One of the things we do on the show is a thing called the Last Call. It means we just go around and share something we found recently that we think might be of interest to people. I don't know who wants to start, but we'll just do Last Call with everybody.

48:21 TW: Rod, start.

48:22 RJ: Two things I was thinking of when you proposed this. The first one is a blog that Astrid Cena, who is one of my analysts here, put me on to a couple of years ago, so it's not recent. It's called the Farnam Street blog. The reason I thought it was appropriate for this episode is that it's all about thinking about thinking. The other one, which is a bit cooler, is a website called Windy, windy.com. If you wanna find out what the weather patterns are in your area, the wind, the ocean, the swell, the clouds, the rain, it's a fantastic visualization of data.

48:52 MK: Whoa, you’re not kidding.

48:54 RJ: That was also brought to me by Brandon. He’s one of my analysts here as well.

48:58 MH: Nice. I should try to get my analysts to tell me cool things I haven't looked at.

49:03 TW: I'll take one that is not particularly pertinent to the episode; I just backed into it, stumbling across it. Warning: it is Adobe-specific and it is definitely tactics-specific, but Trevor Paulsen and Jerry…

49:19 MH: Are you about to steal… Oh, you jerk. You literally stole my, you literally did.

49:25 TW: Outstanding. Trevor Paulsen, Jared Stevens from Adobe? Datafeedtoolbox.com? That's outstanding. I guess I could try to go with a different one. You want me to do a different one?

49:35 MH: You totally took it. No, keep going. I’ll scramble.

49:40 TW: It's only got a handful of posts, although they're posting fairly regularly, but it's about Adobe's data feeds, all the stuff outside Adobe's environment for actually getting data out…

49:52 MH: That’s your punishment for stealing my Last Call.

49:55 TW: Datafeedtoolbox.com is basically about the data feed, getting the raw hit data streaming out of Adobe, which Adobe doesn't really talk about that much. But Trevor and Jared are both at Adobe, both super sharp guys, and they literally walk through, "Oh, if you have the data feed and you're going into Hadoop, here is this thing you can do with it, with R… Or something else." My dog approves of datafeedtoolbox.com. Michael, would you like to just second my Last Call and see if you can do it without canines in the background?

50:28 MH: No, Moe, what do you have for Last Call?

50:31 MK: Okay. I found a really interesting post by a guy called Jeffrey Shaffer. He’s a lecturer and he’s built a tool that allows you to change the Tableau version of a report that you’ve built, which came in quite handy. For anyone that’s a big Tableau user, I find that one really interesting and we’ll make sure to put it in the show notes, but I also wanted to do a particular call out this month to any listeners who haven’t joined the Women in Analytics chat group on Slack. We have got some of the most fascinating resources and discussions going on in that channel and I just wanted to do a big shout out and say, thank you so much to the amazing community. We’ve got men, we’ve got women and the conversations have been terrific. If you’re not already signed up, I really encourage you to join the Women in Analytics channel on Slack.

51:22 MH: Very nice.

51:23 MK: And what about you, Michael?

51:27 MH: I was gonna have a really good one and, frankly, would have explained it way better than you did, Tim.

51:36 TW: Feel free to do that [laughter]

51:37 MH: Trevor Paulsen, who I actually had the pleasure of meeting at DA Hub a little while back, has a great blog that he and some of his co-workers work on. It is actually a really great blog that helps you go to that next level with any Adobe Analytics implementation; you're not really doing it right till you're working with the data feed. That blog is a really good one. There's a good amount of information out there in the GA 360 world on how to get data into BigQuery and that sort of thing, but not much on the data feed side of the world, so it's a welcome addition. Thank you, Tim, thank you so much.

52:15 TW: Anyway.

[laughter]

52:17 MH: Listen, if you need a lodestone for your compass, or you just need a conversation about analytics, or you’ve been listening and, like Moe, you need some encouragement that you’re on the right track with delivering insights to your organization, we’d love to hear from you. We’re available through the measure Slack, through our Facebook page and on our website at analyticshour.io, and we’d love to hear from you. I’m sure Rod… Rod are you on the measure Slack?

52:49 RJ: I am indeed. I’m not a very active participant though. I’m not very good with these sorts of things. It’s simply…

52:55 MH: That’s fine.

52:56 RJ: [52:57] ____ fairly limited.

52:57 MH: As long as people know where to find you, I think that’s the important thing and I think certainly you’ve got a lot of wisdom to share. Look him up and ask all your questions and then he’ll be like, “Man, why did you make me answer all these questions? We’ve got a business to run here.” [laughter] No, but we’d certainly love to hear from you. Rod, thanks again. Really loved what you shared. I think it was really excellent. As for my two co-hosts, Moe and Tim, everybody out there keep analyzing, get into those insights.

