#058: Analytics in an Agile Organization with Simo Ahava

FINALLY! It’s a show all about Google Tag Manager! Oh. Wait. What’s that? We had Simo Ahava on the show and actually covered a different topic entirely? WHAT NINNYHEAD APPROVED THAT DECISION?! Well, what’s done is done. With nary a trigger or a container referenced, but plenty of wisecracks about scrum masters and backlogs and “definitions of ‘done,'” we once again managed to coast a bit over the one-hour mark. And, frankly, we’re pretty pleased with the chat we had. You’ll just have to go to Simo’s blog if you’re jonesing for a GTM fix.

Mentioned in This Episode…


Episode Transcript

[music]

0:00:05 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at Facebook.com/analyticshour, and their website, analyticshour.io. And now, the Digital Analytics Power Hour.

0:00:28 Michael Helbling: Hello everyone. Welcome to the Digital Analytics Power Hour. This is Episode 58. Raise your hand if you wanna be more agile. Now, slowly reach back behind your head, and grab your other hand behind your back. Okay, no, seriously. If you’re driving, you should stop this right away. This isn’t the agile that we’re talking about. On this episode of the power hour, we’re talking about how more and more organizations are changing their development efforts to agile. And as analysts, we need to operate effectively within that kind of model. There’s only one problem, or maybe there’s many problems. But one of them is, agile is often spoken of, but not necessarily always followed in the same way across multiple organizations. So what’s really happening? So in this sprint, we’re gonna need someone to help us groom the backlog. So, Tim, my co-host. Hello, Tim.

0:01:26 Tim Wilson: Hey Michael.

0:01:27 MH: Hey. And of course, myself. We needed a guest. We’re not developers. Well, Tim, you’re sorta turning into a data-sciencey type of developer with R, but we need someone to help us figure out agile. So we are pleased to welcome our guest, Simo Ahava. If you have ever searched for an answer…

0:01:47 TW: Should we pause for the applause to die down?

0:01:50 MH: Oh, yeah, that’s right. Ahhhh… Ahhhh… No, it’s good. So if you have ever searched for an answer to a Google Tag Manager question, chances are you have already read Simo’s writing. He is a Senior Data Advocate at Reactor, and prior to that he was also an analytics lead at Net Booster. He is the guy with all of the GTM answers, but he’s also an advocate for agile in analytics. Welcome, Simo, to the show.

0:02:22 Simo Ahava: Thank you so much for having me. I’ve been waiting for this for a very long time. I just love this chat I’m gonna have with you guys.

0:02:30 TW: Well, don’t screw it up.

0:02:31 SA: I won’t, I promise.

0:02:33 TW: This could mark the point, years hence, that you look back and say, “Wow, that was the peak, and it was all downhill from there.”

0:02:39 SA: I’ve had a good career. I’m fine with that.

[laughter]

0:02:41 TW: Okay. It’s been a good run.

0:02:42 SA: It’s been a good run.

[laughter]

0:02:46 MH: So I’ve already expended my knowledge of agile in the intro, with words like ‘backlog grooming’ and ‘sprints’, but why don’t we kick off with: what does it really, actually mean to be agile?

0:02:58 TW: Or actually, what does it mean and how does it get butchered and bastardized? As soon as somebody says, “There’s the scrum master,” does that mean they’re doomed to fail?

0:03:07 SA: The definition is tricky, right? There’s so many ways to approach it. I guess there’s so many different ways of doing it. I think the easiest way to start with defining agile is just to go to the semantics of the word itself. Like, when you say something’s agile, what do you think about? You think about ninjas, and nimble things, and real dexterity stuff, and awesome reflexes and fast reaction times. And I think that’s all part of the paradigm. When we talk about agile in terms of development or any kind of organizational effort, we’re talking about really fast reaction times. We’re talking about being able to react to the kind of flux that the digital world is in. We’re talking about being able to bounce ideas quickly and validate them, whether for good or bad. It’s kinda sophisticated second-guessing. And we’ve been given the permission to do so by organizations that are typically very, very careful about where they spend their budget and their time. So agility is very… I think it’s a good word for what we are shooting for. But it’s very difficult to pin it down as one certain set of principles. But there’s this kind of underlying theme of agility as something fast and something very kinda ephemeral, in a way. That’s what it’s really all about.

0:04:24 MH: So how do we get from that concept to the very formalized agile methodology? I guess, which came first? Did people say we just need to be more agile, and then slowly that sort of evolved into “this is exactly how, this is the prescriptive way”? Or is it acceptable to say we can adapt it, as long as we’re being agile? Are there a couple of cornerstones and underpinnings, where you can adjust in these ways, but these are the things that absolutely mean you’re not truly being agile, even if you are having a daily standup and you have somebody called a scrum master? How rigidly is the label of agile applied?

0:05:05 SA: I think it’s applied rigidly when it shouldn’t be. I think your second example was a good one. There are official efforts at standardizing agile into these frameworks like scrum or lean. And that’s fine. I think it’s very important to have a formal set of instructions, for example. It makes it easier to adopt something if you’re completely unfamiliar with it. So for example, I’ve gone through scrum training; I’m a certified product owner, I think was the title. But I don’t see that training as being what turned me on to agile today. One reason why agile has become so popular, and why it’s always been so popular, and I’m talking about the history of software development, is especially in cases where we have all this public backlash towards the waterfall model and its predetermined steps and very rigid development models that, I think, over and over, are being proven very ineffective, and yet organizations still persist with them. And I think agile is partly a reaction to the ineffectiveness of these models, which are very, very deterministic. You have these very explicit steps. For example, you have planning steps, which feed into development steps, which feed into testing steps, which feed into user feedback steps. And these steps are these huge, monolithic blocks that have entry and exit conditions.

0:06:34 SA: And I think, at least in Finland, for example, where there’s been so much talk about how public IT projects are just horrible, because they’re all… The RFPs for them have this, “Here’s an Excel of 1,500 rows of features that we want your tool to be able to provide for us.” And then these vendors who wanna win the case just put, “Yes, our tool can do that, yes, our tool can do that, yes, our tool can do that,” for 1,500 rows. And then, skip ahead three years. The budget’s blown over by 3,000% and you’re left with a bunch of features that nobody’s actually using.

0:07:09 SA: And I think once that happens often enough, and it seems to happen all the time, at least in the public sector, I think the reaction is very generically moving towards a more introspective and retrospective approach to development, where instead of doing these huge, big, monolithic steps, people want to try out what it might be like to start working with prototypes, and develop those prototypes iteratively, instead of fixing everything in one go. I think as a reactionary movement, it’s very easy to slip into an agile mindset without even knowing anything about the actual frameworks that underlie agile. And I firmly believe, and I hope we can talk about it today, that there are ways to be agile without forcing your organization to go through this really big, fundamental change. You can adopt certain ideas from agile methodologies into your everyday work, and your team’s work, very easily, without disrupting something that in reality might require, for example, shareholder approval, if you’re trying to change your entire organizational matrix into something more fluid, for example.

0:08:17 MH: No, I think that’s good, and it’s certainly… There are aspects of agile that fit very well in the world of a digital analyst, especially an analyst who is trying to grow and build an understanding of analytics within an organization that they’re working in. So, these sorts of iterative steps, or rapid prototyping concepts, are ones that you actually see people using. And kind of to your point, it wasn’t like a big rollout of a methodology across the enterprise; it was a person trying to make small successes happen to grow a program, or things like that.

0:08:54 SA: It’s very interesting with analytics, partly because, when we talk about agile in software development, we’re talking about IT and the development engine. We’re talking about a vertical within an organization. And it can be completely removed from marketing, or from PR, or from internal development stuff and research. It’s like a vertical and it can exist in its own bubble. I’ve seen organizations which have a very agile software development engine going on, but the company itself is very rigid in its formative ways, or formulaic ways.

0:09:29 SA: Whereas with analytics, once you start thinking of analytics as a similar vertical, you’re talking about silos, and for analytics, I’ve always considered that to be very detrimental. To consider analytics as just one part of an organization which needs its own dedicated resources. You can do that, and I guess quite a lot of organizations actually do that. They’re, in a way, forced to do that, because they need job descriptions, they need budgets, and they need budgets directed at a department. Whereas analytics, in my view, the only way it works in an organization is if it freely flows through all these different departments.

0:10:04 SA: So the big problem with an agile analytics model comes when you try to build the same kind of a data organization with both agile departments, such as software development and something more rigid, such as business development, for example, or the sales process, which might have very, very… It’s difficult to be agile in sales. It works, but it’s difficult as a concept, because you have things like RFPs and then you have negotiations, and so on. So, with analytics, it’s not just about being able to implement an agile process for doing analytics. It’s also about how to be as flexible as possible, and still be able to produce the best data for all these different requirements.

0:10:43 TW: I’m a big proponent of… and anybody who’s heard me rail about hypothesis validation and performance measurement knows this. From the silo of analytics, a challenge I will have is that a stakeholder will come to the analytics team and say, “Please analyze the website.” Or, “Please analyze this campaign.” And the shift that I’m often trying to get them to make is saying, “Let’s break this down into the hypotheses that we wanna validate.” And I honestly don’t wanna… I don’t wanna go and sit in my little office for two weeks and then come back with a 45-slide deck that might be really well formed, but where I’ve got five hypotheses that I’ve validated and need to walk you through. That’s just a challenge to communicate.

0:11:31 TW: Whereas I’m much better off saying, “What’s your top priority? What’s the big question you have right now?” Let’s maybe list multiple questions, but as soon as I have validated the logical first thing that is gonna inform other stuff, maybe I don’t need to present it to you. Maybe I can send an email that is just, like, four sentences and a chart. Because that may spawn more questions, which I’m okay with. I watch analysts who say, “I did the analysis, and they just had more questions and wanted me to do more. I’m never gonna be done with this.” And I’m like, “Really? That’s your reaction?” You’re actually engaging and partnering with them, potentially, in an ongoing back and forth. You have questions, we have answers, and we’re trying to drive action and manage that this is actually driving change.

0:12:15 TW: Is that sort of mindset in line with where you’re seeing it should go? ‘Cause that can be kind of… The challenge can be a rigid organization that says, “No, we have one release. We’re doing one update to the site this quarter. Therefore, we must do the monolithic analysis to support that. Or, we did one release, we need to do the monolithic analysis.” Because that can be very rigid, and I feel like I get stuck trying to say, “Let’s manage this in more bite-size pieces,” because you’ll be able to consume it, it’ll be more timely. And I am not asking you to kind of… It’s not one-shot, one-and-done; it’s kind of an ongoing thing. Is that… The way I’m articulating it, is that in line with kinda what you’re…

0:12:54 SA: I think the context you provided is very familiar. One of the first things that, in my view, agile suffers from is outsider… I don’t know the… It’s a… I’m gonna blame English not being my native language, but I’m gonna say outside contamination. So we have an outside influence on our process, and they swoop in with these demands that might not be in line with how we as experts have designed our development process.

0:13:25 MH: Outside being a client…

0:13:30 SA: Like a stakeholder from outside your team. It can be a client, it can be someone in your own organization if you’re a consultant. It can be… Typically it’s a client who wants something analyzed, just as a very stereotypical example. But I think everybody who’s worked with analytics knows the type. So somebody comes in and they have a requirement; they need data for something. And they might have a very broad definition of what they need, for example, “Can you just analyze a campaign for me?” And then they expect you to churn the information out for them. Well, you’re completely right, I think, in saying that it needs to be chopped down into pieces.

0:14:07 SA: As a consultant I feel obligated to… And this might be kind of naive, but I feel obligated to not just provide answers, but to actually provide the methodology I use to derive those answers. And maybe not give an answer at all, but just give the methodology. So what… The first thing I try to do with clients is first establish the fact that we have this engine that’s going on, and it’s an agile development engine. We have team members, both from the client and from Reactor.

0:14:36 SA: So our teams are completely co-ed. We have people from both parties involved on a daily basis. And if somebody comes in and tries to disrupt that harmony by adding their own priorities into the backlog, for example, and trying to give me orders, because analytics should be done as kind of a reporting machine, the first thing I would try to do is establish the fact that we need communication going on here, and anything the outsider or the stakeholder wants to add to the backlog needs to be added to the backlog in the very same way that any other development task is. So: turned into a use case, chopped down into a piece that can be done with a reasonable amount of time and effort.

0:15:17 SA: And the second thing, which I think leads to a broader discussion of KPIs and analytics, what the purpose of analytics is, is this kind of Socratic method of opening a dialogue with the person and trying to get them to articulate their request in a more meaningful manner. So instead of saying, “Please analyze a campaign,” they probably have something in mind that they need that analysis for. Maybe it’s a new… Should they go with CMS 1 or CMS 2, or should they go with feature X or feature Y? And try to get them to articulate that in terms that a developer and an analyst would understand, and then maybe do the analysis with them.

0:15:57 SA: And now we come to a situation where I think Reactor as a company, for example, differs from many others in the consultancy space, in that we found that the only way to do agile properly is to actually infiltrate the client’s premises. So we physically exist in their offices; we have a room there where there are chairs for both our employees and the client team. So we’re with them on a daily basis, instead of, for example, just taking an assignment, going to our main offices, working in isolation for a while, and then returning with a report.

0:16:27 SA: One thing I’ve learned as an analyst is that if a client asks me to analyze their data, it’s a very, very strange dynamic, because they are the business owners. They understand the business better than I can ever understand it. And if they ask me to be in charge of their most important asset, which is the data produced by their business, I find it very strange for me, who has no business knowledge to start with (I haven’t been to any business school), to be telling them anything about things like revenue or turnover or profit.

0:17:00 SA: But what I can do as an expert is share with them the methodology I would use to do the analysis myself. At which point, we come to the junction of, “Should they do it themselves?”, where the typical response is, “It’s not part of their job description; that’s why we hired you as a consultant.” And then I just tuck my tail between my legs and do as they ask, [laughter] because I still need the dough. Or, should I do my best to make myself redundant as a consultant? Because I really feel that whenever, for example, a client says, “Hey, thanks for helping us with analytics, and I think we’re now ready to hire someone in-house to do this,” I’m really, really happy for them. Because that means they’ve taken whatever little I’ve been able to provide them seriously enough to understand that the skillset I provide is actually quite manageable by anybody worth their salt. And the one thing that an in-house analyst can bring to the equation is the knowledge of the business context they’re working in.

0:18:00 SA: And then, just to draw a bridge back to agile: when you have these resources in-house, and you have people, stakeholders in-house, who now understand how to pose the questions in a way to get the most meaningful answers, it becomes very easy for them to slip into the agile development process we’ve implemented, because the entire process revolves around transparency, and the ability to chop really large, unmanageable requests into smaller parts and pieces. And they’re able to add their own incomparable business knowledge to the mix, which we as consultants will never be able to do, because we don’t live and breathe the company the same way they do. Even though we try to, we’re always gonna be outsiders. And with data, I think it’s so important to have an inside track when a company wants to be serious about the analytics they wanna do. I’m not sure if I’m answering any of your questions; I’m just going on tangents here.

0:18:55 MH: No, I think those things resonate, probably with both of us, in terms of, in analytics specifically, it is often this combining of two different domains of expertise to bring together what the business needs, right? A lot of times, I’ll tell an organization, “Listen, this is where I’m good. This is where you’re the expert. You tell me what you’re trying to do, and then I’m gonna bring you the data and analysis that can start to help solve these problems.” We’re all consultants here, so we’ve all been in that position at one point or another, where you bring, “Okay, we looked at your data and we see this, this, and this.” And they’re like, “Well, yes, if you had known these things about us, then you would… ” And it’s sort of like, “Well, that’s awkward.” So to avoid that…

0:19:45 SA: A lot of times it’s been, “Well, this is why I wanted to have the conversation with you.”

0:19:49 MH: Yeah, exactly.

0:19:50 TW: Did you really expect me to know that? I am two weeks into working with you.

0:19:55 MH: Yeah. Yeah. And that’s how you grow. It’s like, “Okay, so let me pull that thread and pull in the information, so I can understand you better.” But to your point, you’re never gonna understand and own the knowledge of the business like someone who’s been there. And that’s why it’s so important to build sustainability, to encourage organizations to build this in-house. So I wonder if it’s okay… I’m gonna switch gears a little bit, because I think there are two different contexts for the actual exercise of agile methodologies in digital analytics, from an implementation perspective. One is the initial implementation, and the other would be the ongoing maintenance and addition of analytics tagging and measurement technologies over time. So I think, maybe just start with that. How would you break those down, and what tips would you give to people if they wanted to pursue being more agile in either of those contexts? ‘Cause I think they’re different enough from each other that they might require a slightly different thought process.

0:21:03 SA: Yeah, that’s a really interesting premise. The implementation, the initial steps you take at a client. I think we all do the same things. We start with an audit, if it’s like a take-over. Not a hostile take-over, but we are taking over a project, or a client’s analytics account as a new consultant in the mix for example. We start with an audit and we look at what’s been done before. Maybe if we have a certain template for our own company analytics, or if we are moving from Adobe to Google or vice-versa, we might want to set up the first steps as a carbon copy of what they already have. Just to make sure that we don’t break anything on the way, and we make sure that data quality is comparable before we move on to our new process. And I think there’s very little agile about that. I think it’s a very typical first step. It’s information gathering. It’s analyzing the tags on the site.

0:21:57 SA: Of course, at the same time, if we’re also doing a migration of the technological assets, like the codebase and stuff, then that’s gonna be done in an agile methodology, so at the same time we can incorporate the analytics auditing into what we’re doing with the development process as well. So we are going over feature by feature; at the same time, we can go over feature by feature for the analytics tracking as well. And the more complex the original setup, of course, the more work you’ll need for that audit. My first steps in a project are typically quite solo. I don’t even dream of doing anything more complex than just auditing, and doing a verification, just putting down the marks I wanna follow when the actual project begins.

0:22:40 SA: And I think that goes on for a while. If it’s a completely new product, if we’re building features along the way, then it’s a bit different. Because then we can actually implement analytics, and continuous improvement of the workflow, directly into our analytics project. Now we’re kind of moving on to tips for how to be more agile in a development team, but that isn’t necessarily agile itself. As an analyst, one of the things we do is… in agile, there’s a concept of a definition of done. So basically anything we do, any feature we code, any test we run, any sprint we do has an exit condition, a definition of done, which is typically a list of items we have to go through, and each one of them has to pass with green for that sprint or feature to be successful. Typically, there’s stuff like, in a coding development role: we’ve run our unit tests. We’ve built the tests first, and then we’ve run them successfully. We’ve done integration with these APIs. We’ve done this and this. We’ve done user experience testing. The first thing I do in a project is I add, typically, just a bullet that says ‘analytics’ to the definition of done for the sprint. So whenever a feature is developed, whether it be at the start of a new project or during continuous improvement, whenever a feature is done, the discussion has to be had about whether and how we should track it.

0:24:03 SA: Is this important information that we will be sorry about in half a year if we never collect the data for it? And this discussion, if left to the developers alone, might be quite unfruitful, because they don’t have any incentive to track things that go beyond usage of those features. So, as a software development company, of course we do analytics to justify ROI to the clients, so we wanna measure that people actually use the features we create for their services and products.
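(To make that “analytics bullet” concrete, here is a minimal sketch of what a sprint’s definition of done might look like if you encoded it for a review script. The items and names are illustrative, not Reactor’s actual checklist.)

```typescript
// Hypothetical sketch: a sprint "definition of done" with analytics as a
// first-class checklist item, as Simo describes. Names are illustrative.
interface DoneCriterion {
  item: string;
  passed: boolean;
}

const definitionOfDone: DoneCriterion[] = [
  { item: "Unit tests written first and passing", passed: true },
  { item: "API integrations verified", passed: true },
  { item: "User experience testing done", passed: true },
  // The added bullet: the team has discussed what (if anything) to track.
  { item: "Analytics: tracking decision made for this feature", passed: false },
];

// The feature is "done" only when every criterion is green.
const featureIsDone = definitionOfDone.every((c) => c.passed);
console.log(featureIsDone ? "Done" : "Not done yet");
```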

0:24:30 TW: Can you think of an example of something where it’s not directly the feature? Like, the developers wouldn’t automatically say, “We wanna track it,” but it should be tracked anyway? I’m struggling to…

0:24:42 SA: So you mean like a part of a definition of done that doesn’t have to do with the features themselves, you mean?

0:24:47 TW: Right. Well, I guess, if I was understanding you right, you were saying there are decisions about what you’re gonna track, and there’s a set of that where you’d say, “Oh, we’re adding this form and there is a submit button, so we need to track the submit button.” But it sounded like you were saying there are sometimes things where you have to weigh the cost-benefit of tracking. It’s not an automatic “track it”…

0:25:12 SA: Oh, yeah. Right. Now I get you. Yes. Surprisingly often, we make the decision not to track something as part of the development process. When we talk about features in a development environment, we’re not just talking about things we see and things we do; we also talk about, of course, back-end improvement stuff. We talk about performance-improving stuff, and we do track those, but we don’t necessarily use marketing analytics platforms like GA for that. We have tools like New Relic; we have performance and stress testing tools. So the discussion is a bit different. There are some cases… If it’s a very complex site or a tool, or an app especially, the mobile world is different, it’s entirely possible that we don’t necessarily want to track the minutiae of every single feature we create for the platform. Partly because we know that being inundated with data is often kind of anathema for clients who want to be interested in analytics.

0:26:12 SA: When they open up GA, for example, whose UI is among the worst ever, [laughter] and they see the mess of reports and dimensions and metrics, and all these keywords everywhere, and there’s even some Flash things going on still in this day and age, and they see all these event categories that we’ve been tracking, and then they drill down to the… There’s a point where the really, really infinitesimally small feature tracking doesn’t really add any flavor to anything. We might still… And now we come back to the question of what to track. So we might still do a data layer update about it; we might still add that information to the page as a precaution, in case we at some point want to track this, but sending that information to GA…
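(Here is a minimal sketch of that “data layer update as a precaution” idea: the information is exposed on the page, but nothing reaches GA unless a GTM tag is later wired to the event. The event and property names are hypothetical.)

```typescript
// Hypothetical sketch: expose feature context via the data layer without
// sending anything to GA. A GTM trigger can be pointed at this event later,
// if a real business question ever comes up.
const dataLayer: Record<string, unknown>[] =
  ((window as any).dataLayer = (window as any).dataLayer || []);

dataLayer.push({
  event: "featureInteraction",    // no GTM trigger listens to this yet
  featureName: "quoteCalculator", // illustrative feature name
  featureVersion: "2.3.0",
});
```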

0:26:56 SA: Just as an example, we have one client who we’ve built a pretty sophisticated internal tool for, which they use to drive sales. So it’s a questionnaire that the salesperson fills in when they’re interviewing a potential customer, and once they fill in the questionnaire, on the basis of the questionnaire, it gives you a quote. And then you can go up and down with that quote if you want to add discounts to it. And there are dozens and dozens of questions under different subheadings, and there’s cross-referencing of those questions. So it’s almost like an aptitude test, where the same questions are asked in a couple of different ways to see if you’re lying, which is always a good way to approach a new customer.

[chuckle]

0:27:39 SA: At one point we were thinking, “How cool would it be to measure every single interaction in the tool and see how it impacts the actual final pricing?” One of the questions we had was, “If the salesperson answers these questions like this and gets a price of X, will they go back to the questions and modify them so that they get a different price that’s more agreeable to the customer?” We wanted to see if there were those kinds of correlations going on. But then we realized that, because we were working with Google Analytics at the time, it would create such a mess of conflicting signals if we were to measure every single question and interaction. So instead, in our definition-of-done discussion, we opted to just measure generic interactions with a set of questions: “Did the user interact with this set of questions?” We didn’t care what the answer was; we just cared that the interaction happened. So in that case, for example, we did decide not to go with the granular path.
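(As a sketch of that compromise: track that a question set was touched, not what was answered. The attribute, event, and set names below are invented for illustration.)

```typescript
// Hypothetical sketch: one generic signal per question set, instead of an
// event per individual answer (which created the "mess of conflicting signals").
const dataLayer: Record<string, unknown>[] =
  ((window as any).dataLayer = (window as any).dataLayer || []);

document.querySelectorAll<HTMLElement>("[data-question-set]").forEach((set) => {
  set.addEventListener(
    "focusin",
    () => {
      dataLayer.push({
        event: "questionSetInteraction",
        questionSet: set.dataset.questionSet, // e.g. "pricing", not the answers
      });
    },
    { once: true } // fire once per set per page load
  );
});
```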

0:28:37 SA: Now, what we could’ve done is use a tool like Snowplow, a tool which basically just collects data into an Amazon database, to be worried about later with your schemas and stuff. We could’ve used a tool like that and just collected every interaction, and just enjoyed the fact that at least we have them somewhere. And then later on, we can build the schemas and do the analysis if the questions arise. But the problem with that is, I think, it’s a false notion, in the sense that you still have to worry about data integrity at some point.

0:29:07 SA: If you collect all that data, at some point you still have to write the pretty sophisticated algorithm that tries to separate the meaningful bits from those that don’t have any meaning. And if you don’t have a relevant business question to start with… and now we’re kind of back to the agile mindset: every feature we code, every KPI we work against, needs to be reevaluated every sprint. That’s part of the definition of done; we need to know if we’re going in the right direction. And if we just start doing features or tracking stuff with the hope that at some point we’ll figure out a good question for it, we’re kind of wasting time and resources on something that might or might not ever take place.
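(For contrast, here is a rough sketch of the “collect now, ask questions later” style of payload that tools in the Snowplow mold use. This is a generic illustration, not Snowplow’s actual API, and the endpoint is made up.)

```typescript
// Hypothetical sketch of a self-describing, schema-versioned event: collect
// everything raw now, validate and model it downstream later.
interface SelfDescribingEvent {
  schema: string;                // the version pointer someone must maintain
  data: Record<string, unknown>; // arbitrary payload, interpreted later
}

function collect(event: SelfDescribingEvent): void {
  // Fire-and-forget to a raw-event collector (endpoint is illustrative).
  navigator.sendBeacon("/collector/raw", JSON.stringify(event));
}

collect({
  schema: "questionnaire/interaction/1-0-0",
  data: { questionId: "q17", action: "changedAnswer" },
});
// The catch, per the discussion: someone still has to write the logic that
// separates the meaningful rows from the noise once a question finally arrives.
```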

0:29:48 MH: I’ve never seen this done, but I’ve asked if it should be done before, and the way you’re just framing it… Like, maybe it’s very, very low cost to get all this detail on the data layer, and then you can selectively say, “Now it’s very, very easy if I want to track it.” Have you, or do you have thoughts on, sometimes saying, “You know what, this specific question came up. We have not been collecting the data to answer that. We don’t need that much; we need a week’s worth or two weeks’ worth of data. Let’s use Tag Manager, our tag management platform, to start capturing that data. We know it’s gonna be messier, more complex, but let’s capture it, but we’re gonna only capture it for two weeks. And then we’re gonna answer the question and we’re gonna turn it back off, because we think long-term that’s cluttering.” One of the examples at Superweek: the IIH Nordic guys were talking about a weather integration. And I asked them about it afterwards, and they were using it for more… But the way they presented it, they said, we wanted to know, for this client, did an increase in rain drive a change in traffic to the site, or conversions, or something?

0:30:55 MH: And they turned on the integration, they had the data getting pushed in, and they answered the question. And I was like, “Well… ” And in this case they were paying a subscription service to get the data, and I was like, “Well, gee, you answered the question. What are the chances that all of the planet is gonna start operating differently anywhere sometime in the next 20 years? Did you leave that turned on?” And it seems like there’s a tendency to say, “Well, yeah, we left it turned on.” Now, in their case they said, “Well, we were using that. We wound up using that integration for some other stuff.” But it makes me think that there could be benefit in saying, “You know what, we’re gonna temporarily… We have a question, we have to wait to gather the data, but companies don’t move that fast anyway. Let’s turn it on, deal with that messiness.” Or maybe in this case, you say, “We’re gonna shove it into… use the data later to shove it into a different property,” or something where we can say, “We’re gonna keep our core data pristine.” Does that approach make sense? That’s what I’m hearing in part of what you’re talking through, that seems like something that would make sense in some cases.
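(A minimal sketch of the “turn it on for two weeks, then turn it off” idea, as a date guard inside a tag. Dates and names are invented; in practice you’d also pause or remove the tag in GTM once the question is answered.)

```typescript
// Hypothetical sketch: time-boxed collection for one specific question.
// After the end date the tag goes inert, keeping the core property clean.
const dataLayer: Record<string, unknown>[] =
  ((window as any).dataLayer = (window as any).dataLayer || []);

const COLLECTION_ENDS = new Date("2017-03-15"); // answer-by date, illustrative

if (new Date() <= COLLECTION_ENDS) {
  dataLayer.push({
    event: "temporaryStudy",
    studyName: "rainVsTraffic", // the hypothetical weather question
  });
}
```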

0:32:00 SA: It makes a lot of sense, and I think one of the biggest difficulties of fitting analytics into an agile software development model is that data doesn’t work in sprints. The very notion of “let’s just measure for two weeks” is already very waterfall-y, in a way. You’ve established parameters before you start collecting the data, and you kind of hope two weeks is enough. And the problem with that approach… This is one of the things that I’m still having difficulties navigating around. Let’s say we use analytics for determining, for example, the weather patterns, and we get decent data from the period of time we were measuring, be it two weeks or two months. And now we enter the whole big can of worms of seasonality and historical data, and is prediction enough, or should we also… might we be able to simulate history as well? Because it’s very difficult to make any kind of decision without being able to contextualize the data in the business cycles themselves. So how fast does the business move? What is the typical conversion churn rate for customers?

0:33:07 SA: And that’s the big difficulty. How do we… If we use analytics to measure the ROI of our development efforts, for example, when can we say that we are now confident that our feature was well established and people are using it? If we create a new form on our website that’s used by millions of people on a weekly basis, and only three submissions come in, in the first two weeks, is that a failed feature? Did we do something wrong? Does the data incontrovertibly show that we did a sucky job? And that, I think, is the big difficulty of negotiating data with software development, because they go at such different rates. It’s like a drummer playing polyrhythms: he’s doing one rhythm with the bass drum and one rhythm with the snare drum. At some point, you know that the rhythms will converge, but as a listener you’re anxiously waiting for when that will happen.

0:34:02 SA: And the thing is, with analytics, that’s what we’re looking for. That’s why it’s so important to have that feature-by-feature discussion, because certain features just are so rarely utilized that two weeks is nowhere near enough for us to get enough data. And with the form that only had three submissions, for example, it’s possible that those three submissions were very, very good, like very good conversions. Those sessions were excellent, and we know that qualitative data can trump quantitative data any time of the day if it’s really, really significant. What I’m kind of advocating here is just to open the floor for discussion, and if you want to go the route of agnostic data collection, just gathering stuff and harvesting it… Because nothing’s as horrible for an analyst as the hindsight of “we should have measured that, yeah, we should have done that.” Because that’s a horrible thing to say to the client who’s asking for business-critical information.

0:34:56 SA: But my firm belief is that with this kind of definition-of-done discussion, for example, the risk of that happening is smaller, because we have that discussion all the time. Analytics is everywhere. In our daily meetings, we have a huge dashboard showing our current measurement points, for example, and the client sees it every day when they come to our team room as well. So they know that data is part of the lingo that we use in our development process. But we still try to keep things manageable with data, so that’s why I’m personally advocating for a more moderate approach to measurement. And just as an interesting tangent from this: a couple of years ago at Superweek, there was talk about this. There was a panel discussion or something, and they were talking about, “Should we measure everything and then worry about the questions later?”, in the context of big data and big-table queries. And Aurélie Pols, who’s this privacy advocate, had a very interesting point that I had never considered before, about data-retention policies. So there’s actually a downside to collecting all that data: you have to think about data-deletion policies.

0:36:00 SA: You can’t just collect stuff and expect to be able to harvest it for good. There are all these legal things you have to think about as well. So I think that’s only recently started seeping into talks when people are talking about these huge databases and data warehousing and big data. So that’s also one of the things we try to keep in mind. But with Google Analytics… I mean, mentioning big data in the same sentence as Google Analytics is kind of funny, because it’s just rarely clean enough or usable enough for that kind of aggregation. And with Google Analytics, of course, there are limitations as to what you can collect. PII, of course, is one thing, but there are also the hit limits and stuff like that.

0:36:40 SA: And as I said, the user interface, and the APIs, become very unwieldy if you have too much data. There’s sampling kicking in; there’s cardinality kicking in when there are too many rows in your data, for example. I’ve come to think that a moderate approach to data collection is typically the one where you still have the most power over data quality itself, when each piece of data that you collect is something you’re well aware of and have defined, as well as you can, as part of the development process.

0:37:09 MH: What’s great, Simo, is as you’re talking, you’re thinking about this on a level that, I think, probably Tim and I have encountered a lot in our careers, but we have not developed a philosophical approach to, necessarily. And so, we sort of have these hunches, and it’s really great to hear you enunciate it this way, because for me, I’m connecting these dots, like, “Oh yeah, that’s why this.” When I started in analytics, we were doing log file analysis with WebTrends, and if you didn’t create the relationship in the data as you started, you were not able to correlate those data points later in the reporting. So you had to know the end before you began, which, as an analyst, I’ve now come to realize, “Well, that’s why that really didn’t work very well.” ‘Cause how can you know when you’ll need to see what this means to that?

0:38:05 MH: And so, those kinds of things are these learnings that you build up over time, but it’s not necessarily one-to-one that you can put them into a framework that you then leverage in a more sophisticated fashion. And as analysts, that’s where I feel like analytics and agile, and you kinda touched on this, it’s sort of like, “How do we fit into this piece?” Because even if I go to implement analytics in a very sophisticated or complex way, how do I plan out sprints so that I’m doing things in the right order? Because in terms of the implementation, I tend to think about it from the perspective of just doing the implementation, but it needs to be chunked out.

0:38:45 MH: But there are strategies or reasons why you’d wanna chunk it out the way you do, because one will feed the other, and so on, and so forth. And then, iteratively, and we kinda touched on this, “Okay, well, we have this sort of navigation feature that someone’s interested in. Well, I don’t wanna measure this forever, so let’s measure it for a period of time.” But what’s the right period of time? Is it one sprint, or does it need more than that? And this also touches on, I think, for a lot of people, A-B testing, because test planning and then fitting that into an agile model, it can work, but it adds the next layer of complexity. So, anyways, this is very good. I’m sorry, I just went on and on.

0:39:27 SA: Yeah. No, that’s a very good point. The comparison with A-B testing is just perfect, because we’re working with the same limitations. We’re working with the same kind of paradigm: we’re working with a time box, and we’re working with data that we expect to be significant for our analysis. So we don’t have to actively do CRO or any kind of conversion rate optimization; what we’re actually doing is validating our features between a couple of options that might not have anything to do with conversion per se, but it’s just about our designers battling over which way we should go. One of the absolute best things that comes out of having the analyst be part of the development process… there’s this big marketing hoo-ha about tag management systems, the idea that you’re getting a fast track, a fast lane past the bottleneck that is IT, really results in…

0:40:21 SA: The whole idea of a tag management system, they claim, is about reducing friction, and you can finally get things done, when, in fact, that very statement is counterproductive, in that it creates friction, because you’re alienating one of the most important parts of your digital organization by saying you finally have a tool that makes them redundant. So, I’ve spent a lot of time trying to… I don’t have any personal political agendas or anything, but if there’s anything I would love to see change in the industry, it’s people just realizing how important a well-informed development unit is to any digital organization. And one of the good things of having an analyst in a development team, and it’s a two-way street, is, first of all, the team itself learns to respect the fact that everything they do turns into data, and that data can be used, for example, to hire more developers, or to let them go, or it can be used for things that are very, very everyday and important to the developers themselves.

0:41:19 SA: So they learn to appreciate the fact that they may not care about the data, but the client certainly does. And it’s a perfect way for them to also prove that they are meaningful to the client. And the very fact that we have a big dashboard in our development room, and our daily meetings start with talking about data, is making the developers more and more informed about the requirements of analytics, which then, in turn, turns into a development decision for them: should they go with framework A or B, when A is more beneficial for tracking, for example.

0:41:49 TW: I had that light bulb go off, or go on, I’m probably portraying that, [chuckle] five or six years ago, when I realized that… Well, somebody who was a developer pointed out that data collection in jQuery, in JavaScript, is not the sexy, cool thing that developers are drawn to, which I think is valid. But I also realized that, as an analyst, my deliverables were going back to the business. And it was this very easy thing to do, to take what I was already delivering to the business, take the extra 15 minutes, and go over to the developer and say, “Hey, you know that thing that you had to figure out how to… this non-sexy, kind of crappy thing you had to figure out to get this tracking in place? That enabled me to do this, which the business was really excited about, and it’s providing us this information.”

0:42:35 TW: And that was… You could literally look at the developers: you have that conversation once, and they are infinitely more receptive, ‘cause that loop usually doesn’t get closed. So, I think one of the ideas you’re talking about is a lot tighter linkage on an ongoing basis, as opposed to more of the waterfall: the business is involved in the requirements, we then do the development, and then the analyst goes and analyzes it and returns it to the business, and it doesn’t close the loop. And that’s just a way to make… It’s a people aspect; there’s a relationship aspect. But it’s kind of, “Hey, we’re all on the same team, break down the silos.”

0:43:13 SA: I think the feedback loop is important here, and what you did was a very agile and mature thing to do, because you had what we call in the industry a retrospective with the developers. You talked about what was done and what its impact was on both of your work. The one thing that I wanna pick up on there, as well, is that, like I said, it’s a two-way street. So one of the benefits of having the communication open with the developers is that you as an analyst also become more sensitive to what the developers are doing. And you learn to appreciate the fact that there’s a reason why they might be considered bottlenecks: there’s a whole mess of governance going on. And there are ways to navigate around that if you find the common goals, and when you start talking about it, you realize, “Hey, those are my goals as well.” So I get the feeling that this whole ill will between so-called marketing and so-called IT, these two warring factions, is just a myth propagated by consultants who wanna sell their services for more money, and they want to keep alive this kind of friction, because it’s a very…

0:44:16 TW: Okay, now, you can’t let these secrets out, we’ll have to cut this out. [laughter] So we continue to be the ones who swoop in and…

0:44:23 MH: Yeah.

0:44:23 SA: No, I just want to…

0:44:24 MH: They’re born enemies and only we can broker the peace between them, Simo.

0:44:30 SA: I feel like Edward Snowden right now. I’m whistleblowing the industry.

[laughter]

0:44:37 SA: The truth hurts guys.

0:44:39 TW: No, it’s good. And yeah, I have a lot of pet theories about why that happens, but your point is well taken. It’s a fun thing to say in a sales process, “Look how easy it is to tag your website,” but in actuality, when you go to do the work, having IT as a partner in that process is gonna be more benefit than harm.

0:44:58 SA: Absolutely.

0:45:00 TW: And that’s the reality of doing it in the real world.

0:45:02 MH: So, one of the things, Simo, that we did at Superweek a little while back: since you weren’t able to be there this year, but you are well known and loved there, we asked the audience what questions they had for you. So, we made them a part of the show. And so, Tim, let’s bring in some questions from Superweek.

0:45:23 TW: Okay, maybe we can alternate. First, introducing Ivan Rečević. I am sorry, Ivan, you can probably pronounce your name better than I can. So, we know that you two had discussions in the past, so we will now have Ivan ask his question.

0:45:40 Ivan Rečević: So, Simo and me last year, we were discussing English literature.

0:45:45 TW: Okay. This is gonna be so awesome.

0:45:49 IR: Okay, so he was into science fiction and he said he’d read everything. So ask him about Robert A. Heinlein, “Stranger in a Strange Land.”

0:45:58 TW: I’ve read that in full.

0:46:00 IR: Yeah, but he didn’t read it, ever.

0:46:02 TW: Ooh. So we’ll ask him if he’s read it yet.

0:46:06 SA: Not yet, but thanks again for reminding me. [chuckle] It’s definitely on my Amazon wish list, I will have to do a better job with it.

0:46:14 TW: That was absolutely… I point to that as probably my favorite sci-fi book as a child, and I like to drop in “grok in full.”

0:46:24 SA: Well, I’ve had lots of great book recommendations from Superweek. I started reading the Foundation series from Asimov after I talked with Matt Gershoff, for example. And seeing Daniel Waisberg’s sideburns… so there’s [laughter] definitely lots of good book recommendations at that conference, on top of all the other good things.

0:46:46 TW: That’s funny, on the flight, while I was still there, thanks to ebooks, Astrid Illum recommended Kim Lee Masterson, I’m butchering his name.

0:46:55 MH: Kim Stanley Robinson, I believe.

0:47:00 TW: Close enough.

0:47:00 TW: Really?

0:47:00 MH: Yeah.

0:47:00 TW: Okay. Not good, but I did get Aurora and I did… [laughter] It was interesting, ‘cause it was right as I was touching down in the US that I hit one of the first sequences in that book, when they actually arrive at the planet. So I will absolutely take book recommendations from fellow analysts; I’ve not gone wrong yet.

0:47:19 MH: All right. Well, the next question was from Ophir Prusak.

0:47:25 Ophir Prusak: So I’m curious, Simo, if you could change GTM in any way you wanted, what would you change about it?

0:47:34 SA: That is a very difficult question, because I’m so easy with GTM and I’m so forgiving of many of its flaws that other people might see, but I refuse to see them. I think I would turn to mobile, or put focus on mobile tag management, on the mobile world, a bit more than it is now. I know they’re doing a lot of work with Firebase containers and stuff like that. But personally, I would put it as one of the top priorities if I were in charge of the development team. I think there’s a lot of ground to be covered over there, and I think there’s also a lot of work to be done to figure out what tag management even means in the app world. It’s not self-evident, because there’s no such thing as a tag in the app development space.

0:48:17 TW: So mobile specifically, the native app side of mobile.

0:48:20 SA: Yeah, exactly. And also actually the hybrid part as well, because web apps aren’t exactly conforming to the frameworks that mobile devices work with, so there’s a lot of things to be done over there.

0:48:33 TW: Yeah. That could be a show all to itself right there.

0:48:36 SA: Yeah. Absolutely.

0:48:37 MH: So, Doug Hall, who was also there, we’ll play what he says first, and then I’m gonna pivot it into a question. So here’s Doug’s question that wasn’t really a question.

0:48:49 Doug Hall: Yeah, just real quick, I’m happy to be the custodian of the award now, and I no longer come second to Simo, as I did every bloody year before.

0:48:58 TW: Yeah.

[laughter]

0:49:01 TW: We’re going to turn that into a question. Take that, Simo.

0:49:05 TW: And now to make that a question: just how much are you torn up by the fact that you did not own the Golden Punchcard due to your shameful lack of attendance? Do you think you could’ve won it if you’d been there? That would make you sound like an asshole.

0:49:18 SA: Of course not, I would never have won it if I’d been there. I’m very, very, very happy for Doug; I think he deserved it so much. He’s been doing… He, and Conversion Works, have been doing a great deal of outreach, and any company that takes the time to write blog posts and create tools for free for the general public, just in terms of knowledge transfer, always gets top marks from me. And Doug is just an outstanding fellow, and I’m really, really happy that he won the prize. Maybe next year we’ll have a new showdown. There are always lots of great solutions, and I know Caleb was there presenting again; he’s always got good stuff up his sleeve as well.

0:50:00 TW: Yeah, Caleb did a GTM-driven survey where it was like, get the survey and shove the data into GA.

0:50:06 SA: And the amazing thing about the award itself, or the competition in a way, is to see just how much stuff people are doing out there that no one is talking about. There’s been some really, really amazing little in-house API stuff that people have been presenting that, if you were to monetize it, you would get a crazy amount of subscriptions, just because it’s so ingenious. But the very fact that we now have a venue to display them is something I think other conferences should pick up on as well. Just give nerdy analysts a chance to show the tools and projects they’ve been working on.

0:50:41 MH: Yep. And for reference, for our listeners: the Golden Punchcard Award is an aspect of Superweek where people are encouraged to bring solutions, it’s voted on there, and the winner is presented with the Golden Punchcard. So it’s a really cool sort of showcase, as we’ve been discussing, of neat tricks, tips, and technological advancements with the tool. So, just giving it some context. All right, our next question was from Kayleigh Rogers.

0:51:12 Kayleigh Rogers: What do you wanna learn about in 2017?

0:51:14 TW: Ooh, also very good.

0:51:18 SA: Well, this is one of the reasons I’m so happy to be at Reactor: it’s the kind of organization where there’s really no vertical advancement, no ascension up the corporate ladder or anything like that. The way we proceed in our careers is by learning new stuff, and there are always venues open for learning new stuff. I feel like in my two years at Reactor now, I’ve learned more about development, and project work, and consultation than I’ve ever learned anywhere else in my career. And in 2017 I’m just gonna try to learn new languages, new programming languages, try to learn new development methods, try to learn about project development. I’ve been building my own tools for a long time, for analytics and GTM, for example, and I’ve learned a lot about how to write software correctly through my work with Reactor, and I think it’s something that I plan on just learning more about, like how to do software projects. I think the career path I’m most excited about right now is to become a better developer.

0:52:21 MH: That’s awesome.

0:52:22 TW: It’s funny, I felt like I was… At Superweek, one of my presentations, I ended it by sort of imploring the analysts to pick something specific and ambitious and say, “I’m really trying to move towards that.” And I think that’s somewhat driven by working with a lot of analysts and seeing the ones who are saying, “Well, I’m just trying to learn how to answer the next request that I get, and I might have to go do a little research,” and I think that’s challenging. I think it’s almost self-selecting; anyone who’s listening to us drone on is kind of interested in pushing themselves, so good on you. The same thing is true for Superweek: there is not anyone there who is not saying, “I’m excited about this and I wanna push myself.” But I feel like there are too many analysts who are not.

0:53:09 SA: But complacency is a difficult thing to fight against. If you have no incentive in your professional life to let that leak into your own hobbies and your own interests, it’s a very difficult dynamic. There’s a saying that “You can only be a good developer if you actually love to code”, so you actually have to want to do it as a hobby, you have to want to be interested in self-development. I think it applies to any discipline…

0:53:31 TW: Analytics.

0:53:31 SA: Analytics as well, definitely. You have to love number crunching to have the incentive to become a better number cruncher, and you have to love implementation to become a better implemen… So I think fighting that plateau of complacency is the biggest obstacle that anybody has. And I think, as figures who are doing public outreach, like with this podcast, and with blogs, and books, and everything, part of our obligation is to help people break through those plateaus and find their inspiration.

0:54:02 TW: Wow. So we could end on a nice feel-good moment, but instead, we have our final question from someone where, it sounds like, I’m assuming you guys have had this discussion many times before. Since you’ve been a little prepped for it, maybe you’ll be able to smack it down. But we will go to the Googler herself, Krista Seiden, for this last question.

0:54:24 Krista Seiden: So Simo, I know this is gonna be your favorite question. Even though you don’t want to, how would you build a visual tagging tool into Google Tag Manager?

0:54:33 SA: How’s that for a loaded question? [chuckle] We had some really interesting discussions when we were both at the Loves Data conference in Australia. And I think we come from a… Our approaches are different. I try to… Maybe not different, but we have a… I try to promote a more technical approach to analytics, and I think Krista is more invested in analysis itself, and in being an analyst as a user of the data itself. And one of the talks we’ve been having is about GTM; for example, I would love GTM to have the best API in the world, and I would love them to focus on mobile development. I know this is very technical nitty-gritty stuff that nobody else really cares about. And one of the top things we were talking about: should there be a visual tagging tool, for example, where you have a website, and you see the website, and you click an element, and it creates a trigger for that element? You don’t have to do anything else. It’s like a WYSIWYG editor, in a way.

0:55:38 SA: Ironically, I think Google is already there: they have Optimize. Optimize has a visual interface for selecting elements and modifying them. The tech is already there, so if Google wants to pursue a similar helper tool for Google Tag Manager, all they have to do is bring a couple of technologies together and create a Chrome extension or whatever it requires. I've definitely been toying with the idea of some way to create CSS selectors, for example, just by looking at the site and clicking on elements. But I always stop when I figure out that it's just easier to learn how CSS works and write the damn selectors yourself.
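
A minimal sketch of that "click an element, get a selector back" idea might look like the snippet below. To be clear, this is purely illustrative, not a GTM or Optimize feature; the heuristics (stop at the first ID, disambiguate same-tag siblings with :nth-of-type) are just one of many possible approaches.

```typescript
// Purely illustrative sketch of "click an element, get a CSS selector",
// not a GTM or Optimize API. It walks up from the clicked element,
// stopping at the first ID and disambiguating siblings with :nth-of-type().
function cssPathFor(el: Element): string {
  const parts: string[] = [];
  let node: Element | null = el;
  while (node) {
    if (node.id) {
      parts.unshift(`#${node.id}`); // IDs are unique in valid HTML, so stop here
      break;
    }
    const tag = node.tagName.toLowerCase();
    const sameTagSiblings = Array.from(node.parentElement?.children ?? [])
      .filter((sibling) => sibling.tagName === node!.tagName);
    parts.unshift(
      sameTagSiblings.length > 1
        ? `${tag}:nth-of-type(${sameTagSiblings.indexOf(node) + 1})`
        : tag
    );
    node = node.parentElement;
  }
  return parts.join(" > ");
}

// Usage: log a candidate selector for whatever gets clicked.
document.addEventListener("click", (event) => {
  console.log(cssPathFor(event.target as Element));
});
```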

0:56:16 TW: I do think it was helpful with Optimize, when I was first playing with it, that you could use the visual tool, but then it would show you what the jQuery was. And you could tell, "Oh, this is sloppy, it's just physically moving something." So it seemed like a good tutorial. Somebody pointed out the danger when somebody says, "Oh, that's what I'm gonna use for my test," as opposed to saying, "This is my starting point, and now I need to lock that in so that it's a little bit cleaner." I'm realizing the very first time I ever saw the back side, or the inside, of a tag manager was Michael being at my house and showing me Satellite's visual tagging thing. That just took me back a few years.
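
Roughly the contrast Tim is drawing, shown as a hypothetical illustration rather than actual Optimize output: the generated variant physically nudges the element with inline styles, while the "locked-in" variant expresses the same change as a class the stylesheet owns. The #hero-banner id and hero--promoted class are made up for the example.

```typescript
// Hypothetical illustration (not actual Optimize output) of what a visual
// editor's recorded drag tends to look like, versus a cleaner hand-off.

// "Sloppy" generated variant: physically nudges the element with inline styles.
const hero = document.querySelector<HTMLElement>("#hero-banner"); // made-up id
if (hero) {
  hero.style.position = "relative";
  hero.style.left = "40px";
  hero.style.top = "-12px";
}

// Cleaner locked-in variant: toggle a class and let the stylesheet own the
// layout rule, so it can be reviewed and versioned like any other code.
document.querySelector("#hero-banner")?.classList.add("hero--promoted"); // made-up class
```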

0:57:00 MH: Yeah, that was four years ago.

0:57:02 SA: Yeah. This is actually the fifth year of Google Tag Manager; I think October is the anniversary, so it's interesting how far we've come. I think one of the dangers of these visual tools is that with every passing year the complexity of web pages increases: all these single-page apps with dynamically loading components that don't really conform to the typical page-load process, built with frameworks like React from Facebook, which is a very popular framework for building sites. When you're only messing with the DOM, the Document Object Model, with these visual tools, it just increases the risk of creating a dependency chain that's broken by a single update in the next sprint. That's one of the reasons I personally shy away from them and hope that more and more would be done through the data layer, for example, just to kind of have it fixed. But for prototyping, it's an excellent tool, just moving the elements around in the Optimize editor, for example; it's really interesting to see how it works. Then you can deliver the final draft with screenshots or whatever to the developers and ask them to do exactly that, but with better code.
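
The data-layer pattern Simo is advocating decouples measurement from markup. Here is a minimal sketch: the window.dataLayer array and the event key are GTM's standard convention, while the event and field names are made-up examples.

```typescript
// Minimal sketch of pushing a semantic event to GTM's data layer instead of
// relying on a DOM-scraping trigger. The `dataLayer` array and the `event`
// key are GTM's standard convention; "addToCart", `sku`, and `price` are
// made-up example names.
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

window.dataLayer = window.dataLayer || [];

// Called from application code when the action actually happens, so the
// tracking survives however the button's markup changes next sprint.
export function trackAddToCart(sku: string, price: number): void {
  window.dataLayer.push({
    event: "addToCart", // a GTM Custom Event trigger would match on this
    sku,
    price,
  });
}
```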

0:58:10 MH: Yeah. That's the same way our practice has developed around this as well; it's interesting. Those are the questions we had from Super Week, but I am sure, as you've been listening, you may have come up with some questions of your own, and we would love to hear from you. The best way to reach all of us is probably through the Measure Slack, also through our Facebook page, and of course our website, analyticshour.io. If you've been listening today, congratulations, you are now a certified scrum master. [laughter] Just send a self-addressed stamped envelope to PO Box-75733, Burbank, California, 91522, and you will receive your certificate in the mail in six to eight weeks. That's all not true.

0:59:00 SA: This is exactly how litigation starts in all the TV shows.

0:59:02 TW: Yeah exactly.

0:59:02 SA: When somebody says something like that, shit like that, the Scrum Alliance sends you a lawsuit.

0:59:07 MH: Yeah. All of that is not true, [laughter] but hopefully what you've learned in the show is some of the value of thinking in a more agile fashion. Simo, it has been a great pleasure and honor to have you on the show. We certainly learned a lot, and I love the way that you are now articulating, for a new generation of analytics people, a way to be thoughtful about this in ways that maybe Tim and I aren't able to say, 'cause we're not developers, honestly. And…

0:59:38 SA: You’re not developers yet.

0:59:40 TW: Yet, yes, yeah.

0:59:41 SA: It’s a matter of time in this industry.

0:59:45 MH: Hope for me is fading fast.

0:59:48 SA: Okay, so that's my objective for 2017: to turn Michael into a developer.

0:59:52 MH: Oh my gosh, all right. Just sessions twice a week, let's get going.

[laughter]

0:59:58 SA: Okay.

[laughter]

1:00:02 MH: There you go. Well, I think that just made a bunch of people jealous. Anyways, for my co-host, Tim Wilson, and myself, thank you so much. And if you're out there listening, remember: keep analyzing.

[music]

1:00:19 Announcer: Thanks for listening. And don't forget to join the conversation on Facebook, Twitter, or the Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, Facebook.com/analyticshour, or at Analytics Hour on Twitter.

1:00:39 Charles Barkley: So, Smart Guys want to fit in. So they’ve made up a term called ‘analytics’. Analytics don’t work.

1:00:48 SA: I mean I guess we all get those kinds of requests. But [laughter]

1:00:53 TW: Ah sure.

[background conversation]
[laughter]

1:01:04 SA: Oh my God.

1:01:07 TW: I appreciate that while you have a Scandinavian name, it’s actually one that I don’t think I’d butcher too horrendously when trying to pronounce it. Again, the listening skills, Michael. The listening skills.

1:01:19 MH: It's really early in the morning.

1:01:20 TW: It’s important if you’re actually working with people, you need to learn to listen. You said it was funny watching Michael and me bicker like siblings for an entire week.

1:01:33 MH: Oh, we are a couple.

1:01:34 TW: There is the genuine affection there, but there’s also, the let’s go after every little…

1:01:42 MH: I don't know. I felt like there was genuine affection until, like, after you were gone.

[laughter]

1:01:56 MH: So we’re so cutting that out of the show.

[laughter]

1:02:03 TW: Rock, flag and scum…

[background conversation]

1:02:07 TW: Rock, flag and scrum mastery.

[music]

 

2 Responses

  1. Boy, I never thought my name would be butchered like that!
    "Ora Lee Pauls," seriously?

    Yet I'm happy I made the great Simo Ahava think about data retention periods, hah!
    I think that's partially settled with limited look-back windows anyhow, and with the (sometimes debatable) anonymization engineering by our preferred tools.

    Maybe fines of 4% of global turnover or 20 million euros under the upcoming GDPR could allow us to move the conversation slightly further, certainly when considering agile environments?
    After all, if risk is increasing, vigilance and hopefully some kind of governance could be embedded as well.

    There are a couple of Rights in the GDPR text (here: http://eur-lex.europa.eu/eli/reg/2016/679/oj, read it) we might want to start thinking about, to rely less on what Google tells us we can do (in terms of PII, sic) and possibly start thinking for ourselves when addressing EU clients (see territorial scope, article 3, for starters).
    Stuff like:
    – the Right to Object to Profiling (article 22);
    – recital 30, talking about cookies and unique identifiers (pseudonymous data, a new category where privacy obligations apply; see article 4 concerning definitions, and you might want to revisit the notion of Personal Data as well);
    – the Right of Access by the Data Subject (article 15 is a funky one as well);
    – on top of which, rectification, erasure/deletion, and consent are things the digital analytics industry should start to think about.
    Deadline is May 2018, so less than 450 days. Don't say you haven't been warned 😉

    Not to mention the upcoming ePrivacy Regulation (now a regulation), the one about the cookie walls we all sweated over some years ago, which is being re-discussed after I got Slashdotted for writing about this back in 2009 while working with Eric at Web Analytics Demystified: https://yro.slashdot.org/story/09/11/13/1348222/breathtakingly-stupid-eu-cookie-law-passes.
    Possibly time to resurrect those thoughts that were deleted, as IMHO they remain valid.
    Why? Because conditions for consent with ePrivacy, for example, do not include "legitimate interest," unlike the GDPR.
    So remember those cookie cliffs? They are once again just around the corner, but with a bigger stick: 4% of global turnover or 20 million euros, whichever is higher. A quick calculation for Apple, for example, puts that at around 10 billion, I think in dollars. No wonder they ask what Data Protection Impact Assessments (DPIAs) might look like! Article 35, on DPIAs, is one to visit as well, certainly in light of agile development 😉

    Thoroughly enjoy your podcast, guys; keep it up!
    Kind regards from Spain,
    Aurélie
