#090: A New Paradigm for Privacy? with Sergio Maldonado

June 5, 2018

Put this in your pipe and smoke it: all of the tracking we try to do of people is actually technology designed to track content. And, even that tracking of content was a hacked-together repurposing of a system designed to deliver content. In other words, we’ve got layers of fiction upon fiction that we’re trying to muddle through (and, often, ignore) as an industry. The result? A ridiculous level of inefficiency whereby brands overspend to ineffectively reach their target audiences with direct response messages, and well-intended intermediaries grow their bank accounts. Ugh! On this episode, the gang invited Sergio Maldonado from PrivacyCloud (and, by day, from Sweetspot Intelligence) to chat about the broken environment we’re operating in, as well as how GDPR and financial considerations may just force us onto a path of shaking it up!

References from the Show

Episode Transcript


0:00:05 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe and the occasional guest, discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour, and their website analyticshour.io. And now, the Digital Analytics Power Hour.

0:00:29 Michael Helbling: Hi everyone, welcome to the Digital Analytics Power Hour, this is Episode 90. You know, long before GDPR, General Data Protection Regulation, in case you’ve been stuck in a well for most of 2018, people were still concerned about privacy. But how do we protect privacy while creating opportunities for businesses? This is a serious show with no jokes, I mean none at all. Because we’re talking about privacy and slightly more important, business and a paradigm for protecting privacy that could still impact business. Obviously, let me introduce my no nonsense co-host, Tim Wilson.

0:01:14 Tim Wilson: Hello, Michael.

0:01:17 MH: Hello, Tim. And my other very serious about business co-host, Moe Kiss.

0:01:21 Moe Kiss: Hi, Michael. I can’t even. [chuckle] I can’t. It’s just… I can’t be serious.

0:01:28 MH: We’re not robots. We’re just serious, it’s about being… Yeah. And I’m Michael Helbling and we’re excited about this show, but we needed a guest who could reflect the gravitas of the moment since we obviously cannot. Well, we found someone who just happens to be right at the nexus of these two business moments. Sergio Maldonado began his career training as an attorney specializing in ePrivacy. After founding a law firm in London in 2003, he started Divisadero, an analytics boutique in Spain. And then in 2012, he started Sweetspot Intelligence, but it is his latest startup that got us really excited to talk to Sergio right now. Because this year, the serial entrepreneur started PrivacyCloud. Sergio, welcome to the show, finally.

0:02:21 Sergio Maldonado: Thank you, happy to be here.

0:02:24 MH: Yeah, it’s great to have you on the show. Having known you for many years, this is a long time coming. I think what would be great as a starting point is to maybe back up and give people a little bit of perspective on what PrivacyCloud is and what problems you’re attempting to solve with it. And then let’s use that as a vehicle to jump into the topic of privacy and how to engage with it in our current time.

0:02:56 SM: So what’s PrivacyCloud? PrivacyCloud is about cutting out the middleman. We have consumers and we’ve got brands, and there are all of these creatures in the middle, and we believe it is time for many of them to get out, to disappear, because, first, it is not working, and second, it is illegal. That’s PrivacyCloud.


0:03:21 TW: So what does that mean? You’ve written a lot on it, you’ve got this awesome bar analogy, which I had to mention so that I’ll have an excuse to link to it in the show notes. But as a brand, I want to reach teenagers who are living in New Zealand without a middleman that somehow got a pool of teenagers in New Zealand. The teenagers in New Zealand aren’t necessarily saying they want me, this brand they’ve never heard of, to be able to reach them. So, what’s the high level? Without having some intermediary, how can that happen?

0:04:01 SM: Yeah. So, we may need Moe for this one. But assuming there’s about a million or half a million teenagers in New Zealand, how many could we expect?


0:04:14 MK: I’ve got nothing, I’ve got nothing.

0:04:15 SM: Alright, I’m sorry.

0:04:15 TW: You’re geographically closest and age-wise closest.

0:04:19 MK: I can Google.


0:04:22 SM: Yeah, so Tim, you’re trying to address half a million people. And to be able to do that, you’re gonna have to measure billions, and you’re gonna have to then shoot at millions just so that you can get to your half a million. So, we’re not saying it’s not working, it is working, but it is just not efficient, ’cause you’re spending so much money to, first, measure everyone that you possibly can and then to shoot at many more than those that you’re targeting. So, in the end, we think that if you took that money and tried to give it directly to that half a million people, to your teenagers in New Zealand, then they’d be better off. The problem is, how do you know where they are? And how do you get to them? But if you bear in mind that half the budget, half the brand’s budget, is just being retained by all of these brokers, and that there’s then the ad fraud, and then there’s all these concerns about the metrics that we all know about, then what we’re trying to do here is: what if you didn’t target millions just to get the half a million? And what if, in the process of targeting those millions, you didn’t have to gather data in stealth mode about billions?
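Sergio’s funnel arithmetic can be sketched in a few lines. Every number below is an illustrative assumption, not a figure from the episode:

```python
# Back-of-envelope sketch of the funnel Sergio describes.
# All numbers are made up for illustration.
budget = 1_000_000          # total campaign spend, USD
broker_share = 0.5          # "half the brand's budget is retained by brokers"
target_audience = 500_000   # teenagers in New Zealand

working_media = budget * (1 - broker_share)     # what survives the brokers
cost_per_target = budget / target_audience      # what the brand really spends per person
direct_offer = working_media / target_audience  # if the working media went straight to them

print(f"${cost_per_target:.2f} spent per target reached")
print(f"${direct_offer:.2f} could go directly to each person")
```

With these made-up numbers, the brand spends twice per person what the surviving media budget could hand each targeted teenager directly, before ad fraud is even counted.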

0:05:40 MK: Is the fundamental issue more about the customers and what’s in their best interest? Or is it more about the brands and how they can be more efficient? Or is it a little of column A and a little of column B?

0:05:58 SM: Yeah, like that one, yeah. I think that’s the answer. The reason why I’m here is that I believe marketing is not working. And in the digital space, I’ve been measuring it for a long time. And I know certain things work, I know measuring helps. So I don’t think we should commit suicide just yet. But what I’m saying is that it is not delivering on the promise. So that’s the first thing. And then secondly, we are in the process of trying to get there. We are squeezing a system and an environment that was never built for people, which is the internet. And we’re trying to address people through, again, this world that was built for documents originally. And I can go on forever about that one. But the essence is that we’ve been building all this fiction to try and get to people. And that means we are just gathering as much data as we can. And then in the end, not even with all of that do we get it to work.

0:07:01 MH: Yeah.

0:07:02 TW: So that’s like web analytics. Sorry Moe, I’m using the old web analytics term. But that is basically a… [chuckle] She dinged me about that in the past. That data collection is a hack on top of how we delivered content. And this is, you’re saying, another… We’re trying to take that same hack and extrapolate it out to cover 100% of everybody interacting digitally, which is the same thing. And you’ve talked about this as well when doing attribution, it’s a similar thing. You say, I have to capture as much of the population as possible so that I can home in, across channels, across all these people, on the ones that I care about. And you’re just saying that’s an incredibly inefficient way to do that, which means an enormous amount of money gets poured into intermediate systems that are all promising that they’re going to cover the whole population so that you can then get to your little slice. And along the way, people are freaking out about that, legitimately. And that’s what’s led to the ePrivacy Directive and GDPR and that sort of thing. So you’re just saying, this whole thing is a hot mess and we’re throwing a lot of money away.

0:08:14 SM: Yeah, exactly. It’s a hot mess. And what’s been happening, if you rewind and you look at things that we’ve done in the past, you realize that we’ve been building this fiction one brick after another so that we could talk about people in this document-based environment. So you know HTTP has always been about requesting a document and getting a response, and then the connection is dead. So there’s this statelessness that we know we’ve been fighting against for a long time. Then in 1994, we got cookies, and we thought, “Oh, we’ll create a session.” And they weren’t built for us to talk about visitors, they were built for us to have a shopping cart. So that created a first layer of fiction, which is: let’s create statefulness and let’s connect data across different HTTP requests. And in fact at the time, the discussion when the cookie standard came out was, let’s ask people before we store a session cookie, ’cause it should also only be there for the length of the session. That was the first fiction.
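The cookie “fiction” Sergio describes can be sketched as a toy server. This is a minimal sketch under stated assumptions; the in-memory session store and header handling are simplified illustrations, not how any particular server works:

```python
import uuid

# Each HTTP request arrives with no memory of the last one (statelessness).
# The session cookie is the first layer of fiction: the server mints an ID,
# the client echoes it back, and independent requests get stitched together.
sessions = {}  # hypothetical in-memory session store

def handle_request(cookies):
    """Handle one stateless request; return (session_id, response_headers)."""
    sid = cookies.get("SESSIONID")
    if sid is None or sid not in sessions:
        sid = uuid.uuid4().hex
        sessions[sid] = {"cart": []}  # a shopping cart: the cookie's original use
        return sid, {"Set-Cookie": f"SESSIONID={sid}"}
    return sid, {}

# First request: no cookie, so the server mints one.
sid1, headers = handle_request({})
returned_cookie = headers["Set-Cookie"].split("=", 1)[1]
# Second request: the client replays the cookie, and two unrelated
# HTTP requests become one "session".
sid2, _ = handle_request({"SESSIONID": returned_cookie})
assert sid1 == sid2
```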

0:09:25 SM: And then, part of that fiction sort of transpired, or sort of found its way, into analytics. So then we created the analytics session. And maybe you, maybe Michael, I don’t think Moe will remember, when we started building sessions in the Webtrends world, we would call this thing sessionizing. And it would be about wrapping up all of these HTTP requests, or some of the hits of course, or page views if they were hits that touched on a particular document type, and then we would have the session. And you could already see what kind of fiction we had. We had a session fiction on the application server, so that could be maybe 20 minutes. And we had a visit fiction, which could be 30 minutes in Webtrends at the time. And we kept building on top of that. Then came the visitor. Then came lifetime value, built on the visitor. And the fact is, we’ve been stretching, or overstretching, this system to be able to talk about people, because by building different layers, we get to the point where we speak a language that the marketer, the real marketer, the non-technical marketer, the dreamer, would understand.
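The “sessionizing” Sergio remembers can be sketched as a simple inactivity-timeout grouping. This is an illustrative sketch; the 30-minute default matches what he recalls from Webtrends, but the function and sample data are invented:

```python
from datetime import datetime, timedelta

def sessionize(hit_times, timeout_minutes=30):
    """Group one visitor's hit timestamps into sessions: a new session
    starts whenever the gap since the previous hit exceeds the
    inactivity timeout (the "visit fiction")."""
    sessions = []
    for t in sorted(hit_times):
        if sessions and t - sessions[-1][-1] <= timedelta(minutes=timeout_minutes):
            sessions[-1].append(t)  # within the timeout: same visit
        else:
            sessions.append([t])    # gap too long: a new visit begins
    return sessions

hits = [datetime(2018, 6, 5, 9, 0),
        datetime(2018, 6, 5, 9, 10),   # 10-minute gap: same session
        datetime(2018, 6, 5, 10, 0)]   # 50-minute gap: new session
assert len(sessionize(hits)) == 2
```

The point of the sketch is how arbitrary the boundary is: change `timeout_minutes` and the “visits” a tool reports change with it, which is exactly the layered fiction being described.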

0:10:40 SM: And so, by building that, we got to the point where it’s a bit like subprime mortgages, or maybe I’m exaggerating. But there’s all of these layers of fiction underneath, and no one wants to really hear what’s happening, and we keep adding to that. And then there’s attribution, and then there’s multi-touch, and then there’s all this trust placed on algorithms, and plenty of things that we do in terms of statistics that really are good and really work. But the foundations are so weak that at the end, one of the things that we’re doing is rewinding and thinking, “Look, this was built for documents. When you tag, you do not tag people’s brains. When you’re building a data layer, in the end, you put that on pages, not in people’s heads. It was never built for people. Maybe Facebook is. But the internet as a whole was never built for people, it was built for documents.”

0:11:32 MK: But does any technology really evolve with the end outcome being what it was necessarily started out for? I mean, yeah, I do obviously have a really different perspective on this, but the fact that it was designed for documents doesn’t mean that it hasn’t evolved to be useful in different ways.

0:11:55 TW: I think the internal combustion engine pretty much evolved for locomotion and worked out pretty well. I think there are plenty of technology examples that have worked, but…

0:12:03 MH: Whoa! And that it goes…

0:12:06 SM: You’re right. You’re right.

0:12:07 MK: I’m sure you’ll remind me.

0:12:07 MH: It goes back to this concept. We created a model by which we could infer the behaviors of an actual person based on their interactions with these documents, or the content that’s flowing through the internet. And I think you bring up a really great point, Sergio, which is that it’s really critical for us all to remember what it is that we actually are gathering, as opposed to what we’ve convinced ourselves that we’re gathering. We don’t actually see the person through it, we see a representation. I’ll use an example. You know that little toy that has all the pins, and you put your hand in it, and it takes the shape of your hand or your face or whatever? I use that when I talk to university students, because that’s how we see the customer through the data, all these data points. We think we see them, but in actuality, are we sure? How can we be sure? And how do we understand and represent the data the right way through all of those different data points? So it’s a little bit of an analogy, but it’s similar to what you’re saying, Sergio, which I really like.

0:13:22 TW: But the flip side is that if Michael, you, and Moe, and I all did that same toy, and then we put it in front of Sergio, and we showed him one of them and he said, “Ah, I see, that’s Tim,” then I freak out that that mechanism we’ve used has given him visibility into me. So on the one hand, the marketers are unsatisfied, ’cause the more we look, the more we recognize that it’s an imperfect view, but at the same time, we have real human beings getting very worried that we’re seeing enough.

0:13:56 MH: Well, and…

0:13:57 TW: They’re concerned about privacy.

0:13:58 MH: We push, and we push, and we push deeper and deeper to gather that data just in case. And this just goes back to the philosophy of privacy generally, which is even if I don’t use the data I capture, is it okay to even capture it in the first place? And that’s at the heart of privacy any time you’re talking about it. And so, in the United States, there’s all these laws about how the government is allowed to capture or not capture your data, and we all have these concerns about it. And I think GDPR is bringing to a head this concept for a lot of us. So, I think it’s a good time to talk about privacy.

0:14:40 TW: That’s the other fiction that it’s even possible to guarantee that we can capture and never use, right?

0:14:46 MH: Yes, exactly.

0:14:47 TW: Nobody’s buying that anymore.

0:14:49 MH: Right. Because, I mean, that’s sort of… Yeah.

0:14:52 SM: Yeah. And since you mentioned GDPR, I think there’s a similarity there, in that with GDPR, everyone is talking about it and running webinars about it. Everyone now knows plenty about privacy, but it’s very, very hard to apply it to yourself. Really, really difficult to apply it to your own business. And something similar happened with attribution and all these things that hang on the single customer view, where I keep seeing people on stage telling us about the wonders of these things. And when you scratch the surface, I have never met anybody who’s really gotten to the single customer view. And why do I raise this now? Because I think that, in the end, that’s the bottom line. I think marketers started doing this by accident. So, you’re saying things evolve, and it’s totally true. Alright, we started with documents. And what you’re saying is, we’ll get to people. What I think is that sometimes we start by accident, by a pure random chain of events, and sometimes we realize that’s a wrong start. It’s a false start, so we need to start again.

0:16:00 SM: And so what I’m arguing is that the ad-supported internet is a false start, and that it’s built on so many assumptions and so many fictions that we need to go back and rethink. And on that one, what I’m also thinking is that marketing has evolved a lot. It is true that at the beginning, it was all about telling people what works and what doesn’t, and then marketers saw that there was this promise that money would be more efficiently distributed in terms of ad spend. But as we moved on, why is everyone now asking for the single customer view? Why is everyone asking for ROI, or this direct one-to-one relationship with the customer? And I think the reason why that’s happening is… I would say there’s two reasons. In the mindset of the engineer or the techie working in marketing, the martech professional, this is just another process, and the natural next step to what we have been doing is to just gather more data and close the circle. Because if you look at it from a technical perspective, it’s “Are we making progress? Let’s get to the next step, which is: give me everything about people.”

0:17:17 SM: But if you look at it from the point of view of the marketer, in the sense of the dreamer, as I’m calling them, which is someone that never really understood the technology but was promised all of these things and is hoping that something is gonna happen, they get high on each other. One of them is building projects that do have an outcome; it does seem to work. And the other one builds a dream around it, but they never really seem to coincide. And so, the dreamer now realizes that, as there are more efficient information flows, you do not need to generate demand so much. The world is not so much about generating demand, and maybe not so much about building brands, although of course there’s plenty of room for that. And not so much about even tapping on intent, but instead it’s getting closer to demand being there and you understanding that it’s there. So, listening for demand. And if marketing evolves in that direction, towards this demand-led world, then I guess the natural evolution of the dreamer is towards the techie marketer, if I can say that. Meaning that, what we are seeing now, when you go to Adobe Summit and there’s 10,000 people, 98% of those are techie marketers, not dreamer marketers.

0:18:37 SM: Then, what I’m thinking is maybe this is the natural evolution that because now marketing is not about generating demand but about knowing when people need something, ’cause they will know in a more efficient way and they will know what can serve a particular purpose, then we need systems to connect with them. So I really think that the pursuit of the single view of the customer is not just a technical pursuit, a technical endeavor that we need to do because it’s the next natural step. It’s also something that we need because marketing is going in that direction, if that makes any sense.


0:19:13 TW: Well…

0:19:16 MK: I think so. There’s been quite a few ideas there, and everything’s rolling around in my head as I’m trying to digest. From my perspective, and what I feel like the industry is moving towards, and maybe everyone has a different perspective, a single customer view is actually about the customer. Like, it’s not about just collecting more data so we can close this loop, it’s actually about: if you don’t have a good understanding of your customers, you can’t give them the features or the notifications or whatever it is that they need at a particular point in time. And the thing that I’ve been noticing, that I’m really loving about our industry of late, is that people seem, weirdly, especially with GDPR, to be getting it. The tradeoff has for too long been, give us your data and you don’t get shit back for it, versus now, give us your data but we will actually give you a better experience. Like, that tradeoff is becoming fairer, and it’s more in favor of the customer. I’m not a big believer in just collecting everything in case you need it. Oh, my god, everyone wants to talk. Go.

0:20:32 TW: Well, I mean, I applaud your optimism and your idealism in crediting the industry with that. I believe you just articulated the pitch that we’re trying to tell ourselves we’re making, and there’s one of those kinds of fictions sitting there. At the end of the day, the number one reason we love to be able to link consumers across multiple channels is so we can retarget them. And while, yes, there is some level of saying, “Oh, I’m seeing an ad for something that I looked at earlier,” in its simplest form, more people say that’s annoying. I looked at it and now you’re just hitting me in the face, reminding me that I left something in my cart, and to say that, “Oh, but it’s for the good of the consumer”… I don’t know. The higher ideal is that if we provide more relevant and targeted and more appreciated content, they will buy more from us. It’s true, but I think we’ve gotta be pretty honest about what’s really going on. Like, it’s effective, but it’s also annoying, I guess.

0:21:51 SM: Before you even get into what people expect in terms of this new trend for people to own their data, and even before you get into the law, ’cause the law is the answer to that social unrest, maybe first in Europe and then wherever else it happens, but even before you get there, I think it’s core to analyze whether what we have today works, even before we get into complying. And GDPR is already here, so there’s no more time. But even before we start struggling with all of these consent management tools, all these pop-ups, and all of these things decreasing our sample, even before you get into that: does it work today? Or did it work before compliance? And I think that’s the question.

0:22:39 SM: And something that I wanna mention is something I was reading the other day about the philosopher’s stone, which is that in the Middle Ages, there was this philosopher’s stone. People believed that it would turn any base metal into gold, and everybody was into that. All sorts of bright minds were into this. Isaac Newton was into this. Robert Boyle was into this. These were, again, bright minds, and eventually this would disappear from their biographies for 200 years, but they were alchemists. And it felt shameful. Oh, they were trying to make gold. But they all believed it. And there were the scientists that really thought that metals were compounds and there was a way to transform them, and there were the dreamers, and both were getting high on each other, in a very similar way. And it got to the point where there was this law in England, some sort of mining law, I can’t remember the title, in the 17th century, against the multiplication of gold. It got to that point.

0:23:46 SM: And what I’m saying is that collective faith does a lot. And for the very same reason that we cannot really measure everything about people, and one of the many reasons why I do not believe in attribution, which is that we’re humans, and there’s all of these emotional elements, because the brain is complex, and we’re not robots. And the same thing, I believe, is behind people asking for attribution, and people asking for a single customer view, and people chasing this holy grail: we all believe it, we all see each other at these conferences, nobody has done it, and yet we keep saying it. This is my opinion, of course; I may be extremely biased by my limited experience. But this is what I have seen. Whenever I’ve been called into a meeting with a CMO at a large enterprise, and it’s been very few times, as an analyst or as someone running a team of analysts, it’s been invariably about predictable revenue, ROI, attribution, single customer view. That’s been the topic, and it’s still the main mission wherever you go. And what I simply wanna state is that I do not believe in it, and that we need to restart that. Even before we get into privacy.

0:25:00 TW: So the law, GDPR, ePrivacy, those sorts of things, are the equivalent of passing a law about something that isn’t really attainable. I think you’ve done this thought experiment, and I’ve done it a little bit, but probably not nearly to the depth that you have. There’s a massive, massive, possibly insurmountable switching cost to change from the machine that is there now to what would be better. But I’d love to hear your thoughts: if you could wave a magic wand, wave a philosopher’s stone, at the way that we do this today, or go back to the beginning, and I’ve had some thoughts on it, how would it work differently? Putting aside the fact that you’d have massive legacy stuff to overcome, what would be a better way? What would be that kind of ideal?

0:25:53 SM: Yeah, so I think it’s good that we have an environment that was built for documents. I don’t think that has to change. It brought plenty of value. But I think there’s another world that we can build around identity. And there’s plenty of people talking about this; you may have read Doc Searls writing a lot about it. So, built around identity, with or without blockchain, but with the idea that you are a unit in this network and that it really is built on people. And it gets you to this sort of nirvana, as Doc Searls would say, which he calls intent casting, where people just proclaim or declare their needs, and suppliers, brands, bid to satisfy those needs. That’s an extreme expression of a demand-led world.

0:26:47 TW: What’s an example? Like, broadcasting “I want to buy a dress shirt,” that sort of need?

0:26:54 SM: Exactly that, yeah. And you’ve got a short list of favorite brands. I like these brands. I wanna buy this trip, who’s gonna give it to me and this is how much I’m willing to pay. And we’re getting closer to that. We’re already doing that in certain environments.
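The intent casting idea, a person declares a need and short-listed brands bid on it, might be sketched like this. All names, offers, and prices below are invented for illustration; this is not a real protocol or API:

```python
# Hypothetical sketch of "intent casting": the person broadcasts a need
# with a budget and a short list of trusted brands, and suppliers bid.
def intentcast(need, max_price, favorite_brands, offers):
    """Pick the cheapest eligible offer: short-listed brand, within budget."""
    eligible = [o for o in offers
                if o["brand"] in favorite_brands and o["price"] <= max_price]
    return min(eligible, key=lambda o: o["price"], default=None)

offers = [{"brand": "BrandA", "price": 45},
          {"brand": "BrandB", "price": 60},   # over the stated budget
          {"brand": "BrandC", "price": 30}]   # not on the short list
best = intentcast("dress shirt", 50, {"BrandA", "BrandB"}, offers)
assert best["brand"] == "BrandA"
```

The design point is the inversion Moe and Tim then poke at: the consumer’s declared short list and budget do all the filtering, so a brand the person has never heard of (BrandC here) never reaches them at all.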

0:27:09 MH: Right. Like with Amazon or something like that.

0:27:11 SM: Exactly. That’s it. So if we get closer to a demand-led world, with people in power, in control of everything, even their data, then that would be the final outcome. But of course, we can’t just get there. So what we thought is, how do we get there, and what’s in the middle? And something that is a stepping stone is privacy: people owning their data in the environment that we have today, and people deciding what they wanna share with each brand. And if you take that into our space, analytics, this would take us into people measuring themselves. And think about everything that doesn’t work today, because we keep tagging and retagging and retagging, and then there’s the TMS. And then by the time you’re done, someone else takes over, they change the CMS, then you retag again. Just look at this thing we have today. So imagine if you could forget all that. I’m sure we’ll find a job working around that.


0:28:13 MK: Okay, I’ve gotta pipe in here. What about the old philosophy, and I’m intentionally being controversial, that sometimes customers don’t really know what they want? To be fair, today I had a very specific intent to go out and buy a set of white linen bed sheets, and I went to five different vendors, did a thorough amount of research, worked out the cheapest price, and ended up with exactly what I wanted. However, there are use cases where users are not great at knowing what they want. So how would a demand-type model work in that scenario, if it’s like, “Well, I don’t know, I probably need something for this”? I’m of course thinking about retail ecommerce, ’cause that’s the space I work in. I need something to wear to this event, but I don’t really know what I want. Like, couldn’t that just also lead to a complete barrage of…

0:29:14 TW: Well, or a new space. If you think about it, people weren’t thinking they wanted an electric car, or, yeah, take a new product or a new service. Sergio, I’ll layer one more thing on and then you can respond. When you said, “Hey, I know that I like these five brands,” what about the 27 upstart brands? I didn’t know somebody would make shirts that are made to be untucked. I mean, I still think that’s a little ridiculous, but…

0:29:43 MH: I believe all shirts are made to be untucked.

0:29:45 TW: I am that way, and it turns out they all work that way, but apparently your hemline, or whatever, needs to be just so. So, on the intent, when Moe wants white linen sheets, there’s that piece, but how does that work when the intent… And maybe this is just the raging, rampant consumerism.

0:30:06 MH: That’s where advertising still has a place to participate, right?

0:30:11 TW: That’s kinda just broadcast or trying… Maybe it narrows down in a broadcast but not to the point of trying to get to I’m targeting you with an ad.

0:30:22 MH: Yeah, because a great ad hits a whole group of people. Like, Moe, you, me, and Sergio are all extremely different, from different places. And yet I bet all of us enjoyed the iPod ads from 10 years ago, with the people dancing, with all the bright colors, those…

0:30:44 MK: Yes, I loved those.

0:30:46 MH: Yes, see? And it made you want an iPod. This was the revelatory moment about advertising for me: the iPod was the first technology product that my wife came to me and said, “I think we should buy this.” I’m the one who wants to buy technology and…

0:31:07 MK: Okay, I have a whole another theory about those ads, but that’s like I feel like…

0:31:11 MH: But that’s what I mean, a great ad can function irrespective of how many layers of digital alchemy we’ve put on it over the years, of believing that it has to be personal and targeted and all that kind of relevant stuff. Those are all good for capturing at points of intent, but that’s a different kind of advertising than the advertising of building actual product demand or desire. Which is, Sergio, you do a lot of retweeting of a blogger that I really like, Bob Hoffman, the Ad Contrarian, and he’s constantly talking about this idea that there’s never been a company or a brand built on online advertising. And he’s poking at the same problem, Sergio, that you’ve basically described. And honestly, like Moe, for me this whole conversation makes you, like, doubt our existence. So I’m glad that you’re fighting the good fight on our behalf, ’cause I’m like [0:32:18] ____.


0:32:22 MK: I think the difference for me is that I sit so much now on the product side. So I keep thinking, as we’re talking about all of these things, about a few episodes back, when we were talking to James Fugleberg about location data. And one of the reasons I think that those iTunes-iPod ads worked so well is because it was about the feature, it was about how your life as the customer will be better by having this product feature, not about this ad for this thing that you looked at. It was about customer experience. And so I’m a very big advocate of, yeah, it’s not about selling a specific product, it’s about selling the overall package. And a lot of what you tell your friends about when you talk about the amazing experience you had was not, “I bought a black dress that I was retargeted for,” you talk about the overall experience that you had with that brand. And so much of that is on the product side. And my CTO is now gonna love me, my CFIO.

0:33:24 MH: Well, that’s right. But that’s what goes back to effective advertising. And we fool ourselves sometimes.

0:33:33 TW: When we say effective advertising, within 13 nanoseconds we’re immediately back into some level of targeting, which means we’re heading right back down the same path we wound up on. Right?

0:33:48 SM: Yeah, I think you’re right. I think we are headed in that direction, the direction of having a clear split between building brands and having advertising that is not about collecting as much data about people as possible to then justify an investment, ’cause in the end, that money is just going down the drain, I believe. But instead, there’s this advertising that we are seeing again, which is contextual advertising, which is affinity-based advertising. And if you look at what Google has been doing in order to comply with GDPR, which has created a stir in the ad tech space, it’s saying to the medium, for example, the media companies: either you will gather consent on my behalf, or you won’t be able to do any of these things, and then you will still be able to do contextual advertising.

0:34:43 TW: Wait, explain that. Explain that, consent on my behalf being the consumer’s behalf?

0:34:50 SM: So who’s got a direct relationship with people in the ad tech space? It’s gonna be the media and the brand, ’cause you go on the brand’s page if you’re gonna buy something or you’re just talking to them, and then you’ve got the media where ads are placed. But everyone in between doesn’t have a relationship. Of course, Google has these two roles. You have Google when you go into Google, into their services, and then, yes, you have that direct relationship. And then you’ve got Google as the ad tech player, where they’re just playing a role in that collection of ad tech technologies. And what’s been happening is that all of these platforms, for many of the things they do with data, they need to gather consent, ’cause they can’t find another legitimate basis for that processing. Consent is the only answer very often. So if consent is the answer, the only way to get it is to ask media nicely to gather it on your behalf, and then pass it down the chain of custody. And if you look at the contracts that have been rolled out, and everyone has had their copy of most of these, they’ve been all over the place. In the end, it’s all about who’s passing the baton to whom, and in the end, how many companies are media firms going to have to be gathering consent on behalf of?

0:36:14 TW: And so on that consent side, is that where you briefly referenced blockchain, or not? Is that consent and intent collection? Is that a centralized thing? Which, I think, there would be all sorts of concerns and backlash about? Or is that the sort of thing that could work as a distributed, blockchain-driven ledger of the consumer expressing their consent for certain types of messages and ads, as well as intent of what, here right now, I’m interested in? What’s the clearing house mechanism for the consumer to raise their hand on something like that?

0:37:00 SM: Yeah, so since GDPR is asking not just for you to gather consent but to be able to prove it, so you need evidence, blockchain sounds like a perfect choice, ’cause then you have a copy of that that nobody can alter. And then it seems like it was really fit for this environment, but in the end, it’s really not something we can use under GDPR. Basically, GDPR has certain rules with regards to international data transfers that you cannot really comply with when you use blockchain, at least not the public blockchain, and it’s got rules when it comes to the right to be forgotten.

0:37:37 TW: Yeah, oops. [laughter]

0:37:39 SM: And it’s very hard. Exactly. So that’s why people are complaining now that GDPR didn’t take into account the possibility of blockchain. But of course, this was negotiated nine years ago. And I think it’s a great piece of work, but certain things will maybe come later. And blockchain doesn’t seem, right now, like something we can use for consent gathering and the evidence of consent. So what’s been coming out are tools to store a reference to the point where you are giving consent, so there are these consent management tools, and then there’s all sorts of systems to pass consent from one player to another. The IAB has released a system called a DaisyBit, that’s how they call it, DaisyBit. And it’s basically a cookie that players are passing to each other that demonstrates that they have consent for everyone in the chain, from when the consumer initially gave it on the medium.
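To make that daisy-chain idea concrete, here is a hedged sketch of the concept only, not the actual IAB format (the real spec defines its own compact, versioned consent string); the vendor names and helper functions below are purely hypothetical:

```python
# Toy illustration of a "daisybit"-style consent chain: one yes/no bit per
# player in the ad tech chain, packed into a single cookie-sized string.
# This is NOT the real IAB encoding -- just the idea of passing the baton.

VENDORS = ["publisher", "ssp", "dsp", "dmp"]  # hypothetical chain members

def encode_consent(consents):
    """Pack one consent bit per vendor, in VENDORS order, e.g. '1101'."""
    return "".join("1" if consents.get(v, False) else "0" for v in VENDORS)

def has_consent(bitstring, vendor):
    """Check whether a given vendor in the chain holds the user's consent."""
    return bitstring[VENDORS.index(vendor)] == "1"

cookie = encode_consent({"publisher": True, "ssp": True, "dsp": False, "dmp": True})
print(cookie)                      # -> 1101
print(has_consent(cookie, "dsp"))  # -> False
```

In a scheme like this, each downstream player reads its own bit before processing; anyone whose bit is 0 would fall back to contextual advertising.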

0:38:39 SM: In the end, the bottom line is that we have been collecting all this data, and yes, we’re hungry for more, and we’re hoping that people are gonna leave their Wi-Fi on on their mobile phones, and their location on, and their Bluetooth on, but that’s not really the way it should be done from now on. What we’re supposed to be doing is tell people about each specific purpose. If you ask people to leave their Wi-Fi on, or their Bluetooth on, their location on for one specific use, and then you use that to do something else, that’s against the GDPR. It’s another purpose and it’s another legal basis. So whereas the first one could have been based on legitimate interest or a contract, the second one may need consent. And for some of these data points, you may need an even higher bar of consent, a two-factor consent, explicit consent. So it really gets so hard to comply with that I think, instead of going crazy to try and fit the old world and the status quo into this new environment, why not rethink everything around a world that is really conceived privacy-by-design, with people’s data, owned by people, at the center?
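That purpose-limitation rule can be sketched in a few lines. This is a hedged illustration, assuming a made-up ledger structure, not any real compliance tool; every field name here is invented:

```python
# Hedged sketch: under GDPR's purpose limitation, data collected for one
# purpose under one legal basis can't be silently reused for another.
# All data points, purposes, and legal bases here are illustrative only.

consent_ledger = {
    # (data point, purpose): legal basis recorded at collection time
    ("location", "store-finder"): "contract",
    ("location", "ad-targeting"): "explicit-consent",
}

def may_process(data_point, purpose):
    """A new purpose needs its own recorded legal basis; reuse isn't allowed."""
    return (data_point, purpose) in consent_ledger

print(may_process("location", "store-finder"))        # -> True
print(may_process("location", "footfall-analytics"))  # -> False: no basis recorded
```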

0:39:53 SM: And I think there’s plenty of room for that, for people, again, measuring themselves. Imagine if you could have your browser extension and your app and you just collect data by yourself, because then you can expose that to brands that you choose in exchange for value. Not value in terms of money, of selling your fundamental rights, but in terms of getting something in exchange, so a more accurate proposition. And then you will have these two worlds. You have the world of advertising in terms of building brands and the message, and something that’s impactful and emotional, and that we all see together, and therefore it has a bigger impact, as Bob Hoffman was saying, for example. And then on the other side, you will have people declaring what they need.

0:40:38 MK: Okay, so I’ve got one conspiracy theory, maybe. I don’t know what we would classify this as. I’ve seen a few startups lately that have this idea of, like, it’s got your personal ID with all of your different data, pieces of information, I don’t know, whatever you wanna describe them: gender, age, blah, blah, blah, all the way through to things like your blood type. And you would choose, on an application-by-application basis, which ones you share your data with. When I see those, there’s this little bit of me, and this is where maybe the conspiracy theory comes out, that freaks out, ’cause I’m like, now instead of giving little bits about myself to multiple different companies, and yes, some of those companies are very capable and can pull all of that data together to come up with a profile, now I’m giving a new company literally everything about me. And they kind of, I feel like, share a similar philosophy to you, but maybe not with as great motives. Anyway, conspiracy theory.

0:41:50 SM: I agree, I agree. I agree. It seems to be leaning in that direction. Sometimes you look at value propositions that go in that direction. I don’t think that is the way to go. I don’t think it is about exposing everything and your whole self to different players. But look at what we do today. Customer Data Platforms, sometimes DMPs, sometimes it’s a CRM project, and you four, we have all worked with these. And you go around on what the mission is: single view of Tim. And you’ve got one Tim. But now there’s as many single views of Tim as brands he’s interacting with. And everybody is building that Tim for themselves, like this one-you for P&G, this one-you for Unilever, this one-you for an ice cream maker. And in the end there’s only one you, which is you. So instead of inefficiently building a million pictures of everyone, and hoping that by gathering it in stealth mode they won’t know, and therefore we’ll build a better picture, why not stop doing that? Let people be wherever they are and then come to them, ask them. You’re gonna have to ask them anyway.

0:43:09 SM: So if you look at GDPR, you really are going to have to ask them for every single data point, and it’s gonna have such a huge impact on UX. Look at what happened with the Cookie Law, it destroyed everything, our experiences on mobile. This stupid cookie banner. It’s so silly. And now, we said, “Oh, we’re gonna get rid of that.” Or, no, we’re gonna have a worse one. Now we’re gonna have these crazy pop-ups with plenty of options that no one’s gonna read, and it’s all gonna be about who gets closer to the red line and gets you to accept without reading much, ’cause if it’s really a yes versus a no, you’re gonna say, “No.” And so, if you’re gonna have to ask people expressly every single time, then why not just ask them for what you really want?

0:43:56 TW: But then I’m gonna be having to repeat myself ’cause… And I’m now remembering it, I can’t remember the name of the service, it was probably 10-12 years ago, along the lines of what Moe’s talking about, of like, “I’m gonna have my central ID. I’m gonna manage my profile and then I can use that.” This was before you could use Facebook or Google to log in for a single sign-on. If I go to a site and it’s like, “Hey, we just need to know if you’re male or female or somewhere in between,” like, the 15th time I have to tell a site that has a legitimate need for it, I’m gonna be like, “Man, isn’t there an easier way?” And up pops the company that says, “Hey, sign up for our service, tell us your gender, and now use us when you log in.” And now we’re heading straight down the path to what Moe had. But I mean, I guess, maybe backing up, and it’s funny because we totally pride ourselves on not getting people on to do product pitches or vendor pitches, I still don’t feel like I have a good sense of what Privacy Cloud is. It feels like it’s something very, very different, and so I’m gonna do the explicit: what the hell is Privacy Cloud?

0:45:07 SM: So there’s this part of Privacy Cloud, which is exactly what we just said. We are giving people an app so that instead of logging with Facebook, this is the first thing that we do. You do not log in with Facebook or with Google, you log in with iRule. So there’s an app called iRule. My data, my rules. That’s the name of the app. And the name may change in the US ’cause that’s only been rolled out in Spain and Ireland. That’s the pilot. But what the app does, so there’s the consumer side of Privacy Cloud, what it does is it has a single place where you have your data. Not all of your data, but a pointer to different locations where your data is being kept. And from that place, you can really exercise certain rights. It allows you to connect with brands and to do certain things.

0:45:55 SM: So, everyone knows about portability rights, but there’s other rights that you can exercise in a more automated way. For now, all we care about is that we need an app, or when you talk about an app, a browser extension or whatever allows you to log on across different services. We want, again, to cut out the middleman. And the only way to do that is by letting you access all of these services and know which services you are using, so that you do not have to pay for them. And why is that? And this is the second part. The first one is the app. That’s the consumer app. Simple, one place, your data, single sign-on, to not log in with Facebook anymore, log in with Privacy Cloud.

0:46:37 TW: But if you log in with Privacy Cloud, is it similar to when Google pops up and says, “Oh, you’re logged in,” instead of saying, I’ve got access to your calendar, and your contacts, and whatever? With iRule you’re saying, you wanna give me level one information, which is name and location; level two or something like that. Is that the idea?

0:46:57 SM: In fact, there’s three levels with three colors and that’s the same logic. But right now what we do is we replicate the data that you have in Facebook to save you doing that with Facebook or to prevent all the platforms sharing Facebook data elsewhere. But for now, let’s just say that you’ve got a place where…

0:47:17 TW: And you said it quickly. You’re just storing pointers too. I may store my gender in Facebook, and maybe that’s not the best example. But you would say, “I’m storing that your gender is stored in Facebook,” if I go in and sever that link or change my gender in Facebook then you’re not storing the customer data. You’re storing pointers to where they’ve said their data is. Is that right? Okay. For that, no. Okay.

0:47:44 SM: We do store some data. So if you go into the app now in Spain, you will see your data being stored, because you are answering certain questions that we need so that you can then save that time. And we store the data and we let you exercise your rights automatically, so everything is done from the app. But again, repeating myself a little bit, ’cause I don’t think it was too clear. So there’s three legs to Privacy Cloud, and one of them is the consumer app. Consumers need to be in control of the data, and we don’t want them to be forced to use Facebook, logging in with Facebook or logging in with Google and not knowing who’s gonna get their data, and also using Facebook as the broker. We believe they have to be in control, plus that will enable other things that I can explain later. So the first one is the consumer needs to log in with Privacy Cloud. In Spain and Ireland, that’s called iRule. My data, my rules, that’s the app. So when they have that, we have basic information stored within iRule, and that, again, saves you the time when it comes to signing up for a new service.

0:48:50 SM: The second leg to that, and I’ll come back to other things within the app, but the second leg is important because it’s about companies, media companies, entertainment apps that need to monetize in a post-GDPR world. And these are companies that today are suffering because their ad-based model is pushing them into a paywall. Ads are giving you less money every day, ’cause inventory keeps growing and every day it’s worth less in terms of how much it costs to produce it. So everyone is following that same path. All of these media companies, in the end, arrive at the paywall, and the problem is that we can only pay for so many paywalls. And so the user complaint is, “Can we have a bundle?” But who’s gonna do the bundle? And the reality is that you don’t wanna pay for everything. You wanna have the FT, you wanna have Wired, but you don’t wanna read everything, maybe. What if there was a way to know how much you’re consuming? And there’s been plenty of business models in the past about micropayments, reading an article on Medium, and all that stuff.

0:49:53 SM: But the idea here would be, what if we could have one single app, and then we help these companies monetize without the paywall, or without relying on their paywall, and without ads? So that by using the Privacy Cloud app, you go into all of these systems, all of these services, you automatically get in, we measure how much real usage is happening, and then we pay for that usage. We don’t even need to collect any data, and we don’t need those services to access any data, because one thing is accessing your data, getting to know you as brands will want to know you, and another one is getting the money, because they wanna be viable. So we split that: whoever wants to access your data, that has to be in exchange for value to you. Maybe some of the content will be free. And whoever wants to just simply make a living, then we have to be able to redistribute some money. So whose money do we redistribute? Who will pay for all these things? And that’s the brand. And that’s the third leg.
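The “pay for measured usage” model Sergio describes could, in principle, be a simple pro-rata split of a sponsorship pool. This is a hypothetical sketch with made-up numbers and service names, not PrivacyCloud’s actual mechanism:

```python
# Hedged sketch: split a fixed pool of brand money across services in
# proportion to one user's measured consumption. Illustrative only.

def split_pool(pool_cents, usage_minutes):
    """Allocate a sponsorship pool pro-rata to measured usage (integer cents)."""
    total = sum(usage_minutes.values())
    if total == 0:
        return {service: 0 for service in usage_minutes}
    return {service: pool_cents * minutes // total
            for service, minutes in usage_minutes.items()}

monthly_usage = {"news-site": 300, "music-app": 150, "video-app": 50}  # minutes
print(split_pool(1000, monthly_usage))  # a $10.00 pool, in cents
# -> {'news-site': 600, 'music-app': 300, 'video-app': 100}
```

The notable property is that no behavioral profile is needed, only an aggregate usage count per service, which is the split between “getting the money” and “accessing your data” that Sergio draws.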

0:50:53 SM: So the third leg is the brands that right now are already sponsoring media, and they’re already sponsoring your life through advertising. And that’s how TV started. You get free entertainment ’cause the brands are paying for it. The brands are doing that, but right now there’s all these brokers in the middle eating half of it, and then there’s ad fraud. There was this conference in New York, the Digital Media Summit. They were saying that we’re gonna have $50 billion in ad fraud by 2025. So it is really a problem. So anyway, what’s happening is that these brands, they wanna get to you, but what do they really want? And the theory is, it’s not really about pushing advertising to you for things that you don’t really want, but really about either building a brand and creating emotional impact, or really getting to know you and being ready for that demand-led world.

0:51:50 SM: If that second scenario really is happening, and what brands really want is to get to know you, and not use the single customer view as an excuse for things that just don’t work. If that’s really the goal, then, okay, let them pay to get to know you, you as a consumer. Let them pay for your Netflix, for your Spotify Premium, for your New York Times. And so that’s how we triangulate. We have the consumer, and through the app, you can automatically, directly access Netflix, you can access the New York Times, there’s a few newspapers, something like 12 or 13 services really active in Spain, and those are being paid by the brands that wanna connect with you. And you have this app where you have your data, where you know who’s accessing your data, and where you’re now putting an end to this crazy traffic in your data.

0:52:44 SM: If you look at what’s happening with apps today, you install a free app, a game for your kids, whatever, and they first try to monetize with ads. Then they saw it wasn’t enough, so then they found a way to collect all of this data, and there’s so many apps that have to do that because they want to survive. So they access your photos, they access your location, your contacts, and then they sell that to the brokers. So it becomes third-party data, onboarded data somewhere. And that will end now, and that will at the same time end the business model, and they need to make a living. So I believe that Privacy Cloud can provide a living for these apps and for the media that, again, are gonna be giving us more and more paywalls, except that it’s gonna be based on consumption. And at the same time, you’ll be able to build your own profile, measure yourself across different environments, and as you go through these media experiences, you’ll add more to your own understanding of yourself, which is another story. Sometimes what you know about yourself is little compared to what people really wanna know. That’s the pitch.

0:53:50 MH: So this is fascinating. And unfortunately, we do have to start to wrap up, but I think this is why we really wanted to have you on the show, Sergio, because it really does challenge fundamental concepts that we’ve just come to accept. And I am very intrigued, and now it’s thought-provoking to me to explore not only these concepts, but the utility of the path you’re pursuing with Privacy Cloud, and whether that will be the right path or the next evolution. That’s pretty neat. Okay, so while we’re talking about all these new things, we still must hew to tradition, which means we do a last call. That’s one of the things we do on the show: go around, see if there’s anything interesting content-wise that we’re willing to pay for.

0:54:43 TW: Angelique?

0:54:45 MH: That was in the last couple weeks that we would like to talk about. Sergio, you are our guest. Do you have a last call?

0:54:53 SM: So I found an article from a few weeks ago, and it’s from Hacker Noon on Medium, and I think it was really interesting, connecting these things that we’ve been discussing, but taking it a bit further. It’s titled, This Is How Google Will Collapse. And I love Google, but I felt the article was very good because it really tells you about, what if the advertising model is not the future? So it goes in another direction, but I really liked it. So: This Is How Google Will Collapse, reporting from the very near post-Google future. And the author, on Hacker Noon, is Daniel Colin James.

0:55:32 TW: Nice.

0:55:33 MK: Sounds interesting.

0:55:34 TW: Wow, that sounds awesome.

0:55:35 MH: Apropos for our show today. Alright. Moe, do you have a last call?

0:55:40 MK: I’m doing it, Tim. And I’ve got a twofer. So the reason that I have two is that my first one is that I’ve started listening to this podcast which has absolutely nothing to do with analytics. It’s actually to do with work and the workplace and how you interact with people, but I’m absolutely loving it, and it’s called Dear HBR. So it’s all about, I guess, think about it: if you have a manager you’re struggling with, or a co-worker, or how you give good feedback, it’s really those workplace issues. So I’ve…

0:56:14 TW: Which is not why you’re listening to it at all?

0:56:16 MK: No, I just genuinely like bettering myself, Tim, honestly.


0:56:21 MH: Don’t worry about Tim, Moe. I understand what your interest is ’cause I would like that podcast as well.

0:56:31 MK: No, it’s like genuinely, really good listening, even if you work in the perfect workplace. The second one is, one of the engineers in my team actually got me on to this, and it’s a blog which talks about the best fonts for sharing complex data. Now, the reason that it’s super interesting is ’cause it talks about different things like the sizing of the numerals in the font, to make sure that when you communicate, say, a 111 number, it would look equal to a 333 number, based on the font choice and all that sort of stuff. And I found it super interesting, so a really big shout-out.

0:57:13 TW: This is the Courier and Monospaced blog? No?

0:57:15 MK: No, it’s typography fonts for complex data… I don’t know.

0:57:20 TW: Okay.

0:57:20 MH: Nice.

0:57:21 TW: We’ll find the link.

0:57:22 MK: I can tweet the link, but thanks to Ben and my team for sharing it.

0:57:26 TW: Hopefully, it’s in the show notes. And our professionals always… Sergio came with author and title, and I was making a wisecrack.

0:57:32 MK: Geez. Geez. I was getting there, Tim. Give me a break.

0:57:36 TW: Such an asshole.

0:57:37 MH: Yeah. Very few of us could be as prepared as Sergio. C’mon.

0:57:42 TW: Yeah. He’s got multiple computers, multiple mics, multiple tests.

0:57:47 MH: Exactly. Alright, Tim, this is your chance to show Moe how it’s really done. What’s your last call?


0:57:54 TW: Oh dear, the pressure. Moe and I have both had various color-palette-type last calls, but recently, courtesy of Darren Young, who does our audio editing for mixed media… he actually shared this with me. It’s called Viz Palette. And basically it’s a site where you can load in a color palette, and it then shows you various visualizations using the palette, and then you can click through to four different types of color blindness. And in each one of those, it’s got a little color report that draws links and says, “Hey, these colors in this palette, due to anomalous color blindness, might be difficult to distinguish.” And it was cool. Like, at Search Discovery, since we’ve got a rainbow palette, I was able to actually load in our entire palette. I wound up with one URL, and I actually shared it with our designer. Then it lets you see, hey, for whom might different things be a little tricky. So, just another way to come…

0:59:04 MK: So, to summarize, the color palette sucks?

0:59:08 TW: No, the color palette’s fine. We have a couple of shades of green, so even in the no color deficiency those are… But you’re not gonna use those right next to each other. So it’s just another way to come at assessing a palette and trying to figure out how you might need or want to tweak it. I said I was gonna do one, but now my second one, because now I think Sergio’s indicating that we can talk about this. But Sergio, do you have a…

0:59:38 SM: Yeah, yeah. So yeah, we released a Digital Analyst’s Guide to Surviving GDPR.

0:59:44 TW: Which I…

0:59:45 SM: In case anyone is interested.

0:59:47 TW: Yeah, having read a late draft of it, and having attended various webinars, GDPR is confusing as it is. The fact that it is specifically… the kind of setup in it is: look, GDPR applies to everybody, so what does the digital analyst need to know? And I thought it was fantastic. We weren’t sure whether we could guarantee that it would be available by the time the podcast released, but it sounds like you just committed to having it ready and in the public sphere.

1:00:19 SM: Yeah, yeah. Yeah, that’ll be on the Sweetspot website.

1:00:21 TW: Okay. Sweetspot Intelligence, okay. Michael, what’s your last call?

1:00:27 MH: Well, since we’ve now proven that all this data collection we’ve been doing is fool’s gold, or a digital alchemy, or philosopher’s stones, but still there’s no reason you can’t do it at a high level. Recently I was reading through a white paper that was published by Joel Stachowicz, an architect at Adobe, talking about performance management around the Adobe Analytics Cloud. And I thought it’s very unique for a company to put the time and effort into helping create some best practices around managing client side code effectively to track back to performance. I was enjoying that. I’ve been thinking about standards a little bit in the context of that myself. And so it was timely for me. If you’re into the implementation of analytics and client side technologies, which a few of us are, I would say it’s a good read regardless of whether you’re an Adobe user or not. And there’s an article on Medium and then a link to a bigger white paper from there. But yeah, it’s good heavy stuff.

1:01:40 TW: Cool.

1:01:41 MH: Alright. So you’ve probably been listening to this episode, and you’ve probably been thinking two things, maybe three things. You’ve been thinking, “Oh, no, my entire career is a sham.” So that’s one. Two, this Sergio guy sounds like he knows what he’s talking about. And three, how do I get access to the iRule app? There’s so much going on here, and we’d love to hear from you. There’s a lot of different ways for you to contact us. As evil as Facebook is, we do have a page there, and you can contact us that way. Although then Mark Zuckerberg will know that you’re contacting us there. You can also talk to us on the Measure Slack. It has not yet come out how Slack is evil, but I’m sure they eventually will be found out.

1:02:28 TW: Go freakin’ in that direction. [chuckle]

1:02:30 MH: Yeah.

1:02:30 TW: It’s not gonna matter. [chuckle]

1:02:31 MH: I don’t know. [chuckle] And then of course, on Twitter or our blog, we’d love to hear from you. Sergio is also active on Twitter. Sergio, what’s your Twitter handle?

1:02:44 SM: It’s Sergio Maldo.

1:02:46 MH: Sergio Maldo. So easy to find, a lot of really good information. So Sergio, thank you once again for coming on the show.

1:02:55 SM: Thank you all.

1:02:57 MH: We might need to rename it the Digital Alchemy Power Hour instead of Digital Analytics.

1:03:00 TW: I think you should.

1:03:01 MH: Yeah, okay, well, at least we’d keep the acronym the same. But anyways, from my fellow wizards, Tim Wilson and Moe Kiss, keep pretending to analyze. I don’t know.



1:03:20 S1: Thanks for listening, and don’t forget to join the conversation on Facebook, Twitter, or Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, Facebook.com/analyticshour, or at Analytics Hour on Twitter.


1:03:39 Speaker 6: Some smart guys want to fit in so they’ve made up a term called analytics. Analytics don’t work.


1:03:49 TW: You just did the elderly parents trying to use Skype impersonation.

1:03:53 MH: It’s not like… I don’t know.

1:03:58 MK: I’m on the other side of the world Tim. I don’t see all of the happenings of the analytics community from down under.


1:04:05 TW: Oh, look, you’re apparently gonna be typing to Michael privately. Is that how you guys roll?


1:04:09 MK: Oh, I don’t even know what that is.

1:04:12 TW: Apparently, there’s a back channel on this?

1:04:15 MH: Yeah, Tim.

1:04:16 MK: I saw that and was quite confused. [chuckle]

1:04:18 TW: I’m starting to get a complex here.

1:04:21 MK: Keep in mind it’s 9:20 at night here. This is why I invest on the show.

1:04:27 TW: You literally complained when we’re recording when it’s your mid-morning and now you’re complaining…

1:04:33 MK: I know. I think I weirdly am better in the morning, I’ve decided.


1:04:38 MK: How do you handle it? What’s your go file?


1:04:42 TW: This is a hypothetical? [chuckle]

1:04:44 MH: Yeah, I mean…

1:04:45 MK: Situation, I’m curious.

1:04:47 MH: Unfortunately, yeah, Moe, I’ve just never been in this situation.

1:04:51 TW: Yeah, I have no idea.


1:04:57 TW: I don’t know, maybe quit your job and move to New Zealand.

1:05:00 MH: The United States, yeah. You move to the United States.

1:05:03 TW: Yeah, the standards are a lot lower. That’s all we do is fuck up.

1:05:08 MH: Yeah. Everyone expects it over here. I’ve apparently been stuck in a well for most of 2018.

1:05:17 MK: Seriously, Sergio, this doesn’t ever happen.

1:05:19 MH: This is how important this is, Sergio. The internet does not want this to get out.


1:05:25 TW: Hold on a minute. Honey, pack the bags, we’re moving to Spain.


1:05:27 MH: Yeah.

1:05:27 TW: Oh, sorry. You’re welcome.



1:05:31 MK: No, that’s his normal face. I promise that’s the normal face.

1:05:37 TW: Rock, flag, and alchemy!



The Digital Analytics Power Hour © 2019