#025: A/B Testing with Kelly Wortham from EY

We had a hypothesis that our listeners might be interested in hearing an expert on digital optimization. In this episode we test that hypothesis. Listen and learn as Kelly Wortham from EY runs circles around the lads and brings them to an understanding of what digital testing means in 2015. In an hour so optimized it only takes 45 minutes, it's 2015's penultimate episode of the Digital Analytics Power Hour.

People, places, and things mentioned in this episode include:

  • Taguchi vs. Full Factorial test design
  • kelly dot wortham at ey dot com (to get added to Kelly’s twice-monthly testing teleconference)

 

Episode Transcript

The following is a straight-up machine transcription. It has not been human-reviewed or human-corrected. We apologize on behalf of the machines for any text that winds up being incorrect, nonsensical, or offensive. We have asked the machine to do better, but it simply responds with, "I'm sorry, Dave. I'm afraid I can't do that."

[00:00:25] Hi everyone. Welcome to the Digital Analytics Power Hour.

[00:00:29] This is Episode 25. You know, in a slight paraphrase of the R.E.M. classic: everybody tests... sometimes. Have you ever asked yourself the question, why not all the time? I know we have. That's what our episode is about today. So we're heading for testing excellence, straight as the crow flies. As always, I'm joined by my intrepid co-host. That would be Tim Wilson, senior partner at Analytics Demystified. And of course his sworn enemy in Ottawa, Canada, the double CEO of Napkyn and beverage systems, Jim Cain. Hi, can you sing the whole episode please?

[00:01:18] Well adorable.

[00:01:22] I think, I think we have you singing, Jim, on a past episode. It will never be topped. So.

[00:01:28] OK. To help us in this conversation, who better to lend a hand than our guest: the Czar of variations, the Sultan of significance, and the Nostradamus of null hypotheses. That's right, I'm talking about the one, the only, Kelly Wortham. Welcome, Kelly. Thank you. I'll note that I made those titles up, so it was not Kelly's doing; Kelly didn't tell us to read that. Now let me give you some background on Kelly. Currently she is the conversion rate optimization evangelist at EY in their analytics and strategic advisory practice, and she is leading the charge on optimization there.

[00:02:12] Prior to that she led the A/B testing program at Dell, and she's been out in the industry in this space for 15 years. We're very excited to get such a seasoned professional in our field, and in this field specifically around A/B testing and conversion rate optimization. So it's a real pleasure to have you here. Let's dive right into this. You know, we can go down either Question A or Question B. So which one do you guys want to start with?

[00:02:38] Now, can we ask one person Question A and one person Question B and have them answer simultaneously?

[00:02:44] I think that's a good first question for Kelly. How would that test work?

[00:02:51] I think tonight we’re going to make it so easy.

[00:02:55] We're going to be worried a lot about interaction effects, because we're going to be all over the place. Let's get in there. I'd love to start by talking about the process of setting up and running a testing program.

[00:03:06] I think the world is sold on the concept of testing, right? Everyone believes in it. Everybody wants to do it. Whoever's doing a little wants to do more, and whoever's not doing any wants to do some. But it's pretty rare to find organizations that are doing a really good job. So I'd like to throw that out first: what goes into building out a really good testing program, and what would be the first steps you might take?

[00:03:30] Oh good, start with an easy question, of course. That's right down the middle, right there.

[00:03:37] Yeah, low-hanging fruit. Certainly there is no simple answer for how you do it right. There are a lot of answers for what you shouldn't do. But I would say the number one thing that I see organizations that want to get into testing do wrong is that they fail to do the upfront planning and ask just the simple questions: you know, what are the business problems that we're trying to solve?

[00:04:01] It's something we do with any analytics problem, so why wouldn't we do the same thing with a testing program? You need to determine what the goal of the program is: what are we trying to fix, what are we trying to address, what are we trying to improve? And you need to ask all those questions and get all those answers before you determine what your program is going to be, before you even choose what tool you're going to use to get there. Too often I see those questions being asked one or two years in, once the program has stopped delivering those massive returns on the initial low-hanging fruit. So that's the first thing; just like it's crucial in analytics, I think it's crucial in testing. The second thing I think is really important is to not just focus on the low-hanging fruit, on fixing the things that your analytics team or maybe the business identify: hey, this is broken, and you can see in the data it's broken. Just fix it. You don't necessarily need to spend your limited resources on potentially expensive testing when you can just go fix something, so you have to determine when you should test and when you should just go fix. A lot of times people come out of the gate, and there was even a book written, you know, always be testing, ABT, right? And I just think that sets you up for failure, because you do have limited resources: people and traffic, as well as, of course, finances.

[00:05:32] And if you can target those testing efforts to the areas that make sense, where you actually need to get an answer because the analytics can't get you that answer, then you have a better result out of the gate and you have a longer tail of success with your program.

[00:05:49] So have you seen cases where basically analytics is clicking along, and they're just kind of keeping a running log of the things that can't be answered with analytics, or can't meaningfully be answered with analytics, and kind of using that to build up a case for some level of focused testing? Is that one of the approaches? Or is that utopia, right?

[00:06:13] And I have seen it. It is rare, but it requires close alignment between analytics and testing, or the possibility that analytics and testing are even within the same organization.

[00:06:24] But there has to be this organizational consensus that analytics is there to answer questions, and that analytics won't be able to answer all the questions. And that requires, honestly, for the analytics folks to be kind of honest as well, I guess, because nobody wants to say, hey, I can't answer that question. But when they work really closely with the testing team, it's not, I can't answer that question. It's, hey, I need help. I'm going to phone a friend, and that friend is the testing program, and we're going to get that answer to you together. And they work at it together. So you have this beautiful backlog of, here are all the questions that we're being asked. Anything the analytics team can answer, they answer and go execute on; anything they can't answer, they make a recommendation to go test. And based on the analytics, they make a recommendation of how to test it and what the hypotheses might be and what those recipes might look like. And then you have a much stronger test design and a much better chance of getting successful results.

[00:07:24] So heading right off on a tangent there: you were talking about the analytics team and the testing team. So getting started, for teams where, certainly, analysts want to be doing testing, especially if they feel like they're in a rut: do you see it as, typically, you really need to say, we're going to do testing, we need to have a testing organization that is parallel to the analytics organization, and we need to staff that independently? Or can there be a dipping of toes in the water by saying, we're going to get a designer, a little bit of support from our creative team or agency, and we're going to get a developer who's kind of interested, and we're going to have an analyst come dip their toe in? Or do you see it as, typically, you have to say, look, this is a separate group that needs to be established?

[00:08:15] Yes. I mean, how many times do we see organizations willing to throw money at tools and software but not at people? Hey, your people are your most important resource.

[00:08:27] Well, that's the whole, you know, "always be testing" pitch. Optimization is like, just put a snippet on the site and start testing. You're done, right?

[00:08:37] It's all good. Anybody can do it. It's drag and drop. It's amazing.

[00:08:44] Yeah, I get that a lot. I get a lot of... of course the testing tools do their sales thing too. You know, to the HiPPOs, right?

[00:08:54] They go to the executives and they show them how easy it is to use, and they convince the executives that this is the right tool and that we don't need to hire really talented people, because "even I could do it," right? And then what happens? Well, yes, it's true, the out-of-the-box functionality is super easy to use, and a fifth grader could probably figure it out. But it doesn't work on your site, great, because you have some custom implementation requirements that end up making you do all that DOM manipulation and JavaScript. OK, now we need to hire a developer. Oh, by the way, the test designs that we start throwing out there are a little bit more complex than just moving things around the page. So now we need a designer, because we want to change more than just the location of this banner. And as it gets more and more complex, you need more and more resources, and you start borrowing throughout the org, and it's very disjointed, and people get jealous because you're taking their people, and it just goes crazy. But if you start the whole thing and you say, look, I need one developer, I need one designer, an analyst, and I need a project manager, and I'm starting with those people, and that team should be led by somebody who is familiar with A/B testing: start there. You know, you can even use 0.5 FTE resources if you need to, so that you can build a dotted-line team.

[00:10:20] But don't think that you can start testing just by saying, hey, I'm going to take this analyst from my digital analytics team and I'm going to put them over testing and everything is going to be great, because that's just not enough. It's not just one thing; being an expert in one piece of the puzzle won't help you solve the whole puzzle.

[00:10:43] So is a testing expert or a conversion rate optimization professional... is that one of those things where you hit a certain level as a digital analyst and you're just kind of wired that way, and it's something you graduate to? Or is it really, in your estimation, a totally separate discipline within measurement?

[00:11:02] That’s a fantastic question.

[00:11:04] And that's not, that's not me just hesitating. That's a really good question.

[00:11:10] I think it's a little bit of both. I think it is possible for any digital analyst, and I am making the distinction there because there is a difference, as we all know, in the type of data that you have in analytics versus digital analytics. So I think it's possible for any digital analyst to learn and change their mindset in a way to really fully embrace what's possible in A/B testing, multivariate testing, conversion rate optimization, whatever you want to call it. I think it's possible to get there. It's also possible to start there. Whereas in digital analytics, I don't think you could say that someone who has never touched testing is an expert in testing; in testing, you could be an expert and not have a digital analytics background. I think that's what makes it unique, because digital analytics is really more about understanding a lot of technical stuff, the implementation stuff, the backend stuff. In testing, as long as you can read a report or a deep-dive analysis, and you understand the very basic elements of statistics, whatever you took in Stats 101, if you've got that level of understanding, you're OK, and you can really become an expert in the field. I don't think that's the case in digital analytics; digital analytics is just so much broader, you have to do so much to become an expert. Does that make sense?

[00:12:40] Yes. I'm surprised at the direction, because again, testing to me isn't harder than analysis, but it does seem like a level up, you know? And so I was surprised to hear you say that it's possible to get into a career in testing without ever having to start off as being good at working with data.

[00:13:01] I will tell you that at Dell I had a team of absolute rockstars.

[00:13:07] I still miss every single one of them, and I want to work with them again. And I'm talking about the testing side, not the analytics that supported the testing. On the testing side, I had three who came to us from the analytics side, analysts who just really loved supporting the testing team and decided that they wanted to start managing testing programs. And I had two who had absolutely no analytics background. One of them was a certified Scrum Master, and the other one actually came to us from IT, and he was just fascinated with the stuff we used to ask him to do to help us make tests happen, and he came over and ended up managing some of the most important work that we had. So you had two people with no background, not only in analytics but no background in digital analytics, who were fantastic, and three who came from analytics. And all five of them, I would say, are now experts in the field after a bit of time in testing. So does it help? Sure. But I also think sometimes it can get in the way, because sometimes the folks that don't come from digital analytics, their minds are a little bit broader. They ask bigger questions sometimes, because they're not limited by what they think they can answer with the data.

[00:14:28] Well, that's interesting. Yeah, I've always said testing is kind of a capability that's part of an analyst's toolkit, but I like your point, Kelly, that someone could go into testing and optimization specifically without necessarily building five years of competency in digital analytics. And I think that's something a lot of organizations probably don't think about, or may miss, when thinking about how to build their programs. And a lot of times, you know, you've been kind of hitting on some of the skill requirements: people with a real affinity for and understanding of user experience design and those kinds of things can be phenomenal at proposing and ideating around really great testing opportunities. Whereas, to your point, and your last point was really good, we do sometimes get stuck in the data; we know what data we can or can't trust out of our tools, and so we tend to work from there and use that as our starting point. And so that's really cool.

[00:15:29] I'm now excited about the Scrum Master thing. You know, if you were coming in with an agile mindset, doesn't that lend itself to testing very well? It does seem like that's kind of congruent with the concepts. Whereas sometimes with testing it's like, oh, I need to run this until I have over...

[00:15:48] Over and over and over, like I need to meditate on it. Yeah, yeah.

[00:15:54] Right. Because the analyst is looking at it from a statistical perspective, and they're saying, look, you've only reached 85 percent confidence, and you need to run this test this much longer, and therefore we don't get any answers. The person with no analytics background is going to say, we're not saving lives here. You know, before we ran this test it was 50/50; now it's 85 percent. Let's make a decision and move. And that allows you to move a lot faster, and it can be a little bit more risky too. But again, we're not saving lives here; we're talking about very, very minimal risk. And sometimes I think analysts can... you know, we're kind of used to somebody asking you a question and giving you six weeks to give them an answer, and that's not always the case with testing. They're like, OK, you've got this three-week window, you get this much traffic, give me an answer. And the folks in testing kind of get used to that, and you can call it lower standards, you can call it whatever you want, but at the end of the day we're both giving answers. They're just different types of answers, maybe with different levels of confidence, and maybe you don't want to put as much weight on the answers with a lower level of statistical confidence, but at least it's better than that coin toss.
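
A quick aside for readers who want to see where a number like that 85 percent comes from: here is a minimal sketch, with entirely made-up traffic numbers, of how a testing tool might compute the confidence that recipe B beats recipe A using a standard pooled two-proportion z-test (normal approximation). The function name and figures are illustrative, not from the episode.

```python
from statistics import NormalDist

def confidence_b_beats_a(conv_a, n_a, conv_b, n_b):
    """One-sided confidence that B's true conversion rate exceeds A's,
    via a pooled two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return NormalDist().cdf(z)

# Hypothetical test: 500 vs. 532 conversions out of 10,000 visitors per recipe.
print(f"{confidence_b_beats_a(500, 10_000, 532, 10_000):.0%}")  # ~85%
```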

[00:17:06] I was actually talking to somebody within the last week, and she was saying, no, you have to hit 95 percent confidence. I'm like, you know, I'm not an expert, but I've talked to people, and these are people in marketing, and it's messy, and that's setting a really high bar where you're going to have to collect a lot of data for a long time on a fairly substantial change. I had half a mind to go, come on guys, let's live a little. We're in the real world, we're not in academia; we can drop that a little bit, right?
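
Tim's point, that a 95 percent bar means collecting a lot of data for a long time, can be made concrete with the standard sample-size approximation for comparing two proportions. A rough sketch, with illustrative numbers:

```python
from statistics import NormalDist

def visitors_per_recipe(base_rate, relative_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per recipe to detect a relative lift
    at the given two-sided significance level and statistical power."""
    z = NormalDist().inv_cdf
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z(1 - alpha / 2) + z(power)) ** 2 * variance / (p2 - p1) ** 2

# Detecting a 5% relative lift on a 3% conversion rate at the 95% bar:
print(round(visitors_per_recipe(0.03, 0.05)))  # ~208,000 visitors per recipe
```

Relax the confidence requirement and that number shrinks fast, which is exactly the trade-off being debated here.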

[00:17:39] You know, this is a really weird analogy, so forgive me, but it's something my kids were learning about in school. They all said that with the advent of the washing machine and the dishwasher and all these things, people wouldn't have to do as much housework, but statistically that's not what happened. People did the dishes more often and washed their clothes more often because it was easier to do, so it actually increased the amount of work. Ironically, that's what we do with data: the moment we put those statistics out there and we tell people, hey, you want to reach 95 percent confidence, we think, oh, I'll just let it run a little bit longer, I'll get to that level of confidence. And it's ridiculous; there's no need for it. There was an organization that I had a chance to work with since joining EY that created an algorithm to determine what their statistical confidence threshold should be for each and every test they launched, based on the potential risk and the potential impact. And they said, look, if the risk is low and the impact is low, then confidence can be low. If the risk is low and the impact is high, positive or negative, then we can also keep the statistical confidence kind of low, because we're not going to hurt ourselves, and there's a chance that we can get a good answer quick and go. There's no harm in running it. Exactly, exactly.

[00:19:14] Those are tests you can run, right? And if I have something where the risk is high, now I'm going to jack up my statistical confidence requirements, and that test will run longer, because there's a reason: maybe it's in your checkout funnel or something, where the risk of being wrong is higher. And that was the first time I had talked with an organization that was a little bit more laissez-faire in how they treated it.
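
The episode doesn't spell out that organization's actual algorithm, but the shape of the idea can be sketched as a simple lookup: the confidence required before calling a test is a function of the test's risk and potential impact. All thresholds and names below are hypothetical.

```python
# Hypothetical thresholds; the episode does not give the real numbers.
REQUIRED_CONFIDENCE = {
    ("low", "low"):   0.80,  # cheap to be wrong, little to gain: call it early
    ("low", "high"):  0.80,  # low risk, big upside: also fine to call early
    ("high", "low"):  0.95,  # probably not worth testing at all
    ("high", "high"): 0.95,  # e.g. checkout funnel: be sure before acting
}

def can_call_test(risk, impact, observed_confidence):
    """Call the test once observed confidence clears the risk/impact bar."""
    return observed_confidence >= REQUIRED_CONFIDENCE[(risk, impact)]

print(can_call_test("low", "high", 0.85))   # True: act at 85%
print(can_call_test("high", "high", 0.85))  # False: keep the test running
```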

[00:19:43] What kind of a retailer are they?

[00:19:46] You know, we can neither confirm nor deny that the company we're discussing had flexible risk ideas... nor deny it, or... that.

[00:19:57] This is... he always does this. You have to put, like, Monty Python intermission music over it.

[00:20:05] So, you are not required to answer the question. I'll retract the question.

[00:20:12] It seems like a really mature concept. That doesn't seem like something a junior analyst proposed. That seems like a really sharp executive who said, well, shit, you know, let's not treat all things the same.

[00:20:27] Yeah, it was an MIT-graduate statistician. He said, you know, we don't need to do this, and they were able to set up the model and prove it out.

[00:20:42] Tim, meanwhile, is taking that as a validation of all his testing efforts.

[00:20:54] Thank you. Love it. OK, so this is great. I don't know to what extent, Kelly, you want to dive into the weeds of testing, but I think our listeners would probably get a lot out of things like, hey, what steps go into setting up a good test, or, hey, we've been talking about confidence and those kinds of things, how do you decide which tests to run? I mean, those are more tactical concerns, but I think they would be interesting. And again, I don't want to ask you to spill all the goods, but at the same time I think everybody would love to hear, and we can compare notes, because, you know, I only test at 95 percent confidence and that's it.

[00:21:32] So I just learned something anyway.

[00:21:37] No, back to business. Well, building on top of that: to me, one of the things... I got into testing, though I'm not heavily into it now.

[00:21:47] The thing that I feel like people miss the most often is the number of roles. We touched on it, but let's talk through kind of what the pieces are and why. You know, analytics is pretty low risk, from a "you're not going to break the user experience by pulling a data warehouse export from Adobe Analytics" perspective. Whereas testing seems like it does add complexity. So when it comes to the nuts and bolts of what goes into a test, who are the people and what are the skills and the roles? Like you mentioned in one of your blog posts, all of a sudden there's the creative approval, because, yes, some of your potential consumers are seeing a different experience. Analytics doesn't have to worry about that, right?

[00:22:29] Yeah. I mean, it of course all depends on the complexity of the tests you're doing. But on the most complex tests we've run, we've had to get approvals and insights and ideas from the pricing team, the legal team, the brand team, the business owners, the design team, UX, because they're not always the same. And then of course you've got the analytics folks, the developers, and the QA team. And all of those people... in my experience, the best test managers are cat herders, and they really are spending all their time getting all of those different pieces and all of those different people aligned about the goals of the test and keeping everybody in lockstep to move things forward. And again, it totally depends on the complexity of the test. But if you do have to get all of those different people to touch the test at one stage or another, it makes for a much longer development cycle before you can get the test launched. And then on the backend, as you're analyzing it and you want to give a readout, you have a whole bunch of egos in the room that have opinions, probably very strong opinions, and some of them you're going to have to tell that their baby is ugly. And it's not easy to do. So not only do you have to be a cat herder, you have to be a diplomat. It's a very unique position, in my experience anyway, in the industry.

[00:24:05] So a bar manager would probably make a good testing team lead.

[00:24:09] Yes, yes. Harkening back to Episode 2.

[00:24:19] Do you actually shy away from tests that are copy changes that would require legal? Legal is one I haven't run into, because I've never really done offer or pricing or that sort of testing. Is that something you regularly see on teams, that there is a legal review for any test, or for certain types of tests?

[00:24:40] So some of my clients are in, you know, financial services, healthcare, pharmaceuticals, those types of organizations, and everything they do goes through legal, which makes it very, very challenging and dramatically limits the amount of testing that they're able to do.

[00:25:01] You asked an interesting question, though, which was, would you avoid it? You know, if you had the opportunity to run a test... say you're, like I was, back at Dell. Obviously not everything had to run through legal, but some of the most important tests we did had to run through legal.

[00:25:17] And so you ask, would I avoid them? And the answer is an emphatic no. I believe you need to divide your traffic into sort of a strategic lane that is saved for those big, important tests that may have a longer upfront development time and run time, and then in another lane, the majority of your traffic, depending on how much traffic you have, maybe 75 percent of it, you're running those continuous, operational optimization type tests. But, you know, pricing elasticity is an amazing test to run. Offer testing can be incredibly powerful, and so can subscription-rate offers: not just the offers but different ways of illustrating the same offers. And all of that is going to require pricing and legal and just have a lot more hoops to jump through.

[00:26:10] So I'd like to explicitly call out something that's happening right now, unless I'm wrong, and then tell me I'm full of shit. You sure as hell will not be the first person to do that to me, probably not even tonight. A lot of people who are more junior, like newer members of my team, when we talk about testing, to them it's red button versus green button, or copy A versus copy B. And the conversation that you and Tim are having is much more sophisticated than that. Every aspect of a user's web experience can be tested, and so, you know, you're talking about price, you're talking about copy, you're talking about offers; it's not red button versus blue button. Could you make a list? For someone who wants to get into testing and wrap their head around it, what are the key types of things that are going to be tested? And hopefully touch on behavioral targeting as well. That's my favorite, there.

[00:27:00] So I'm going to get there, and you'll bring me back if I get too far off.

[00:27:05] But I wanted to say one thing, because before we got off onto the last discussion, you asked one of the original questions about what you need to do to have a successful test, which I think fits nicely with the question you just asked. And I wanted to answer that I really feel passionately that the number one thing you need to do when you're thinking about your test, regardless of whether it's pricing elasticity or even button color, is to determine what it will mean if your results are X, Y, or Z. So you've already gone through the effort of saying, this is the question, this is the goal, this is our hypothesis, and here are our different recipes. Now, hopefully, you have a KPI, and you've said, this is the main thing we're focused on, and you may even have some secondary metrics that you want to make sure don't go down dramatically, right? But you're going to focus on that KPI, and you're going to say: if B is higher than A in, I don't know, maybe conversion rate, it means this, and we will take this action. If C is greater than A but not B, it means this, and we will do this. You create this huge long matrix that lays out all the different possible results of your test, and in the process of doing that, you will learn very quickly that you don't need B, C, and D. Or maybe you need E and F as well, because you haven't been able to...

[00:28:41] You have to set up your test in a way so that each recipe gives you a different answer. And if you don't do that, then you'll have a result, but you won't know why, and if you don't know why, you can't act, and then you've just wasted everyone's time and resources to get there.
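
One way to picture the results matrix Kelly describes: every outcome the test can produce is mapped to a pre-agreed action before launch, and any recipe that wouldn't change a decision gets cut. The outcomes and actions below are hypothetical examples, not from any real test in the episode.

```python
# Hypothetical pre-registered results matrix for an A/B/C test.
RESULTS_MATRIX = {
    "B > A and C > A": "Both treatments beat control: roll out the bigger winner",
    "B > A only":      "The layout change drove the lift: roll out B",
    "C > A only":      "The copy change drove the lift: roll out C",
    "no difference":   "Keep A, retire this hypothesis, reclaim the traffic",
}

def decide(outcome):
    """An outcome missing from the matrix means the test was under-designed:
    you would have a result without knowing what to do with it."""
    if outcome not in RESULTS_MATRIX:
        raise ValueError(f"Unplanned outcome {outcome!r}: redesign the test")
    return RESULTS_MATRIX[outcome]

print(decide("C > A only"))
```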

[00:28:59] I’m speechless.

[00:29:02] Music to my ears. What you're saying is, hey, before you go and do a bunch of work, why don't you think through exactly how it could play out? Instead of saying, hey, let's do the work and assume at the end of the day there has to be a magic answer and a clear way forward, it's saying it's a hell of a lot cheaper to sit with a whiteboard and play out the reasonable set of scenarios, and ask, is this going to make us behave differently in a meaningful way tomorrow than we do today? Right. So I am all for it.

[00:29:36] This is the evil cousin of "technology is the solution": oh, just do stuff, and envision that at the end of the day the answer is going to roll out and it's going to be crystal clear and point a clear way forward, as opposed to saying, how about if we just try to draw a map and see where this might go?

[00:29:58] I think it was two or three X Changes ago when Lanphier from Best Buy said, is the juice worth the squeeze? And I will never forget that phrase. It has totally changed the way I respond to every business request that comes my way. It's not just, is it going to be worth our time and effort; it's, are you going to make a decision based on this information or not?

[00:30:26] We don’t need this recipe.

[00:30:28] Yeah. And depending on the organization, that answer could be different. So absolutely, it's not always the most obvious thing. And it also puts to bed that concept you hear sometimes, that there's no such thing as a failed test, you just learn things. It's like, no: rather than running tests willy-nilly, it's better to have a plan. Definitely begin with the end in mind.

[00:30:51] It is, it is. You know, there are a lot of purists out there who do believe passionately that the purpose of testing is learning, but there I'd push back, specifically.

[00:31:03] Yeah, yeah. Let me just start here.

[00:31:07] I've already hinted at this: I think that the purpose of analytics... analytics helps you learn; optimization lets you do. And if you already have analytics to learn from, you don't need your optimization program to learn, too; you just need your optimization program to do. And sometimes your optimization program can provide evidence to support something you want to do, to, you know, fight the good battle, to stand up against team X or Y at the organization. And in that case it really is learning, but it's learning with intended action; whether or not you achieve it is up to the organization. But I think all tests should be guided by the intent of: what action are we going to take based on these results?

[00:32:01] I love it. But let's get to the question that I think everyone else has been waiting for: Taguchi or full factorial?

[00:32:11] Ah, I am deeply, deeply sorry to all of my Adobe friends and family.

[00:32:19] But I am not a fan of Taguchi. I am MVT-only, full factorial. And in fact, if you think back to my description of what I think you should do with every test as you're designing your A, B, C, D... yes, I'm actually describing a full factorial A/B test.

[00:32:38] What I love about that right there is that that was actually a joke, but you answered it very faithfully. I was like, wow. That's pretty good.

[00:32:57] Usually that question comes up when somebody in the room wants to seem smart and has never actually used the approach.

[00:33:05] Typically, that's when it gets brought to bear.
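
For anyone fuzzy on the distinction: a full factorial design runs every combination of every factor, so both main effects and interaction effects can be measured, while a Taguchi-style fractional design runs only a structured subset to save traffic, at the cost of the interactions. A minimal sketch, with hypothetical factors:

```python
from itertools import product

# Hypothetical factors for a landing-page test.
factors = {
    "headline":     ["benefit-led", "feature-led"],
    "cta_contrast": ["low", "high"],
    "hero_image":   ["product", "lifestyle"],
}

# Full factorial: every combination becomes a recipe (2 x 2 x 2 = 8 recipes),
# so every main effect and every interaction effect can be estimated.
recipes = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(recipes))  # 8

# A Taguchi-style fractional design would run only a structured subset
# (e.g. 4 of the 8) to save traffic, giving up the interaction effects.
```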

[00:33:08] Now let me answer your other question about tests, about button tests, or what good tests are. How do you know it's a good test, or what should you test?

[00:33:15] One of Jim's questions? We just kind of humor him; we just try to move on after he's asked a question. If you want to be that kind of guest, go for it.

[00:33:23] No, I think it's a good question, because in my experience, organizations that are new to testing, that's the first thing they do. They're like, oh, we'll just check out our calls to action. Let's try this banner, that banner. Or, you know, use behavioral targeting to determine what we're going to put on the banner.

[00:33:39] It's not that those are bad. It's that you're not asking the right question. Why do you want to change the button color? What evidence do you have?

[00:33:50] What data do you have? What voice-of-customer research have you done that says the color you have is not good? So if you can come back to me and say, look, we've looked at templates A, B, and C on our site, and they use different calls to action, and we noticed that in template C people click on the call to action more. And when we look at the differences, version C has a different color button, but it's also just higher contrast versus the rest of the site, so we think it's drawing more attention, more eyeballs, to the call to action, and people are more likely to click. So it's not that we want to test changing the colors of our call-to-action buttons; we want to jack up the contrast in some way, and we think the best way to do that is to change the color. And by doing that we are testing the theory that higher-contrast calls to action will drive more click-through. Now, if you run that test, it's the same test as a button color test, but on the other side of the test you haven't just learned that blue is better than green, which is a bunch of B.S. as we all know. Instead, you've learned why it's better: you've learned that high contrast is better than low contrast, or maybe you've learned that all caps is better than not all caps, or whatever the situation is that draws the eye.

[00:35:12] And the truth of the matter is, I will tell you that every single organization I have ever done any testing with has run a button test, and it has won, because change draws eyes. It's that simple. So instead of doing that, take that business case and fight to design a better hypothesis and better recipes. And you need to know up front what your results matrix is: what does it mean if X, Y, or Z happens?

[00:35:45] And that's why you get teams saying, we've used A/B testing to improve our web results by 4,000 percent, and then the main metrics are the exact same as they were before. It's because those kinds of changes regress right away to the mean; it's just that initial look and feel. That's right. That's great. All right, so as much as I really am not liking that we're getting to this moment, we really have to start to wrap up. And before we do the round robin and talk about new, interesting things we heard, or parting shots, whatever: Kelly, I am so glad you came on the show. I think this is such a great topic, and I think I speak for all of us: you covered it with such aplomb, and your deep expertise is very evident. So I think our listeners are going to have lots of great questions. Obviously you should follow Kelly Wortham on her Twitter account, @kellywortham, which will probably then force her to tweet more.

[00:36:52] You know, you get that big Digital Analytics Power Hour Twitter bump; you know how that happens. Anyway, let's do the round robin and add some closing thoughts on this show.

[00:37:06] I feel like we've just scratched the surface, though, so we need to do this again. We probably need to do two more hours together, and I will bring some examples next time and we can pick them apart.

[00:37:16] I feel like we've been saying we need to talk about testing from, like, episode 5 onward, and I think we've proved that it could almost be every other episode. So I think "scratched the surface" is a good way to put it. I've also learned that maybe if I shut up and let Jim ask a question once or twice, the questions will actually be labeled good and they'll get insightful responses, even if the questions are still horrible.

[00:37:43] Kelly and I have been practicing that question all week.

[00:37:46] I’m just glad I got it.

[00:37:50] Yeah, this has been... I feel like we've bounced around a lot, and every time Kelly was talking I was like, wow, that's a really good point, that's a really good point, that's a really good point, and I have seven other questions. So yeah, on the one hand, you could walk away from this being really intimidated by testing, which is probably not our intention. On the other hand, it's going in with eyes wide open: don't buy it if the tool vendor says just get the tool and the rest will come; that's a load of crap. A lot of thought needs to go into it. And my biggest takeaway, just because it's such music to my ears: think about what you're doing before you just do it. It's not Nike for me; it's not "just go execute and good things will come." It actually makes sense to put some thought and planning into what you're doing, why you're doing it, and who you need to help you do it. And don't shy away from stuff just because it's going to require legal approval.

[00:38:52] It could be high impact and be worth the investment. Which is not to say that Nike doesn't have an excellent program.

[00:39:03] No, that was a reference to their consumer slogan, not to their testing program.

[00:39:13] So this was a fun chat, and to me that's really what it was this time. Like Tim was saying, I think we need another two hours. We did a really good job building a baseline today for people who are new to testing. The one particular takeaway that I'll share, because I'd never thought of it before today: we talk often about how hard it is to find an analytics professional, you know, if I'm a recruiter or a company that wants an analytics person. I wonder how much harder it is to turn over a rock and find a testing person. Lightning in a jar.

[00:39:47] Yeah.

[00:39:48] You either build them, or... there are not a lot out there. Definitely not a lot. Yeah. That's my big takeaway. This was fun; let's do some more, yeah.

[00:39:59] And for me, you know, I think one of the things I'm definitely taking away is thinking a little bit harder about kind of the doctrine of having to hit one set confidence level. I kind of joked about it, but typically we'll always look for a specific confidence level and haven't historically looked at it from the perspective of impact or complexity. So those kinds of things, in terms of how you might be able to raise or lower confidence, that was something that was kind of new for me. So that was awesome. And I'll reiterate, for the third time, just how little I feel like we were able to cover, but that just means we need to do more of the same. So that's good. And now, Kelly, any parting thoughts from you?

[00:40:49] Well, it's a little ironic; I feel almost like you guys all set me up for this without even knowing it. But I have a twice-monthly teleconference group that discusses, each time, just one piece of the topics that we covered tonight, in an hour-long conversation, and all of you are of course invited to join us, as are any of your listeners. Just shoot me an email at kelly dot wortham at ey dot com and I'll add you to the invite. For those of you who have been to X Change...

[00:41:22] The whole idea was to take the X Change huddle session and put it into a teleconference. So you have people from all over the industry, from all different verticals and different levels of sophistication. Some people just built their program; some people you would call best in class. And they all get on the phone together, and we just pick a topic and kind of do a deep dive on it and learn from each other, and it's really powerful. And as you mentioned, when it feels like you've just scratched the surface in a 30-minute conversation, it's really nice to have a full hour to just pick one topic and dig in.

[00:42:04] So did you actually invite the three of us to crash your next conference call?

[00:42:09] I absolutely did. You and all your listeners.

[00:42:13] All right. That is such a great resource, and I think you will probably get a few emails from folks, and probably from me also, wanting to get a chance to listen in on the next one of those. So if you're listening, definitely rewind that part and get Kelly's email. And I think you said it was kelly dot wortham at ey dot com.

[00:42:38] That's right. Michael just restated it for you, so now you just need to say it slow one more time.

[00:42:50] It's so crazy how fast the time went tonight.

[00:42:57] I don't remember one of these recently going so quickly. Anyway, this was great. Again, if you're listening and you want to get more information, or follow up, or have questions, I think we all know who we need to talk to, or whose twice-monthly teleconference you need to join. But definitely come by our Facebook page and tell us how that went.

[00:43:24] Yeah, exactly. Definitely hit us up on our Facebook page or Twitter or on the Measure Slack, and we'll do our best to get Kelly involved with the Measure Slack and stuff like that, although she seems pretty busy, based on even just how long it took us to get her on the show. Part of that was our fault. But man, was it worth the wait. So Kelly, thank you once again. For everyone listening, we'd love to hear from you. Get out there, formulate a great plan, and then go test against that plan, and really make things happen.

[00:43:58] Don't just learn stuff; do stuff. All right. For my co-hosts Tim Wilson and Jim Cain, I'm Michael Helbling. Keep analyzing.

[00:44:10] Thanks for listening. And don't forget to join the conversation on Facebook or Twitter. We welcome your comments and questions.

[00:44:32] And what you'll need to do is just interrupt him whenever you want to talk. Jim Cain is the person who then goes after Tim Wilson for talking too much, so that's kind of his role on the show. You get a loonie and a toonie. You owe me some money. I'm getting a massive headshake from Helbling, like, what do I do? It's a great place where I feel free to express myself. I can neither confirm nor deny it. We are exceptional, man. And again, I...

[00:45:19] Thank you.
