068 – Why User Adoption of Enterprise Data Products Continues to Lag with International Institute for Analytics Executive VP Drew Smith

Experiencing Data with Brian O'Neill (Designing for Analytics)

Episode Description

Drew Smith knows how much value data analytics can add to a business when done right.

Having worked at the IKEA Group for 17 years, Drew helped the company become more data-driven, implementing successful strategies for data analytics and governance across multiple areas of the company.

Now, Drew serves as the Executive Vice President of the Analytics Leadership Consortium at the International Institute for Analytics, where he helps Fortune 1000 companies successfully leverage analytics and data science.

On this episode of Experiencing Data, Drew and I talk a lot about the factors contributing to low adoption rates of data products, how product and design-thinking approaches can help, and the value of proper one-on-one research with customers.

In our chat, we covered:

  • 'It’s bad and getting worse': Drew's take on the factors behind low adoption of data products. (1:08)
  • Decentralizing data analytics: How understanding a user's business problems by including them in the design process can lead to increased adoption of data products. (6:22)
  • The importance for business leaders to have a conceptual understanding of the algorithms used in decision-making data products. (14:43)
  • Why data analysts need to focus more on providing business value with the models they create. (18:14)
  • Looking for restless curiosity in new hires for data teams — and the importance of nurturing new learning through training. (22:19)
  • The value of spending one-on-one time with end-users to research their decision-making process before creating a data product. (27:00)
  • User-informed data products: The benefits of design and product-thinking approaches when creating data analytics solutions. (33:04)
  • How Drew's view of data analytics has changed over 15 years in the field. (45:34)

Quotes from Today’s Episode

“I think as we [decentralize analytics back to functional areas] — as firms keep all of the good parts of centralizing, and pitch out the stuff that doesn’t work — I think we’ll start to see some changes [when it comes to the adoption of data products.]” - Drew (10:07)

“I think data people need to accept that the outcome is not the model — the outcome is a business performance which is measurable, material, and worth the change.” - Drew (21:52)

“We talk about the concept of outcomes over outputs a lot on this podcast, and it’s really about understanding what is the downstream [effect] that emerges from the thing I made. Nobody really wants the thing you made; they just want the result of the thing you made. We have to explore what that is earlier in the process — and asking, “Why?” is very important.” - Brian (22:21)

“I have often said that my favorite people in the room, wherever I am, aren’t the smartest, it’s the most curious.” - Drew (23:55)

“For engineers and people that make things, it’s a lot more fun to make stuff that gets used. Just at the simplest level, the fact that someone cared and it didn’t just get shelved, and especially when you spent half your year on this thing, and your performance review is tied to it, it’s just more enjoyable to work on it when someone’s happy with the outcome.” - Brian (33:04)

“Product thinking starts with the assumption that ‘this is a good product,’ it’s usable and it’s making our business better, but it’s not finished. It’s a continuous loop. It’s feeding back in data through its exhaust. The user is using it — maybe even in ways I didn’t imagine — and those ways are better than I imagined, or worse than I imagined, or different than I imagined, but they inform the product.” - Drew (36:35)

Brian: Welcome back to Experiencing Data. This is Brian O’Neill, and I have Drew Smith from the IIA here. Drew, how’s it going?

Drew: Very good. Thank you for getting the acronym in the full name. Now, I don’t have to say International Institute for Analytics—

Brian: Yes.

Drew: —but I just did.

Brian: It’s a lot of words in there. And you run the Analytics Leadership Consortium. What is that?

Drew: Why not throw in even more words and more strange words? So, the Analytics Leadership Consortium is a closed—a roundtable of analytics executives from different companies in different industries who meet regularly to discuss both analytics best practices and analytics innovations. And really happy to be here.

Brian: Yeah, yeah. It’s great to have you. We’ve been chatting for a while, I’ve known IIA for—I have to actually give some props to IIA because—actually, it was David Alice. I remember meeting him in the hallway at Strata, New York, I think 2017, or something. It was my first big speaking gig; we got lost in the hallway and met. And he was really interested in the work that I was doing because I was very much in the seed stage of figuring out, does anyone care about human-centered design and user experience in this world of internal analytics, not in the software space so much but in the enterprise space.

And he seemed really interested, and so I appreciate that you guys were willing to amplify some of the work that I was doing, and my ideas and stuff. So, props to IIA for that. And we’ve gotten to know each other in the Data Strategy Show, and Samir, and a bunch of stuff, so I wanted to have you on here to talk a little bit about your perspectives, because I think strategy is kind of misunderstood, and at the same time, we’re having this big problem with adoption of data products. So, the things that data science and analytics teams make, whether they’re applications or dashboards or models, or all these things that are going on in the factory—and yet 20 years later, the surveys I read and the stuff I hear over the phone, on calls with clients and seminar students, say there’s still low use. Like, what is broken? [laugh]. Why is this so broken after, now, decades? It’s like, you just changed the marketing terms for these things, and it’s still broken. What is broken? Why is it broken?

Drew: Well, sh—let’s start with the simple questions.

Brian: [laugh].

Drew: Yeah, but you know what, I am going to take you on further. It’s broken and getting more broken-er. So, it’s bad and getting worse. That’s at least what the surveys say. And it’s worse from two vectors: so it’s worse because—[unintelligible 00:03:01] maybe several vectors.

So, let’s start with the fact that people use data-informed products to make their life better every day. When they were allowed to travel, it’s how they got to the airport via Uber. Now that they’re not allowed to travel, they’ve forgotten how brilliant the algorithm is that recommends the next Netflix show, and all of that stuff. And even things they don’t know have happened, whether it’s changes in what kind of toilet paper is being advertised to them. So, people are really using these products and appreciating them as products or services, maybe not knowing they’re data-driven, but that doesn’t matter.

So, that is creating a little feeling for people that this should be easier. How come I can get an Uber faster than I can get my sales figures? This doesn’t make sense. So, there’s just a little cognitive dissonance between what I live in my life and what I live at my work. The other reason it’s—so that makes it an expectations gap.

So, now I think you, my internal analytics team, should be doing much better than you are. And then the other thing is, I think people have become focused on the wrong things; I think they’ve become focused on the volume of data, or the diversity of data, or any of the [unintelligible 00:04:17] you want to use. And they’ve been focused on the technology because they think, and they’ve been told, that technology will save them. A larger cloud footprint will save you, a new data tool: DataIQ, Databricks, DataRobot.

Brian: I thought Snowflake was saving the world right now, as of May 4th, 2021. Aren’t they the ones saving the world?

Drew: Well, they’re saving the world and making a whole lot of people very wealthy. So, you know when—

Brian: [laugh].

Drew: —too. No, but you’re right. So, Snowflake is a great example. And actually, I know people who use Snowflake; I know Snowflake people. It actually is a pretty good technology, but it is amazing how people obsess about this stuff.

I think it’s really fascinating even listening to leftist hippie radio, NPR, there’s advertisements selling AI software. So, I was driving the other day and I was feeling just so badly for my clients who must have to have someone in the marketing team come in and say, “Oh, you know what I heard on NPR today in between the weather and a news report about the Biden administration was that now we can do AI everywhere in the enterprise?” Like, “Please give me AI everywhere in the enterprise.” So, you have a lot of growth in the use of genuine products of value—Uber, Lyft, whatever you want to pick—you have a growth and awareness that this should be something we should be able to do, and then you have a wrong focus, in my point of view, on the technology that does the thing. And as a result, I think people are saying, “I’m not sure about this data and analytics thing anymore. Five years ago, you told me it would be big, and it hasn’t been big yet, and I’m a little skeptical.”

So, we see that in the figures. There’s a couple of different people who are not IIA who have reported on this—you know, Randy Bean at NewVantage Partners, and some other people—and we have it in our own internal research, but we tend to do client-requested research for clients only, so I’m not going to go out and say who has said it’s getting worse. But we have indications that it’s getting worse, that this whole problem of data products not getting used is becoming worse. And that’s something we as an industry need to take responsibility for.

Brian: Do you think that you have to wait for a crop of people to lose their jobs before this changes, because a new crop of leaders will come in and say, “The technology’s not going to save us”? Or is there a learning that—I mean, at some point, they’re going to stop funding these initiatives if they don’t start returning some value. And from a design lens, the way I always see this is, there are two hops. There’s the hop from no one uses it to somebody uses it, and then there’s the hop from someone uses it to value, and you can’t bypass the middle one. And a lot of places aren’t even getting the zero to one, let alone the one to two challenge. So, how do we get from zero to one?

But is this an incentives problem? Is this a leadership problem? Because someone needs to invest in zero to one, and if the leaders are the ones that are looking at technology to save every one of these initiatives, then what’s left, I guess? Because they’re the ones buying the stuff; they’re the ones saying, “I’m signing on the line.” And no ding to Snowflake; I don’t even know anything about Snowflake. I don’t care about that. But the point there is they are also the ones buying these services. So.

Drew: Yeah, I think actually—I firmly believe, and it was interesting, I was talking to somebody who I’d never talked to before and we came to sort of a similar conclusion—I think the way it’s going to get fixed is you’re going to have a decentralization of analytics back to functional areas. And why I think this will fix it, and why I think this will help us get from zero to one, and then from one to two, and from two to two hundred, is a couple things. One, I think the thought of scale meant you went from zero to one to two to two hundred thousand, when in reality, that data product could never be good enough for two hundred thousand people. It’s good enough for two hundred people.

So, there is a resetting of expectations when it comes back to functional analytics. When functional analytics comes back to being the thing, what you’ll find—and I don’t mean specifically functional analytics back to silos; we could talk about that in a sec—but what I mean is the person working with the analytics people is much more likely to be the person who says, “This is my business problem. This is what I’m prepared to do differently with my business process, based on the data and the insights I might get through your data product,” and to be much more active in the process of creating the data product to start with. I think when that happens, maybe the distance between the zero and one gets shorter, and the distance between the one and the two gets shorter, and the distance between the two and two hundred gets shorter. So, I think what’s going to have to happen is people are going to have to re-engineer a bit of their technological landscape.

And in fairness, some of the structure is already there in a good way. We’ve already battered Snowflake, but we can say nice things, which is that it’s actually relatively easy to access data in Snowflake from different functional organizations, so you don’t need to reorganize or re-engineer your data pipes. It’s the same thing with much of the cloud infrastructure. Then it will be interesting to see if functional areas take advantage—and I mean that in a very positive way—of the things that central analytics has done but that haven’t yet materialized in value. And specifically, the things they have done but that haven’t returned value yet: they’ve built data infrastructure, they’ve built metadata, they’ve built a way of working with data, they’ve built different pipelines depending upon the latency.

They’ve done things that really needed to be done; it’s just that they haven’t yet returned any value to the person who really needs to change the way she makes a decision. So, I think as we go from a centralized to a functional model, and the firms that keep all of the good stuff from centralizing pitch out the stuff that doesn’t work, I think we’ll start to see some changes there. And of course, companies are really weird organisms, and the bigger the company, the weirder the organism. If one function manages to leverage all the power of the central team and make it more user-friendly for the function, that pattern will repeat as people see it; they want to be as successful as, say, the marketing department, or the product department, or the supply chain department.

Brian: As you were talking about this—and I’m not saying customers shouldn’t have some level of self-service access to information—I wonder about that overall strategy of, we need to open up a data Home Depot in all these different neighborhoods, and as long as there’s a Home Depot there, people will make things. But it’s like, well, there are a lot of contractors that actually shop at Home Depot. A lot of homeowners aren’t going in and building an entire garage themselves. So I always wonder, if we decentralize and just change the management, is there still not a skill gap that needs to be filled? And should the data scientists, and the analytics leaders, and the BI consultants be the ones doing the work to help the business people frame problems, to model decisions—all the upstream work that happens before you write any code or do anything, just to understand: what decisions are we going to make? How do you make them now? What’s it like to be you all day long, dear finance controller, or whatever? Does that work still need to happen even if we decentralize?

Drew: Yeah, I mean, it still needs to happen, and I think for a couple of reasons. First, I’m going to admit I had trouble tracking your question because I’m in love with the analogy. I think it’s frickin brilliant. I think the Home Depot one is really truly brilliant because of exactly what you said. There’s people like me, who—I’m okay. I can hammer a nail or two.

Brian: Yeah, yeah.

Drew: But if you said to me, “Go to Home Depot so you can replace your windows.” Oh, no way. No way. Rain comes in the windows. I know what happens if I do that wrong.

So, I think you’re exactly right that the environment needs to be built—and can be built, by the way—to cater to hobbyists and craftsmen alike. It can be built to enable you to make a better lawn, and to enable someone else to add an addition to their house. I think your analogy is great. I think the key thing, though, is you still need to know what problems people face in order to structure that environment correctly. That’s what I meant when I said central analytics should stay, because there still needs to be interaction with the data engineers, and the data scientists or data analysts within the functional team.

And there needs to be somebody who says, “Oh, these three functions are typically asking for this kind of data at this kind of frequency at this kind of granularity. Okay, good. Central team, we’re going to take care to shape that up in a good way, and we’re taking it at face value that you as the data person in a function”—and I’m trying to avoid titles here—“That you as a data person in the function have understood well enough the business problem, and you understand well enough the data and the infrastructure to engage with us in the data engineering or the central data team to make the right decisions.” But you’re totally right that the starting point of that if statement is if I understand your business problem well enough and I understand enough about data, I can start to build out the infrastructure to answer your business problem. But the starting point of this [unintelligible 00:13:55] is if I understand your business problem well enough.

And I do think there’s still work to be done there. And the one thing I reject is that there’s only one way to do that. I think if you talk to a consultancy that uses a lot of blue font, they’re going to tell you that you have to have an analytics translator; if you talk to a big consultancy that uses only green font, they’re going to tell you that you need to clean your whole house out and bring only digitally native people in, to be extreme about it. But you can do either of those things. You can do rigorous upskilling programs, you can do continuous learning; you have to decide how you want to bridge the gap between your business and your data knowledge, but you have to acknowledge there’s a gap and then you have to work to bridge that gap.

Brian: If you and I are those opposite sides, and we’re crossing the GW bridge, and I’m coming from Jersey and you’re coming from New York, who needs to come farther over that bridge? Who has more work to do to meet the other person somewhere, or does only one person crossing—a little New York [unintelligible 00:15:00] analogy because I was thinking about, like, Home Depots in, like, Chelsea, and East Village, and Brooklyn. We got them all over now, everything you need. So, that’s what I’m using, I guess, a New York in my head.

Drew: I’m schizophrenic on that one. I’m going to admit to schizophrenia on that one. I’m going to answer the question in two different ways. Because my first schizophrenia is I’m willing to push business people. I’m maybe not schizophrenic.

I’m a little more aggressive on this one. I’m willing to push business people a little further than many data people are and nearly all business people are. What I mean by further is I’m actually willing to say that some of the more active decision-makers in some of the more data-driven functions—and by that I mean, marketing, sales, supply chain logistics—I’m willing to say those people need to conceptually understand the different types of algorithms or models, or whatever term you want to use, that might make their business better, and why. I’m not saying they need to understand Bayesian statistics; I’m not saying they need to understand exactly how to program a nearest neighbor. But I am saying they need to go at least understand a few things.

For example, well, if I get more data, and I run a more sophisticated algorithm, it’s likely to take more time, it could be a bit more fragile, but I might get deeper insights. They need to at least understand that conceptually, that’s a trade-off against, if I run a rather quick and easy and dirty, simple algorithm, you know, linear regression, up-down algorithm, whatever it is, I’m going to get it approximately right but I’m going to get it approximately fast. So, I’m willing to push business leaders to say they need to go a little further down that data bridge than some people say they need to do. And at the same time, I’m willing to push analytics people a little further down the bridge, and where I push them further than most is, it’s all well and good if you say, “Oh, I understand that we’re having trouble reaching the target demographic of 18 to 49-year-olds who buy sneakers more than four times a year.” Whatever it is, right?

People think, “Oh, then I understand the business.” No, I think you need to understand how is the marketing department willing to change to reach those people? Because eventually your data product, if done properly, will likely challenge the way they work. And they might be really interested in branding and being the next coolest thing, Instagram influencers, whatever is cool, I don’t really know. But really, your thing might say, “No, you need to take your prices down 20%.”

And they might say, “I’m never taking my prices down.” You need to know that, you need to be prepared to engage in that discussion in a tough battle. So, in the answer to your question, I take the cheap way out and I say both sides need to drive further from New Jersey or New York to meet in the middle, and then if we want to really abuse this analogy, they need to overlap. There needs to be a lot more overlap, I think, than there is today.

Brian: Can I reframe that question?

Drew: Yeah. Because you don’t like my answer, so you’re looking for a better answer?

Brian: I hear you. Here’s a maybe more helpful framing for our listeners. So, assuming listeners here are primarily leaders on data science and analytics teams, and they’re not coming from the business side, what would be the signs that the problem is really more us, not them, versus, “Wow, no. We’re doing pretty solid here, and they need to change”? Which sounds terrible, but the point being, my effort needs to be deployed in a different way. Is there a way to know? Because I think we all tend to think that it’s everyone else’s job to change. We kind of joked about this, right? I love change when it’s someone else that has to do it.

Drew: Exactly. Okay, I understand why you reframed the question I have maybe given our audience—I’ll say it’s our; I’m very proud to be on your show—I have maybe given our audience too much fodder to say, “Oh, it’s the business guy’s fault. This guy Drew, he said it. He said they need to understand what an algorithm is, and so now I got to go yell at them.”

Brian: [laugh]. Now, they can send this episode to their business [crosstalk 00:19:18].

Drew: Exactly. Just—and they’re def—

Brian: [laugh]. It’s your fault.

Drew: —they’re technologists, so they’re just going to snip out that part. We’re all done. No, it’s a fair judgment. And I think I talk to business leaders and senior executives probably more than I talk to model builders and serious people who have every right to call themselves ML engineers or something. So, I understand that I might be misconstrued.

And I’ll say it a different way. Let’s go back to what we said. It’s 20 years we’ve been doing this, and we’re still just talking about the potential, mostly, and not talking about things getting actually used. So, we better all accept that everyone has a lot of blame to shoulder. So, data people need to accept that the way that they have been working is not working.

And the primary thing about the way they have been working not working is obsessing about the model itself, when they need to obsess about the utility of the model. So, if I had money to invest, and I had a data team that was good at building really sophisticated models with really high levels of precision that every once in a while worked—or every once in a while got used, rather—or I had money to put on a fast and scrappy bunch that got models into production that were pretty much better than the way the business was doing things now, I’d invest in the scrappy bunch every time, because those people have a couple of characteristics. First off, they’re really obsessed about, did the business get better because of what we did? Did the business get better because of what we did? They don’t worry so much about the precision of the model, outside of it being ethical and accurate in that way.

And the other thing is, they’re always going, “Why did that work? Wait, why did that work? Why did that not work?” And that’s the kind of data people we need is the people that go, “Wait. I know. I’m brilliant. Cool. That’s fine. That’s all good. But it didn’t work. Why didn’t it work?”

I love the why question. I’ve often said that some of my favorite data people when I worked at IKEA had the attitude of a three-year-old. “Why? Yeah, but why? Yeah, but why? Yeah, but why? Okay. Now, let me go build this thing, and this is what you said, right?” “No.” “Okay. But why isn’t it what you said?”

And they don’t get worked up about it, they just go back and keep building. So, I think data people need to accept that the outcome is not the model, the outcome is a business performance which is measurable, and material, and worth the change. That’s the other thing. That’s what I mean about knowing that process. You have to know that this is worth the change, worth the change in the way I work, worth the destruction of my ego, worth whatever, but I have to build something genuinely worthwhile.

Brian: Yeah, you’re preaching to the choir here. We talk about outcomes over outputs all the time here, and it’s really understanding what is the downstream thing that emerges from the thing I made. Nobody really wants the thing you made; they just want the result of the thing you made. But we have to explore what that is earlier in the process, and the why questioning is very important. We call this laddering in the design space; that’s what I learned it’s called: laddering.

And you’re always laddering up until, bam, you will know when you get there. You will hit something in the conversation that they haven’t ever said before, or that’s never been written down before, and then you start coming back down. You’ve finally gotten to the root of, like, “I’m trying to get more customers,” or, “I’m trying to stop wasting money buying the wrong quantity of stuff for my widgets, and I can’t keep stockpiling this,” or whatever. You hit it, and then you can start coming back down into solution mode and that kind of thing. But these skills, I feel, are not readily hired for in the space.

So, is this a training thing? Is this a bring-different-kinds-of-people-in thing? The teams that you talk to that are doing this well, what made them scrappy? Am I looking for the word ‘scrappy’ in resumes? Am I looking for someone more interested in learning? How do you find scrappy people? What does that look like?

Drew: Yeah, hiring for scrappy. That’s the next blog post? It’s a great question, and I think I see it in my mind’s eye more than I can get the words out because it’s this sort of restless energy you see in people. It’s this light in their eye. I have often said that my favorite people in the room, wherever I am, aren’t the smartest, it’s the most curious.

Brian: Yeah.

Drew: Right? So, if you’re hiring a data scientist, and they aren’t asking questions like, “Who’s your worst business user? Why?” Or, “What’s their biggest challenge? What’s the biggest market opportunity you haven’t hit? What’s the thing your customers love about you? What’s the thing your customers hate about you?”

If they aren’t asking questions that are oriented towards your business, and they’re just asking, “Can I use R? Can I use Python? Do you have Snowflake?”—like, they should ask for those things, that’s okay. That’s fine. They want to know that they’re going to be in an environment in which they feel capable and able, and that they’re going to have fun in, and all that stuff. But man, if they aren’t interested in what problem you’re trying to solve—

What have you tried before? What has worked? What hasn’t worked? Who can I talk to? Who in the firm is successful? I saw this thing on GitHub—is this what you’re talking about? You want this sort of constant why, right? The toddler thing. Why? Why? Why?

So, hiring for scrappy—maybe scrappy is not the right word, maybe curious is the right word; constantly engaged. In some sense, you also have to nurture that curiosity as well. I work with some universities, we do some information exchanges and stuff like that, and I think most of them are getting better at this, but about five years ago, data science was just this thing where, of course, your university was going to have a degree for data science because it was just going to have tons of enrollees. They were getting the wrong questions from their business community partners: I need somebody who can do R; I need somebody who can build ML, dadadada.

And that’s what they gave them. So, we, the industry, went to our university partners and said, “I need more data people.” And so they manufactured data people in MBA programs and certificate programs. And then we went, “These people are horrible. [laugh]. They don’t work.”

And people have recently gone back to universities, and I know a couple of universities who put a lot of emphasis on design thinking—one of our favorites, right, the ever, ever, ever-present Bill Schmarzo, but other universities as well—who are really going, “Oh, wait a second. They definitely need to know how to code. They definitely need to know how cloud works. They definitely need to know R and Python. All good, but unless we can find a way to help them nurture that curiosity to be extremely interested in business problems…” It’s one of the reasons I think, actually, successful data scientists can also be found in those hackathons.

Look for the person that came up with a creative solution, even if it failed miserably. They totally forgot the technology bottlenecks, they totally forgot that, but they were eminently interested in a creative solution. I think that’s another opportunity as well. So, I think it is training. I think it is hiring.

And I think it’s giving feedback to our academic partners that we overcorrected for hardcore data tech chops, and we need them to come back and give us some—God help everyone—humanities in the data world.

Brian: Having been in the space for many years at this point, and not coming at it from the technology side, my general feeling from the clients and people I talk to is that nobody has a shortage of the right technical intelligence or tools. For the most part, that’s all there. Maybe they could have more muscle, but they already have muscle; they’re strong there. It’s deploying it in a way that produces some desirable change. That’s usually a change someone else wants, because most of these teams are in a service role—they should be in a service mentality.

We’re here to serve others to make their work better—the sales team, the marketing, the finance, or whatever. We’re not here in and of ourselves. But that’s not happening, and so some of the things you’re talking about, I think, are really important. And analytics doesn’t ever really answer the why part of this kind of research; you have to go out and do one-on-one ethnographic research. You need to live with a salesperson for a while; you need to go see what the supply chain manager does all day long.

If you’ve never done anything with supply chain, the chance of you repeatedly delivering value to that person is going to be very low. So, my perspective is: how can you possibly do that well, even if you’re smart? If you’re so smart, you should understand that the chance of you doing it well is going to be really low, because you know what you don’t know, which is nothing about supply chain stuff. So, go over there and ask them to spend a day together and understand what it’s like. How did you make a deci—“How did you know how many clips to buy for your widget last year? How did you know that 2 million was the number that we need for our other widget that we make?”

They made a decision somehow. That curiosity, I think, has to be there in order for you to optimize, and if you don’t care, it’s just like… I don’t know, [laugh] is this crazy? But I just, to me, it’s so obvious that you can’t possibly repeatedly do great work if you don’t know who you serve, and you don’t know what it’s like to be them, and you don’t know what keeps them up at night, and you don’t know how to make them shine, and realize that if they’re happy, they’re going to say, “Yes, please. Give me some more. Can you also do this? Can you make it faster? Can you make it better? Can I do it on my own without you?” Those are great signs.

Drew: But they’re not easy to get to. I think you said a couple of real gems in there. I want to pull a couple apart, really, right? So, firstly is, to all the data folks out there who are definitely better at data science than I, and programming and all that stuff, I will say a small contradiction against what you said, Brian. You actually can be a good partner, you can be good.

I mean, you’ll be exceptionally average for a very long time, but you will never be great. You will never ever, ever, ever be great until you deep-dive and get that understanding of the drivers of that person’s satisfaction or dissatisfaction. So, I think that the challenge is, what I would say to people is if you feel like you’re doing a good job, think about what we talk about all the time, which is the potential of data to transform. So, how could good be transformative? It simply cannot be transformative.

And the missing variable is what you say, which is going to the end-user—I don’t mind using ‘internal customers,’ I think it’s fine, but whatever you want to use, doesn’t matter—the people who really suffer on a daily basis with this. And the other thing I’ll say, I don’t know what your experience is, but we did this a couple of times; we did it when we were a very small central organization. We would go out to country organizations to learn more because sometimes it operates diff—companies operate differently in different countries, for whatever reason, or functional organizations. You’re going to have fun, actually. You’re going to have a lot of fun because there’s two things.

First off, if you’re being sent there, chances are that people need your help. How cool is that to go in and to be able to help someone? The other thing is, if you’re a data person, you’re probably a very eager learner, so you’re going to learn things you don’t know, and there are questions you don’t even know to ask. So, how cool is that? So one, the path from good to great is actually to get integrated.

Two, it’s a crap ton more fun. Three, if you want to be super career ambitious about it, the beautiful thing about being a data person, you’re normally working with people who are three or four levels up the corporate hierarchy to you. So, why not learn what it’s like in that rare air? Why not learn what it is really like to own a P&L and to be responsible for a large amount of headcount because chances are, you’re smart, talented, educated, and ambitious, and you might want to continue to be that in the data realm, but if you are going to be that in a data realm, eventually, you’re going to be responsible for managing people, you’re going to have some component of a P&L ownership, and the sooner you learn that, the better. So, get some really cool, smart people, solve a problem for them, make them happy, and then go to them and say, “Hey, Brian, I really admire your career. Could you tell me a little bit about what you’ve done?”

And you would be amazed how much faster you will grow over the person who has just said, “Oh, I’m just going to continue to build bigger algorithms with more data and more sophistication.” So, there’s lots of reasons to do it, whether it’s the fun of an interesting question, whether it’s the emotional feeling you should get if you help someone—that should be something that should motivate most humans—or whether it’s furthering your career ambitions, and it doesn’t have to be one or the other; it can be all three. So, there’s no downside. I don’t understand where this hesitation comes from. And the only thing I might hypothesize is, be prepared to be humbled.

I know, for example, some of the early analytics products that I did, I thought were super smart, and then you can have a senior who’s been experienced, and said, “Yeah, but you forgot that.” And you’re like, “Oh. How could I forget that?” Well, you could forget it because you just didn’t know it. And now you know it. So, it is a little humbling, but it’s mostly hugely rewarding to get involved in the real heart of the business, I think.

Brian: Yeah. Well, it’s also, for engineers and people that make things, it’s a lot more fun to make stuff that gets used. I mean, just at the simplest level, the fact that someone cared and it didn’t just get shelved, and especially when you spent six months, half your year on this thing, and your performance review is tied to it, it’s just more enjoyable to work on it when someone’s happy and says, “Wow, this is great. Can you do this too? Could you give us this?

Can you change the period? Can you give us a whatever?” That feels better than having to rewrite it or just tossing it. [laugh]. It’s like—so there’s almost a little self-interest there. You’re still looking at what’s good for you, but that model is also about serving somebody else.

But if we have to start even at that level, it’s like, just do it because it’s better for you to see your work live. I talked a little bit about this product thinking approach and design thinking and some of this; can you tell me a little bit about, in your perspective, what does that mean? Or what’s the difference between that and maybe the status quo, or the regular way people do stuff? Why does it matter to IIA? Why do you guys care about it at all?

Drew: Yeah, no. Sure. So, a few things. Firstly, we are far from as well-educated, trained, and capable as you are on product thinking. So, we might use some words wrong for some of your audience and—

Brian: No, that’s okay.

Drew: —rotten tomatoes can come my way; it’s okay. So, a few motivations on product thinking. As you mentioned, you met David Alice in the hallway. I’m sure one of the reasons he was excited to talk to you is he’s jazzed on product thinking. His learning comes out of running this part of our business called the assessment business where we have a very large survey we do with these very big companies, and we do some benchmark studies for best in class.

And whether it’s companies we’ve examined ourselves or companies where we’ve used some external surveys for best in class, companies that refer to and build teams around analytics products tend to outperform others in the marketplace by a pretty substantial degree. You can look at reference companies, whether that’s Amazon, or Starbucks, or whatever, it doesn’t matter, or you can just think about those examples we used earlier, right? Netflix, et cetera. So, there is empirical evidence that says a product-oriented company will outperform a differently oriented one. Now, let’s talk about what the difference is.

What we often see set against the concept of a product is a project. So, a start, and a finish. A brief and a hand-off. All these sort of classic things. And we don’t even need to go into the whole waterfall curse word and all that stuff.

And what I see markedly different, and the reason I’m super interested in product thinking—maybe not even necessarily the regiment of it—product thinking is the notion that success comes when the thing gets used and the person who uses it says, “I’m happy to use it. Give me more.” As you said. So, that’s a virtue of a product for me, and you only get there if you think about the model as a product. The other thing is that a product, especially in a digital environment, but truly in a real environment as well, right?

I mean, I had a 1996 Toyota Corolla, and I can tell you the 2021 Toyota Corolla is slightly better than the ’96 Corolla. But in digital products, you can make those improvements not over decades, but over days. So, also product thinking starts with the assumption that this is a good product and it’s usable and it’s making our business better, but it’s not finished in that sense. It’s a continuous loop. It’s feeding back in data through its exhaust.

The user is using it, maybe even in ways I didn’t imagine, and those ways are better than I imagined, or worse than I imagined, or different than I imagined, but they inform the product. So, we like it because, one, we know empirically it works, and we like it because we think it instills that utility thinking into data teams in a regimented way. But if you want to take the next level down, those who are willing to be more serious than just product thinking can very quickly, through your work and other people’s work, find frameworks, team structures, role descriptions that will support that thinking. So, to us, it’s a relatively well-mapped-out field through software first, and then it found its way into data. So, it’s already a proven method that works, and uses data at its core, so why are you going to sit around and look for something else when you got something that works?

So, we’re big fans of that in that way. And we also think, if I can add one more thing (and I’m really interested in your feedback on my inaccurate answer), that the role—role, not a job title; not a human, but the role—of someone who maybe is a product manager, or even maybe over several products, is also really inspirational and really useful. Because that is the person who says, together with the user, “Yes, this thing works, but it could work better.” And especially when you start to think about data products with a plural on it, the beautiful thing is so many of those products have the same underlying infrastructure. So, a churn model versus a retention model in marketing is the same thing flipped on its head, so the fundamentals of those two data products are the same.

Slightly different math, maybe slightly different UI and orientation, all that stuff, but isn’t that great? Somebody looks at those two things and makes an improvement on one that makes an improvement on the other, and an improvement on one makes an improvement on the other, and so forth and so on. So, I’m really interested because I think it has an ability to solve this problem that we started with, which is: why don’t things get used? Because people start with the wrong mindset. But if you start with a product mindset, I think you’re in a way better position.
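Drew’s point that a churn model and a retention model share the same fundamentals can be made concrete: the retention probability is simply the complement of the churn probability, so one shared estimate drives both products. Here is a minimal sketch; the scoring rule and field names are purely hypothetical stand-ins for a fitted model.

```python
# A churn model and a retention model built on one underlying estimate:
# retention probability is simply the complement of churn probability.

def churn_probability(features: dict) -> float:
    """Toy stand-in for a fitted model's predicted churn probability.

    Hypothetical rule: churn risk grows with months of inactivity,
    capped at 1.0. A real model would be trained on historical data.
    """
    months_inactive = features.get("months_inactive", 0)
    return min(1.0, months_inactive / 12)

def retention_probability(features: dict) -> float:
    """The 'retention model' is the churn model flipped on its head."""
    return 1.0 - churn_probability(features)

customer = {"months_inactive": 3}
print(churn_probability(customer))      # 0.25
print(retention_probability(customer))  # 0.75
```

Because both views share the scoring logic, any improvement to the underlying estimate automatically improves both, which is the propagation effect Drew describes.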

Brian: I do too. And I’m always suspicious of seeing the world through the lens of not only something that I consult in, but just because that’s the world I know and I live in, but when I hear companies are always—they’re envious of the digital natives, and the startup that’s going to eat their lunch, and all of this and I’m like, you copied everything about what they do with engineering and data stuff, and you copied none of the roles like product designers, UX researchers, and product managers who actually sit above all of that and make sure that there’s a value delivery that happens, that the strategy is right. Just as you talked about, which things go in the backlog? Is this going to support both the churn and the retention model? Am I giving love to one department here?

What’s the business cost of doing it that way? And I am seeing that change. This data product manager or product owner role seems to be emerging. Sometimes that has the word AI in it, which I think is okay in some places. I have some issues with that because it suggests a technique built into the title, which may not always be the thing you need.

But you [laugh] need the skill of product management, but you don’t necessarily need AI for everything. But that’s a minor detail, I guess. Do you think it’s somewhere as simple as that where they’re not copying all of the models of the way some software companies work, but they want to be like them? I don’t know if it’s that simple, but I do see that as part of it. Less about hiring the bodies as much as, like, rapid pace of learning, focus on delivering something, getting feedback.

And those companies, they all—everyone has their challenges with doing this model, but there is an interest in understanding the problem space, shipping small iterations, which aren’t the same as increments of work, but iterations of work, and the feedback loop of understanding why, and that hammering away at the problem space. It’s more that, as you said, it’s not the titles, I guess, or the individual human beings as much as the roles and the responsibilities. Is that—anyhow, I don’t know if there was a question in there, but I wonder if they’ve just half-copied the model, you know? [laugh]. No pun intended.

Drew: Yeah, exactly. I think, obviously, people are biased to pick and choose the things that don’t threaten them. So, if you think about product thinking, there’s really no threat to a marketing manager on product thinking because someone is just giving them help in a different way. I think what they haven’t been willing to do is go the full monty. And I do also caution all of us, myself included and you as well: software can be inspirational, but we should be careful it’s not a dead-on co—people shouldn’t try to dead-on copy and paste. It is a slightly different world; in software, the software is the product, when in reality, a data product oftentimes informs the product or makes the product better. And so how do we want to think about that? I think people do need to put a little thinking into the adaptation—

Brian: Yeah, yes, yes.

Drew: —but I think, by and large, this goes back to what I was saying in terms of some things being pulled out of the central analytics function. I think that when we pull out of the central analytics function and we go more to a hub and spoke—which, by the way, is the model used at places like Amazon, et cetera—what you will find is somebody taking more ownership for the outcome, closer to how the outcome is used. And that’s what software does better, I think, than the current way it’s done in the productization of data analytics. So, I think there needs to be a little bit less insulation between the data product and the person who uses that data product in their daily work. Then I think that two things will happen.

I think, first off, we talked a lot about the education of data people, but the truth is, also if you’re getting an MBA with a focus on marketing, you’re getting hardcore data chops, probably much more than somebody did as little as five years ago, so that person is going to come in and they’re going to go, “You know, I’m the product manager of the churn model. I need these following roles. I’m not a data scientist, so I need one of those. I need an ML engineer because I think we have the possibility to do it in machine learning,” blah, blah, blah.

And then the central data person says, “Eh, no. You’re not going to get an ML engineer because you’re”—that discussion can happen because the starting point was, “Yeah, I’m the product manager. I’m the churn model product manager, and I have demands towards the organization to support me with roles and structures that enable me to create that.” Right now, it’s kind of pushing that concept towards people who don’t really see themselves as the product owner, if you will, of the churn model. “No, I’m in charge of customer retention.” “Yeah, I agree, and one of your roles is to own the churn model.”

Again, it’s not a job title; it’s just a role. Yes, we need you to make sure you keep as many customers as possible, and if that’s calling people and saying, “We love you. Please stay with us.” Or if it’s whether you’re getting support from the analytics team to build out a sophisticated model to know who to call, don’t really care as long as you take responsibility for not only the big umbrella thing but the components beneath.

Brian: Yeah, yeah. Yeah, I’m with you. I think understanding that someone does need to own those decisions and that you’ll have a better outcome with that is the first step, whoever it is, wherever they report, [laugh] you know, all of that. I do wonder sometimes about certain business stakeholders’ ability to properly quote, “Product-own” something like a churn model if they don’t have some type of software background. They may have some data background, but if they haven’t built software before, I think there has to be a good handoff or a negotiation, or conversation between people that know how to do that in that role.

And it’s not that they can’t learn it, but that could become your whole job, just moni—I mean, at the enterprise scale, that could be a standalone product; that could be an entire business, potentially. I mean, I’m seeing this happen with some of my seminar students. It’s just like, “We built this thing in-house, and we showed it to our partner, and they all want to use it now in their own stack.” And it’s like, that’s an entire startup business, just that one project you guys did. But that means you need all the support that comes with that, you need to think about—it has to have legs when it goes wrong.

There’s all those things you have to think about there. So, I mean, we could go in a whole ‘nother branch with that, but I don’t have you forever, and I wanted to—first of all, I love this conversation: it has been fantastic. But I wanted to ask you—given that you’ve been in the field for a while, when you look back, is there any big thing that you changed your mind about with analytics or data where you did a 180? Or maybe not even entirely a 180, but you really changed something? I’m always interested in how people make changes.

Drew: Yeah, yeah, change is a fascinating subject. One little factoid on that one: I always love the fact that when people have really bad heart conditions, like, really bad, like, you’re going to die, two-thirds of those people do not change their diet or routine. So, the upshot of that is two-thirds of people choose death over change.

Brian: Is that a commercial? [laugh]. Is that your new product commercial? Two-thirds choose death over—[laugh].

Drew: What would you be selling? I’m not quite sure. But no, but isn’t that fascinating? So, the human brain, which we are still dealing with—all the data in the world, we’re still eventually hitting the human brain—has a tendency not to change. So, that’s to sort of say, the answer to your question is probably not as much as I should have, Brian.

I think the organizational model is one of them, and I’ll tell you that it was a little bit of reflection why I came away from strong faith in central analytics. And that is, I might have—this is the most evasive, political answer ever—

Brian: [laugh].

Drew: —I might have been one of those people who sometimes do—

Brian: [laugh].

Drew: —look upon building a large organization as a measure of success. So, central analytics organizations, they get up to, like, 200, or 300, 500, 5000 people. That’s a really celebrated role. I mean, you are at the top of the corporate mountain. But you’re not doing the company any good.

You’re starving the rest of the organization of data and analytics talent. So, I have made a bit of a 180, to say, that is a real short-term thing. You have to go through central analytics if you’re early in your journey because you do need to consolidate some critical things, data infrastructure, data governance, et cetera, but if you allow yourself to stay there, you need to sit back and reflect on why you’re there. Is it your ego, as it might have been in my case? Is it a lack of trust in the business owners, which it also could have been? You need to do some reflection and figure it out, because it’s going to be unhealthy long-term. So, that’s one thing I flipped on.

One thing I’ve become more convinced of, actually—you didn’t ask that question. But I’m a politician, so I got to answer it anyway—I’m super convinced that there is this interesting, ironic thing where as the roles become more specialized, the leadership needs to become more generalized. It sounds ironic or paradoxical, or whatever the fancy word for it is, but I think when I talk to people who are able to—we have a couple of experts in the network who are able to talk really deeply about using very, very large volumes of data to build very sophisticated machine learning programs. They very quickly go into a space where—I’ve been in rooms with, like, 50 people, most of whom are smarter on this stuff than I, but they’re quickly—they’re gone. No one follows them anymore.

That doesn’t mean what they’re saying isn’t important and doesn’t add value because I’ve seen the output of their work. So, I think it increasingly becomes important that, as that specialization increases, people who have an ability to get those specialized people to work effectively together—especially different personalities: an ML engineer and a product manager, or an ML engineer and a data scientist or data analyst or whatever, it doesn’t matter, UX, UI—that will become a really interesting skill set for me. And I always have believed that it has big potential. But in data, I think, it increasingly has bigger and bigger potential. And people should make that choice in their career: am I going to be the most brilliant ML engineer, or am I going to learn enough about these different roles such that I’m going to be a fantastic manager and enable those 3, 4, 10, whatever, people to work so well together that the sum of their parts is far exceeded by the whole; you know, that classic trope.

So, I think that’s something I really am interested to see develop. And when I flipped on the organization, I’ve not really flipped on my obsession, ‘the technology will not save us’—I might have to come back to you because I think I must have changed my mind on more things and forgotten or did the usual human thing and pretended I always believed that other way. [laugh].

Brian: Right, right. “I don’t remember that I didn’t think that in the past.” [laugh]. Well, if someone wanted to challenge you and find out, where would they go to ask you about all your faults and the things that you were wrong about?

Drew: You mean group therapy?

Brian: Yeah, exactly.

Drew: Well, there’s lots of places to find me. The best place to find me is through my company email address, which is dsmith@iiaanalytics.com. You can find some blogs at iiaanalytics.com; you can find me on LinkedIn as well, especially if you’re connected to Brian.

Brian and I are connected and often in the same thread. So, lots of places to find me. And I love a good challenge. I think one of the things I most love about interacting with you and a lot of our folks in the space is that people are not shy of opinions; at the same time, they’re not too arrogant to be unwilling to listen. And that’s a really enjoyable environment for me.

Brian: Drew, it’s been great chatting with you, and thanks for the work that you’re doing, and I’ll definitely link all that stuff up. Any last words before we call it a round?

Drew: No, it’s been fantastic. I really enjoy following everything you’re doing and I look forward to following more.

Brian: Great. Sounds good. Well, I’ll definitely—like I said, I’ll link that stuff up. iiaanalytics.com is International Institute for Analytics work. Check out the ALC that Drew runs, Analytics Leadership Consortium—not too many TLAs on this show. [laugh]. And with that, Drew, thank you again for coming, and we’ll see you on LinkedIn if not at a conference, maybe in real life someday. [laugh].

Drew: Ohhh. Looking forward to it.

Brian: All right.

Drew: Bye-bye, Brian.

Brian: Take care.
