145 – Data Product Success: Adopting a Customer-Centric Approach With Malcolm Hawker, Head of Data Management at Profisee

Wait, I’m talking to a head of data management at a tech company? Why!? Well, today I'm joined by Malcolm Hawker to get his perspective on data products and what he’s seeing out in the wild as Head of Data Management at Profisee. Why Malcolm? Malcolm was a head of product in prior roles, and for several years, I’ve enjoyed Malcolm’s musings on LinkedIn about the value of a product-oriented approach to ML and analytics. We had a chance to meet at CDOIQ in 2023 as well, and he went on my “need to do an episode” list!

According to Malcolm, empathy is the secret to addressing key UX questions that ensure adoption and business value. He also emphasizes the need for data experts to develop business skills so that they're seen as equals by their customers. During our chat, Malcolm stresses the benefits of a product- and customer-centric approach to data products and what data professionals can learn from approaching problem-solving with a product orientation.

Highlights/ Skip to:

  • Malcolm’s definition of a data product (2:10)
  • Understanding your customers’ needs is the first step toward quantifying the benefits of your data product (6:34)
  • How product makers can gain access to users to build more successful products (11:36)
  • Answering the UX question to get past the adoption stage and provide business value (16:03)
  • Data experts must develop business expertise if they want to be seen as equals by potential customers (20:07)
  • What people really mean by “data culture” (23:02)
  • Malcolm’s data product journey and his changing perspective (32:05)
  • Using empathy to provide a better UX in design and data (39:24)
  • Avoiding the death of data science by becoming more product-driven (46:23)
  • Where the majority of data professionals currently land on their view of product management for data products (48:15)

Quotes from Today’s Episode

  • “My definition of a data product is something that is built by a data and analytics team that solves a specific customer problem that the customer would otherwise be willing to pay for. That’s it.” – Malcolm Hawker (3:42)
  • “You need to observe how your customer uses data to make better decisions, optimize a business process, or to mitigate business risk. You need to know how your customers operate at a very, very intimate level, arguably, as well as they know how their business processes operate.” – Malcolm Hawker (7:36)
  • “So, be a problem solver. Be collaborative. Be somebody who is eager to help make your customers’ lives easier. You hear "no" when people think that you’re a burden. You start to hear more “yeses” when people think that you are actually invested in helping make their lives easier.” – Malcolm Hawker (12:42)
  • “We [data professionals] put data on a pedestal. We develop this mindset that the data matters more—as much or maybe even more than the business processes, and that is not true. We would not exist if it were not for the business. Hard stop.” – Malcolm Hawker (17:07)
  • “I hate to say it, I think a lot of this data stuff should kind of feel invisible in that way, too. It’s like this invisible ally that you’re not thinking about the dashboard; you just access the information as part of your natural workflow when you need insights on making a decision, or a status check that you’re on track with whatever your goal was. You’re not really going out of mode.” – Brian O’Neill (24:59)
  • “But you know, data people are basically librarians. We want to put things into classifications that are logical and work forwards and backwards, right? And in the product world, sometimes they just don’t, where you can have something be a product and be a material to a subsequent product.” – Malcolm Hawker (37:57)
  • “So, the broader point here is just more of a mindset shift. And you know, maybe these things aren’t necessarily a bad thing, but how do we become a little more product- and customer-driven so that we avoid situations where everybody thinks what we’re doing is a time waster?” – Malcolm Hawker (48:00)



Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today I have Malcolm Hawker on the line from Profisee. What’s going on, man?

Malcolm: Glad to be here, Brian. How are you?

Brian: Good. I’m doing good. We finally met—I mean, we met on the interwebs. I don’t know when that was. On LinkedIn, probably, and then in the flesh, I think it was CDOIQ last year, 2023.

Malcolm: That sounds right. Yeah, that’s right. Boston. Last year—

Brian: Exactly.

Malcolm: —2023. In the Hyatt. Correct.

Brian: That’s right. I was actually just in the Hyatt two days ago, playing a gig.

Malcolm: It’s a small world.

Brian: I was like, “This is weird.” I talked to Malcolm in this room where I’m playing drums now [laugh].

Malcolm: You were playing drums in the Hyatt?

Brian: I was playing drums for a wedding. Yeah, I was playing a gig. And—

Malcolm: Oh, that’s cool.

Brian: And I didn’t even know. It’s just like, “Oh, where’s the gig?” And all this, and I go in the room, it’s like, “Wait a second. This is where CDOIQ was [laugh].” Anyhow, totally irrelevant to the podcast. But let’s talk about data products a little bit. We’ve definitely, like, gone back and forth, and I think we have a lot of alignment on—which almost was like, “Huh. What am I going to talk to Malcolm about because I don’t want this just to be like, a hoorah session where we agree about everything [laugh].”

Malcolm: Well—

Brian: So, when I was doing my questions, I’m like, “All right, how do I get some disagreement here, so I could dig into Malcolm’s head and see if there’s something in there that I don’t know that we can share with listeners?” But I wanted to broadly talk about data products, and particularly this idea of applying product thinking and design thinking to the work of building data-driven solutions. Is that even what your definition of data products is? Do you want to maybe start with that?

Malcolm: Oh, gosh. Well, so I have a definition of a data product.

Brian: Oh.

Malcolm: We can start with that, and then we can go what—we can figure out what we can, kind of, drill out of my cranium to create some excitement here today. Because I think you and I do broadly agree on a lot of things related to data products, but I’ll tell you, Brian, my learning here has been a bit of a journey. Because as an ex-product owner, as a chief product officer, I ran a product organization for a software company. I have built and managed product teams, I’ve had product P&Ls, I’ve hired product managers, I’ve hired analysts, I’ve hired pricing people, I’ve hired UX/UI designers. So, I’m a product guy, right, and when I hear the words ‘product,’ and ‘data product,’ and ‘product thinking,’ my mind immediately goes to that world, right? Like product management. “Oh, this world I know. That must mean they’re thinking these things.”

And when I hear somebody say, “Data product,” I just naturally assume that they mean things like UI, UX, like, they mean things like go-to-market, they mean things like lifecycle management, they mean all of those things. And it turns out, that’s not always what they mean [laugh].

Brian: [laugh].

Malcolm: But it took me about a year to figure all of this out, and having conversations with people who are kind of data product people, having conversations with, like, data mesh people, having conversations with data fabric people, and having conversations with all these folks to understand, “Ah, okay. When you say data product, you mean something completely different than what I mean.” But anyway, my definition of a data product is something that is built by a data and analytics team that solves a specific customer problem that the customer would otherwise be willing to pay for. That’s it. You’re building something, you’re solving a problem, and the customer would otherwise be willing to pay for it. That’s it. But that definition confounds a lot of data people for a lot of different reasons we can discuss more.

Brian: We definitely already share a lot. I have a notion of it’s slightly different around paying for it, but the fact that—I use this phrase that there has to be an exchange of value—

Malcolm: Yes.

Brian: Which could be giving up the old way of doing things. I’m willing to sacrifice my reputation and change my tool set or something because maybe I’m not literally, like, swiping a card to use it, but the spirit of exchange of value has to happen. Like, I will give up my old thing to use this new thing that you’ve made for me. I am totally on board with that. Can you maybe unpack your thinking around that piece of it, this exchange of value, this otherwise willingness to pay for it?

Malcolm: I will tell you, I came to that conclusion through comments you made on a post I had made previously about data products—

Brian: Oh [laugh].

Malcolm: —and you were adamant about the concept of value exchange. And I ruminated on it a while, and I said, “You know what? He’s absolutely right.” Because products for product’s sake, if there’s no exchange of value, if there’s no concept of value, then what are we doing, right? It just doesn’t matter.

And I think that maybe I had kind of fallen into the trap that so many data people fall into, which is the idea that value is incalculable. That it’s, like, some sort of Gordian Knot that cannot be unraveled, that value is this thing that sounds good in concept, but you just can’t compute because, you know, the value of data is second or third degree, it’s all indirect, there’s no direct correlation between data and revenue, and all these things. Which is complete baloney because we all know that improving our data will improve decision-making, we all know that improving our data will speed our business processes, we all know that improving our data will help transform our organization. So, we all know those things. It passes the smell test. But then the minute you start getting into, “Okay well, let’s quantify it,” data people start to have an antibody response, and, “Oh well, we can’t do that. That’s impossible.” Like, this Herculean task that is simply beyond human science, which I just completely and totally disagree with.

Brian: I hear the same thing, and my general feeling is, like, you’re getting lost in the idea of precision and not measurement. And it’s a question of, like, “Well, did we save the company $10 this year?” “Oh, of course, we saved them $10.” “Did we save $100 this year?” “What are you talking about, Brian? Of course, we did.” “What about $1 billion?” “No, no, no.” “So, it’s between $100 and a billion dollars.” And gradually, you could start to get some estimates of value creation this way [laugh].

I’m curious your take, though, like, because I do know that a lot of people adopting this are struggling with this idea of, like, how do we know? Like, ultimately, you know, the marketing person’s going to change their behavior with this dashboard that has all this stuff we built in it to provide insights, but like, we don’t really know exactly when they take an action, and they’re not tracking that, so how do we know whether or not it really helped the ad teams build campaigns more efficiently, and not waste time experimenting on—spending the advertising budget correctly?

Malcolm: Yeah.

Brian: How do you think as a product person, or what’s your take on how to help measure these kinds of initiatives?

Malcolm: [sigh]. Wow. So, answering that question, I think could be easily an hour unto itself.

Brian: [laugh].

Malcolm: But as a product person, the first thing you need to do is deeply understand your customer needs. And in our case, the customer needs are to make more money, save money, or mitigate risk. It’s simple. It’s those things. So, you need to observe how your customer uses your product to do those things.

You need to observe how your customer uses data to make better decisions, or to optimize a business process, or to mitigate business risk. You need to know how your customers operate at a very, very intimate level, arguably, as well as they know how their business processes operate. So, starting with that as a core assumption, if you don’t know how your customers are really using your products, if you’re just publishing a bunch of reports to a marketplace in the, kind of, data product Field of Dreams in the hopes that they will come and use the stuff, and you’re not even tracking anything like usage. Or maybe all you’re tracking is usage, which is kind of a data mesh approach that says usage is a proxy for value. It’s not a great proxy, but at least it’s better than nothing.

But step one would be understanding your customer needs, understanding their problems, understanding how they’re actually using your data. That’s step one, to starting to figure out how to measure for value. We can get into a lot of different deep discussions about how to do this, but inherently, it’s a modeling exercise—and you actually touched on this—which is oddly difficult for a lot of people, particularly in the data management space, to kind of get their head around because what you described is a probabilistic band. It’s somewhere between 70 and 90%. It’s somewhere between this range and this range, and we can’t be one hundred percent precise, but we can get pretty close, right? It’s a modeling exercise. And that is the world of AI. That’s the world that we’re moving towards.

The world that a lot of data management people seem to want to live in, and exist in, and make decisions through, is this deterministic yes or no. Is it good quality or is it bad quality? Can you measure it or can you not measure it? And if you can’t measure it with a hundred percent precision, then what are we trying to do? But a lot of people seem to look at the world through that type of lens, through this deterministic lens.

And I think there’s broader discussions to be had here about why people are attracted to this kind of all or nothing, left or right, red or blue, yes or no type view of the world, but that’s not the world we live in. It’s just not the world we live in. And I think data and analytics people need to be very careful in how they position this inability to measure value because basically what you’re saying is you cannot measure the value that you bring to the organization. It is literally that personal. If you are out there banging a drum that says it’s impossible to measure value, you are, in essence, saying I cannot value my own role, and I think that is a very, very shaky foundation to stand on.

Brian: Yeah. Well, I’m going to rewind a little bit here. You talked about really understanding your customer needs and how they’re using your product, so let’s go even earlier, which is, we don’t have a product yet. I personally think probably the number one tool that teams could leverage that’s basically free, is routine customer exposure time—

Malcolm: Yes.

Brian: —like, actually spending time with the people you’re building stuff for, not writing code, doing modeling, touching data, or anything, but simply shadowing them, observing when they—you know, if it’s accountants—probably do stuff on some kind of repeated basis, so getting a chance to spend time with them as they go through whatever their process is. That’s how you build better stuff because you get to develop empathy, and to know what it’s like to be Steve, not like a persona of the accounting department, but actually Steve in accounting, or whoever, a real person. But teams don’t—either what I hear is, “I can’t get access to those people because there’s gatekeepers, and they’re telling me ‘no, I don’t have time to give you access to Steve because Steve’s running the numbers,’ or ‘Jane’s doing this. Jane doesn’t have time to talk to you all. I know what they need.’”

So, either there’s no time to do it, or there’s gatekeepers in the way of it. And the org just gets in its own way. Like, we don’t need these data people going out and doing that. Like, you’re a service team; provide me service. Like [laugh], do you hear the same thing? And do you have any suggestions, as someone in product, like, you have to get access to users to build products successfully. What do you suggest there on that if teams are struggling with that?

Malcolm: I would hear this all the time as a product leader. My product managers would come back and say, “I can’t get anybody to meet with me to define the requirements,” right? “I can’t get anybody to invest the time to help walk me through their business processes. Nobody’s going to let me stand over their shoulder to, you know, let me watch them interact with our proposed interface.” I used to hear that all the time, and what I would say back [laugh] to my team members is, “Try again. And then try again. And be persistent.”

But the best way that you can go about that is to come with a proposed solution. Come with something. Come with a proposed solution to a problem and say, “Do you think that—hey, listen, I know you’re really, really busy, but if we change this to do something different, do you think that could make a real difference to you?” Or, “Do you think if we added this field to this page, do you think that would make a real difference to you?” Or, “If we changed our reports, do you think that would make your job easier?” So, be a problem solver. Be collaborative. Be somebody who is eager to help make your customers’ lives easier.

You hear “no” when people think that you’re a burden. You start to hear more “yeses” when people think that you are actually invested in helping make their lives easier. And I know that sounds kind of nuance-y and touchy-feely, but it’s the truth. And I’ve been a product manager, I’ve had people tell me, “Go away, leave me alone,” where I’ve had to go back and do some mock-ups, do some—throw together a fake dashboard in Excel, do whatever it is, you need to put something in front of somebody that says, “Hey, if we did this, do you think it would make things a lot easier?” Then you can get people engaged. Then they’re either going to say, “Yeah, that looks awesome,” or, “No, this is the dumbest thing ever,” but at least they’re engaged.

Brian: Yeah. I like that. And how you go about making someone else’s life better is a good way to frame this, right, because your interest is not in your thing, but rather in their well-being. And it’s the positioning of how you approach it. I think that’s a wise approach.

Although I would say, what I hear in the data product space—and this is—I’m kind of okay with this—I think we’re on a path of change, so even the fact that we’re talking about applying product orientation to building machine learning and analytical data products, that’s a positive thing. Yep, we want business value. And most of the time I hear that teams are taking data people, putting them into a product role, and they then have to figure out, well, what does that mean, exactly? Apparently, this is a better way to do it, so I’m supposed to get these skills, and then do my old job, but in this new way, called product. So, this person is now effectively go-to-market. They’re the UX person, they’re the requirements person or whatever.

Hopefully, they’re responsible for outcomes and benefits, which to me is really what a product person is about. It’s not about the thing, but it’s about the benefits the thing is supposed to deliver. But what’s missing a lot of the time for me is there’s no sense of your user experience outcomes, which is, yes, we want to save money, mitigate risk, or make money, but to do that, if there are humans in the loop, then we have people’s lives and feelings and all that stuff that you talked about. How do we make Steve’s life better in accounting, such that he will then unlock the value of the data in his reporting or whatever he’s doing in accounting, which I don’t know. But—

Malcolm: [laugh]. Accounting things.

Brian: —if we get to—accounting stuff, right—but if we can make Steve’s life better, then the business will unlock the value of the data initiative. To me, that’s missing because most of the time I can’t—when I ask someone, they cannot explain to me what a positive outcome for Steve is with this initiative. They can’t say w—it’s super laborious for Steve to do X, Y, and Z. Steve hates this, as do his colleagues. We’ve asked them about this. They can’t stand how much time they spend, like, aggregating the data. Before they do any accounting stuff, they spend a week putting together numbers just to get to the point where they get to do accounting. They would love to do accounting earlier [laugh] because that’s what they were hired to do, is accounting [laugh].

This is not the conversation that I hear. What I hear is, “We want to save money in the accounting department,” right? As if it’s just a faceless—like, there’s not real humans in there; it’s just this thing called the accounting department that doesn’t have people in it. Maybe I’m the touchy-feely one. I don’t know, what’s your take on this? Like, to me, the user experience question is still missing, and I don’t understand how you get the business value if you haven’t gotten past adoption. And if adoption has humans in it, like, don’t you have to solve that first [laugh]?

Malcolm: Well yeah, but this is a product, for lack of a better word, of a siloing of the data and analytics function away from the business, and we need to break that. We need to break it in a lot of different ways. One is, we need to humanize our customers again. They’re not stakeholders. They’re not the business. They’re our customers, and they have names.

They’re Steve, they’re Joe, they’re Amanda, they’re Susan, they’re whoever, right? And my job—me, Malcolm’s job—is to make Steve’s job easier. I exist to serve Steve. That mindset—and this is what you’re saying, Brian; I’m paraphrasing—is largely non-existent, and I would agree with you. It is largely non-existent because we convinced ourselves in the data and analytics world that it’s data first. We put data on a pedestal. We develop this mindset that the data matters more, as much or maybe even more than the business processes, and that is not true. We would not exist if it were not for the business. Hard stop.

Brian: Yeah.

Malcolm: And we could have an interesting conversation about, well, you need data as fuel, it’s the gasoline to the tank. Maybe, but the business would figure it out without you, right? The business would figure it out without a dedicated data analytics function, and the idea that data is equal or more important to the business is wrong. And that’s the mindset we need to break. Where we need to go to is a service-based, customer-based mindset where we exist to serve.

And if we can get to that point—that we exist to serve—if we can get to that point, then we can start humanizing our customers again instead of just calling them ‘the business.’ When I was an analyst, I would hear the artifacts of this mindset over and over and over. I would hear things like, “They just don’t get it.” I’d hear things like, “You can lead a horse to water, but you just can’t make them drink.” I’d hear things like, “They just don’t care about data quality. They don’t care about how important the data is.” I’d hear it over and over and over again.

And to me, this was all evidence that, wow, we’ve lost our way. We don’t exist to serve. We exist to make data better. And news flash, nobody cares. Nobody cares because they are executing contracts, they are delivering finished goods, they’re making products, they’re doing R&D, they’re doing whatever they need to do in order to get their jobs done. And the way that we break all of this, and the way that we make the pivot is we become a service organization, and we find a way within ourselves to pivot our mindset away from data being the most important thing to the business being the most important thing.

Brian: I would largely agree with that. I think I’m not totally surprised by it because I think sometimes it feels like the data is the truth, and this truth will correct all those wrong decisions that ye humans have been making. And not that I know everything, but I know how to put the data together for you so that it can guide you and make these things. It’s like, the truth is actually over here, and you’ve been doing it wrong. Now, there are situations where it’s a threat to bring in facts and information because we already have a worldview about this. This is, like, well understood. It’s like the facts don’t convince you. It’s like, you start with the opinion, then you look for facts to back it up. I mean, I was just listening to the Nudge podcast about this—

Malcolm: Yeah.

Brian: And these are well understood things. I always wonder how those study—I’m tangenting here—

Malcolm: Confirmation bias. You’re saying confirmation bias.

Brian: Correct. And how much of that exists in business contexts where it’s like, well, the risk isn’t exactly on you because you’re an employee here. You’re not spending your money. You’re spending someone else’s money. I’m always kind of curious how much those biases exist in a business context, where you’re making decisions, not for yourself. But I do—I think some of that probably still exists, and we have to understand that.

Bringing in facts and insights might challenge somebody in a way that they’re not ready to be challenged, so like, the way we deliver that, the experience of the product—or not even just the product experience, but is this solution solving a problem that they have? Because if Steve doesn’t think he needs help with doing the accounting a better way, he’s probably not going to use it, even if the facts would support changing it. If Steve doesn’t feel like that’s a problem, nor does Steve’s management, they’re not going to use it. It doesn’t matter that you have this pure way that would be better. I don’t know. That’s my take.

Malcolm: No, no. So, we need to press on this because I think this is kind of the crux of a lot of our biggest challenges in the data and analytics space, which is this idea that through the data, we know better. And maybe sometimes we do, but the only way you’re ever going to get into a conversation where Steve would be open to that is if we develop the business expertise. If we are seen as equals from a business perspective—and I don’t mean that I need to all of a sudden go back to accounting school; by the way, I failed accounting, but I do have a business degree—it doesn’t mean I need to go back to accounting school, but it does mean that I at least need to know how Steve operates, that I’m there, that I’m over the shoulder, that I’m viewed as a consultant, that I’m viewed as a strategic business partner, that I’m viewed as a consigliere to Steve’s business. If I’m viewed that way, and I present an alternative way of working, an alternative way of thinking or approaching accounting, Steve’s probably going to go, “Oh, man. Yeah, I didn’t even see this before. I didn’t consider this.”

But if we go in finger-wavy, and say, “Hey look, right, look at all the things, look at the facts that you’re ignoring,” Steve’s going to say, “What do you know about my business? You don’t know anything about my business.” Right? Like, “You’re like the software salesman that shows up once every 365 days to get a renewal.”

This idea of, like, data-driven, I use that phrase all the time because it’s pithy and people get it, and there’s a certain amount of nobility, at least for data people, around the idea that we’re creating something that has value and worth and that we would prefer to make decisions with data. And that’s a noble pursuit, and I get it. But at the same time, I’ve never met a business person that isn’t using data day-in and day-out. And even intuition is a function of data, right? It’s a function of lived experiences that may not be on a spreadsheet, but after 30 years of business, I can tell you, I’ve got a lot of learned experiences that aren’t worth zero. They’re worth something.

And sometimes maybe I need to make a quick decision, sometimes maybe the dashboard doesn’t exist, sometimes I need to respond pretty quickly, and all intuition is not evil. Can we get data in the hands of people when they need it, in the form they need it, as quickly as they need it, all the time? Of course not. But what I see, Brian, is not a situation where I’m providing you data and you’re willfully ignoring it. That’s not what I’m seeing.

When I hear people complain about a lack of a data culture—and I would hear about it all the time; all the time—this is borne out by the data. Gartner does a CDO survey every year, and when they ask, “What are your biggest barriers to success, CDOs?” they will say, “We don’t have a data culture.” Whatever the hell that means. But what they’ll say is, “We don’t have a data culture.” And when I had conversations and have conversations with CDOs and I press, often what you’ll hear is, “Well, we make dashboards, but nobody uses them, so we don’t have a data culture. We don’t have people who care about data quality, right? We get data quality issues, so clearly, they don’t care about data, so clearly, we lack a data culture.”

“I’ve asked our business stakeholders to attend our weekly governance meetings, and they don’t go, so clearly, we lack a data culture.” And to me, as a product person, when I hear those things, my first reaction is, “Oh, wow, they don’t want to buy what you’re selling.” [laugh]. They don’t want to buy what you’re selling. And what you’re saying is, you’re indicting the entire organization, when in reality, maybe you make a poor product.

Brian: I have the whole thing with the data literacy… [camp 00:24:09] and all of this. It’s not to say that all of us, except maybe the people listening to the show, couldn’t get a little bit better at data stuff, and understanding how to use numbers, and that’s all great, but if the answer to your problem is that the problem is everybody else [laugh], that’s my issue with data literacy. It’s like, “Well, if only everyone else would get training on this thing so that they can understand what we’re doing, then all this value would be unlocked.” And so, I’m totally with you there. They’re not supposed to be thinking about data governance, or master data management, or—

Malcolm: Exactly.

Brian: That’s not what you want your accountants thinking about when they’re doing their work. That’s supposed to be, like, oxygen. Like, Slack just works in the company. We don’t think about, like, oh, is it secure right now, and oh, is it working? Is it fast? Like, we take this stuff for granted. And I hate to say it, I think a lot of this data stuff should kind of feel invisible in that way, too. It’s like this invisible ally that you’re not thinking about the dashboard; you just access the information as part of your natural workflow when you need insights on making a decision, or a status check that you’re on track with whatever your goal was.

You’re not really going out of mode. It’s like now I’m talking, I’m thinking about the data team, and I’m thinking about—like, no, like, you’re taking that person out of their flow state and their job, whatever it is that they’re doing, the customer, you don’t want to be removing—you want to actually, you want to keep them in their flow state as much as possible. I’ve got—I guess, I’m soap-boxing here a little bit. Apologies, but [laugh]—

Malcolm: No, no, no. And this is a box I will happily stand on with you. When it comes to data literacy, my rancor here has tempered slightly because about six months ago, maybe even a year ago, I was pretty adamant, and I was out there—the only voice that I know of that was out there—saying, my exact words were, “The concept of data literacy is toxic.” I was using that word: toxic.

And my perspectives have tempered a little bit. I’ve become friends with Jordan Morrow, who is a wonderful human being. He’s called—well, I don’t know if he quotes himself or somebody quoted him—“The Godfather of data literacy.” He wrote the book Be Data Literate. He’s a wonderful human being. He’s trying hard, and he’s trying to advance the data and analytics function.

But that aside, if you read Jordan’s book, as I have multiple times, the underlying premise of data literacy is that companies are not getting as much value as they could from data and analytics because the people consuming them lack skill. There’s a skills gap—the exact words in Jordan’s book are that there’s a skills gap. When he used to work for a company called Qlik, he commissioned a survey, and the survey found that there was this skills gap, and if we close the skills gap—ah-ha, eureka—angels will sing, and companies will see the value in data. Again, as a product person, when I think of the idea of training—and that’s all it is, sorry, folks; it’s training—[laugh] okay, when I think about training, I think of training as a go-to-market function, right?

I think of training as the thing that facilitates and enables, but it’s not the be-all and end-all. It’s one motion in a product management process. It’s one motion. It’s an important motion, don’t get me wrong. We need training where it’s appropriate, but it’s just one motion, right? I go back to—ah-ha—the things you were talking about: let’s go back to user-centric design, right? Let’s go back to requirements definition. Let’s go back to problem-solving. Let’s go back to value. Like, go back to those things.

If you checked all those boxes—if you feel like you honestly, truly have checked all those boxes, like, you know your customer’s business, you have a UX person, you have a pricing person, a value person; I call it a value engineer—if you’ve done the work, if you understand the requirements, if you’ve built a tool that is easy and intuitive, it solves the problem, and you still need a little bit of training, hey, great. Knock yourself out. But I used to talk to CDOs all day long who would have all those problems—people not using the dashboards, people who don’t see the value in data, people who have gone and bought their own tool, whatever it is, right, all of the symptoms of what many of us call the lack of a data culture—and I’d have those conversations, and I’d ask, “Well, what are you doing?” Answer: “We’re doing data literacy.” And then I would just facepalm, like, “Oh, my goodness, why don’t we focus on some of the other things as well that we know can help?” Because product people like me are like, okay, having a more intuitive, easier-to-use product that is solving a need—that sounds like a good thing to aspire to.

Brian: Amen. I mean, even when you’re talking about training as a normal function of products, for me, as a designer, I cringe because I’m like, if we need training on this thing, we still haven’t done a good enough job. Now, I think there are exceptions to that, where some training might be needed because it’s a paradigm shift in how things used to be done, and people understand the value of it, but they need an introduction to it. But I think, like change management, training—the more you’re relying on that, the more you haven’t done the work up front—

Malcolm: Bingo.

Brian: —to simplify it so you don’t need it—so you don’t want to rely on that, because training is usually a tax. Unless the learner is heavily invested themselves in the training, you don’t want it done to them. Like, education is not done to somebody. And that—[laugh] like, “We’re all going to now take data literacy with Jordan Morrow.” And again, if there are people, teams, that want to take that, and they are invested in getting better at it, amen. And it’s good to know that we have people who can provide that kind of training. But if the people learning are not invested, and Steve in accounting is being told he now needs to go take a data literacy course that he doesn’t want to take, he feels like, “I already know what I’m doing. I’ve been doing this for 25 years, and now I have to take this course to make their life better. Like, I don’t give a [laugh] shit about—”

Malcolm: Right. Like, name a mandatory corporate training that you’ve ever been excited about.

Brian: Right. Or that you can remember right now.

Malcolm: Well, right. But beyond that—other than maybe HR, where there are legal and regulatory requirements around a safe workplace and other things you are legally required to train your employees on—name another department that requires training on the methods and processes used to do its work: accounting, or procurement, or recruiting for HR. Like, “Some of the job postings you write are just really impossible to act on, so we’re going to make you go through some mandatory training on how to better write job postings,” right? That’s just, like, one example, right? Or, “We’ve heard a lot of people complaining about the types of office supplies that we’ve been purchasing on your behalf, and you don’t really understand that these are the top-tier office supplies. These are the best of the best, so you’re going to go through office supply training.”

Brian: [laugh].

Malcolm: Like, at the risk of sounding a little glib, I mean, this is kind of what we’re saying, right? Like, we’re going to force you to understand how the sausage is made, so you can better understand the value from the sausage.

Brian: Yeah.

Malcolm: And this gets back to your point, which I completely agree with: the need to train could arguably be viewed as a symptom of poor design.

Brian: It absolutely is. The whole field of user experience design largely came out of the people who used to write the training manuals. They realized, man, if we had just gotten involved earlier, we wouldn’t need to write all this documentation trying to explain how to do these convoluted things. Gradually, those people started to move upstream, closer to the beginning of the product, instead of sitting at the end of the production line explaining how to use it. And that was one of the births of this whole human factors and user experience perspective.

Malcolm: Yeah. How many people got trained on this thing, right?

Brian: You just held up an iPhone, since we’re audio only.

Malcolm: Oh, sorry. Yeah, yeah, yeah, yeah. Sorry.

Brian: But yeah—no, that’s okay.

Malcolm: But I use that as an example in a lot of the presentations I give related to data products. And it may be an extreme example, but I think an appropriate one: there are toddlers that use iPads without any training [laugh].

Brian: Right. I have one of those, and I can attest to what you’re saying [laugh].

Malcolm: [laugh]. Well, there you go. But I think I mentioned early on that I’ve been on a bit of a journey—a data product journey—and I know where I started. If you look at data products on a left-to-right spectrum, I started on the extreme right: data products as a finished thing that delivers value, solves a specific problem, and is easy to use. I started way, way, way over on the right-hand side of that spectrum. What I’ve come to learn, though, is that at least through the lens of how people think about data products and define them—at least in our world—there most certainly is a spectrum. There’s all the way right, which is where I am.

And then there’s also all the way left, which is a world that is increasingly known to me because I’ve been learning about this. There’s a whole group of people out there who think a data product is, in essence, what I would call a raw material. A data product is something that—you could arguably use the phrase ‘shift left’—sits closer to the applications, but it’s this thing that is discoverable, and its whole purpose is to enable the scalability of a data management function or an analytics function. Its purpose is to deliver scalability. It is a raw material that can be used across a number of different use cases and a number of different platforms. It can be used for analytics, it can be used for ML, it can be used in an API, it can be used in an operational use case—it can be used for anything—but its whole purpose is to enable the scalability of a data management function.

To me, that’s not a data product. It’s literally a raw material. But a lot of people think of data products that way, as this kind of input to something that would eventually become a product—but it’s an input, and it is kind of the lowest atomic chunk of something that could be consumed by downstream systems or processes. So, a lot of people think that way.

Brian: Oh, absolutely. Full agreement [laugh].

Malcolm: Yeah. I was in London last week, and I did a presentation with a guy named Piethein Strengholt, who wrote the book Data Management at Scale—it’s an O’Reilly book, a very successful book—and when he describes data products, that’s kind of where they are: over to the left. Data mesh certainly sees data products over towards the left, although it bleeds a little bit more to the right because there is, kind of, a domain-centricity there. I mean, there is the concept of meeting customer needs, and there is the concept of some form of value, so it’s a little more right than left. But there’s a whole bunch of data engineers out there who see data products as something I would never consider a product.

Brian: I completely agree. I think data mesh, with the language it used, has helped create that environment. Which is okay—there are different ways—and I think you can apply it—I was just—Omar Khawaja commented on my—he sent me a Slack message. He’s like, “I just disagreed with you on LinkedIn.” I’m like, “Yay.”

Malcolm: I met him this week in London for the first time.

Brian: Yeah, good guy. And we were talking about internal data platforms. And my point is, well, that’s not a product. A platform is something on which you then develop data products. It’s an enabling function, but it’s not a product itself—nor are the things on the platform, such as, here’s a customer data bundle that has good governance, it’s secure, it has the right privacy stuff in it, it has SLAs, it has all this stuff that’s been refined into it.

And it’s like this little orb sitting on a shelf, and you can take the orb off and connect it to your thing that’s downstream. My problem with that is, you’re at least one hop from the person who’s going to use it. Again, this assumes that a human-in-the-loop has to do something before costs are saved, money is made, or risk is mitigated. There has to be some usage, so you’re still a hop away.

And you can still apply good product thinking and even design to how we build the platform and all of that stuff. I think that’s okay. But I’m trying to champion this idea—especially with machine learning, and AI, and all the potential that unlocks—that the people in charge of building those kinds of solutions need to be connected to the business and to the users of these things, and we need to be thinking more like product people, which means benefits creation has to be the primary thing. That is your job. So, if you want to put a product title on your hat, or a shirt, or whatever you’re wearing, benefits creation has to be there—and benefits meaning at the very end of the chain [laugh], where the rubber meets the road, where the human in the loop is going to finally interface with the dashboard, application, whatever the thing is. That’s where it matters. You work backwards from that point.

That to me is the essence of the product thing. Everything before that, it’s like great, you practiced some good stuff. But that orb sitting on the shelf is just a cost. It’s still a cost right now because it’s not been taken off the shelf and put into something that then created value. It’s just an orb. It’s a cost. There’s a bunch of money stuck in that thing right now that we want to be able to multiply somehow [laugh].

Malcolm: Yeah, and all those things, those orbs—I like that—

Brian: [laugh].

Malcolm: —taking that approach is good, but there’s not a lot of product managers who are deeply ingrained into their supply chain. They’re giving requirements to the supply chain, but somebody else in the supply chain is actually doing it—you know, procuring the goods, creating the bill of materials that has the list of all the orbs, right?

Brian: Right.

Malcolm: The bill of materials becomes the raw inputs to the product. Now, are there product managers who kind of care about that, who influence that? Most certainly. If I’m making an iPhone, right, I want the best components possible for it, I want the best orbs possible for it. And if the orbs break, and our return rates are higher, or customers are unhappy, well, then we’ve got a problem, and we need to fix it. But those materials are not products; they’re materials.

But you know, a lot of us data people are basically librarians: we want to put things into classifications that are logical and work forwards and backwards, right [laugh]? And in the product world, sometimes they just don’t—you can have something be a product and also be a material to a subsequent product. But the point you made is the key differentiator: it is value delivery to an end customer. That’s it. And if that’s not happening, then it’s not a product; it’s material.

But it’s hard for data people to get their heads around that because it doesn’t fit this kind of classic ontology—like the Dewey Decimal System, or a SIC code, or a NAICS code. It doesn’t fit into those types of paradigms, and for a lot of data people, the idea that one thing can be two things is hard because it breaks models. What I just said breaks a data model: one thing is actually two things.

Brian: [laugh].

Malcolm: Then a data person would say, “Well, it can’t be. It has to be two things, then.” Right? And that’s how data people think. They think in models, and they think in structures, and they think in hierarchies, and when one thing could actually be two things—this phone that I’m holding up could subsequently be a material in some other finished product, maybe a robot using the phone for its eyes, for example—that idea that one thing could be two things at once kind of breaks the logic for a lot of people. Breaks their brains.

Brian: [laugh]. Well, if anyone’s still listening to this episode.

Malcolm: Of course they are, man. This is electric.

Brian: [laugh]. I will say, like, coming from the design perspective—and I’ve joked about this before—so much of the conversations that I have, and just the things I hear about in the wild in the data community, you could literally just erase the word data and insert design, and people would be complaining about the same stuff, which is like, “The organization doesn’t understand design. That’s why—they don’t get why we need to go do this stuff. They’re design illiterate.” You could just literally fill in the blanks with this kind of stuff. “They don’t see the world through the same lens.”

And I think the same onus is on designers, which is, we need to stop talking about design because most people don’t care about it. I don’t talk about it too much on this podcast. I tried to use other language to talk about it, about benefits creation, and product and all this, and humans-in-the-loop, and all these things that we can understand without talking about that domain because no one’s interested. The data people and the customers, they don’t care about design. For the most part, they don’t care about that kind of stuff. It sounds cool, sounds hand wavy, sounds creative, but like, how is that going to help me, Steve in accounting—going back to Steve, right [laugh]?

My point is, when you talked about ontologies and thinking about things and structures and all of that, I think we all bring our bias about the world into the stuff, and I guess, one of the most useful things that I learned when I really worked with the best designers and the best product people was, like, this real idea of empathy, which is just, like, taking off your own hat and being able to see the world through someone else’s perspective about why they think the way they think, and just not bringing your own sand to the beach. Like, you got to leave that stuff parked at home, and just try to see it through their perspective. And being able to take that off made me such a better designer and a consultant as well because I’m not trying to push my agenda on them; I’m just trying to see it through their perspective, and then use my skills where appropriate.

And I’m hoping data people can leverage this as well because I don’t think most people think this way. And if you’re analytical, it’s hard to imagine that they couldn’t because you can—[laugh] you can, kind of like, compute why it makes more sense to think this way. Like [laugh], it’s hard not to think that way, you know? I can realize that. Not being one of those people, I can understand why data people think of the world this way because I’ve talked to so many of them and listened to them, and I try to develop their perspective, even if it’s not mine, you know, [laugh]?

Malcolm: The good news is that we can all think this way. And going back to what I said probably 30 minutes ago now, I think the key to unlocking this is a mindset shift. And I had said, kind of, taking more of a service-oriented perspective, I think that is huge. I think you said empathy. Couldn’t agree more. Another thing we need more of in the data world is a belief that our customers have positive intentions. People aren’t out there to make your job harder.

Brian: Right.

Malcolm: Right. They’re not out there to make your job harder. They’re out there to make everybody’s job easier. They’re just trying to keep up with their SLA, or their contractual commitment, or whatever they need to do in order to do their jobs day-in and day-out. That’s what people are faced with.

There’s other things that we could be doing better as well, just from a mindset perspective, right? And I would argue in many ways, when it comes to certain things, we have an overly negative mindset. I’ll give you an example. There’s a lot of people out there that are talking about—they will say, “Oh, well, 80% of machine learning models never make it to production.” For me, as a product person, I would look at that and say, “Oh, wow, that’s a lot of iterating.” Right?

That’s a lot of spaghetti being thrown at the fridge, and maybe that’s actually a good thing, because a lot of data science is inherently a creative function, and you’ve got to throw a lot of spaghetti at the fridge in order for some of it to actually stick and drive incredible, transformative value. But when it comes to data science, that’s kind of the nature of the beast, right? And that’s not a bad thing. That’s a good thing. That’s suggesting a lot of rapid iteration, that’s suggesting agility, that’s suggesting a willingness to try and fail, right, to get to the absolute best model that best solves the problem. But there are people out there saying that as a negative thing.

Like, again, getting back to this idea of highly deterministic thinking: oh, well, 80% of data scientists’ time is wasted on data quality issues. Well, hold on. Wait a minute—is that a bad thing? You certainly seem to be suggesting that it is. When you talk to data scientists, what they’ll tell you is, “Yeah, it’s a sucky part of the job”—“wrangling,” I’m air quoting, right—transforming the data, getting it into a shape where it can be consumed by a Spark job or by some sort of ML model, getting the data into some standardized format. That’s just part of the drudgery of the job.

And maybe that could be done by a data engineer, maybe it can be done by business analysts, maybe it can be done by AI—I don’t know—but we position things as negative when, in the case of, you know, machine learning models not actually going to production, maybe that’s a good thing. Or data scientists having to do a lot of data wrangling—well, maybe that’s just part of the job. So, there’s a mindset shift here that I think we desperately need. And the key to unlocking all of it, I think, is to be a little more customer-centric and, arguably, product-centric.

Brian: The one thing I would disagree with on that is: okay, so 80% of the models fail, or fail to get into production, right?

Malcolm: Yeah.

Brian: Well, did they not get into production because there was something inherently wrong with the model, or was the model highly predictive but other things got in the way? And secondly, did you take five swings at the same problem and only one of them got out? Or did you take five swings at five different problems, and there’s more thrashing? So, to me, with the innovation thing, it’s okay to try stuff and not have it work, but there is some science to how we do that, especially if you’re hitting the same kind of problem—like the user adoption piece, which is: we keep building these models, and they’re great, but nobody wants to use them. And you find out, well, maybe the fact that they’re not interpretable is the problem.

I can’t trust this because my job is on the line. If I do this thing wrong using that thing you made—and I know you’re telling me it’s really accurate, but the 7% of the time it’s not, where I can’t explain it, I’m in trouble, for regulatory reasons or whatever the reason is—I’m not putting my ass on the line and taking that risk, because I don’t know how you came up with that number. And that’s a different kind of problem than something being wrong with the data science function. So, I think it depends on what the problem is.

But I would agree with you: it shouldn’t be a hundred percent. It’s good to hear we’re trying stuff that’s not working, but are we trying multiple attempts and refinements at the same thing? Or are we kind of thrashing—moving from project, to project, to project—like, “I gave them what they asked for, and then I move on to the next one, and I don’t know what they did with that thing. It’s in GitHub, I put it where it was supposed to go, and then I moved on.” It’s different. Those are two different—

Malcolm: I totally agree, but if you get back to my original assertion, which is, let’s take more of a customer and product-centric focus here, if you inherently understand the customer problem, if you inherently understand the value that you’re trying to drive, I think a lot of the issues that you highlighted would naturally just dissolve away. Meaning, we want to avoid just building models as pure science experiments, right? Like we want to—this is a [dupe 00:46:49], by the way. This is big data. We are describing the demise of big data, which is there was this amazing technology, and all of a sudden, we turned a bunch of data scientists loose to build out these models that found causal relationships between two things that nobody cared about, right?

It’s like, “Hey, did we happen to know that left-handed people tend to like something a little bit more than”—you’re like, “No, wait a minute. Hold on. Nobody cares. Nobody’s asking that question.” So, a lot of time was wasted in a lot of early iterations of what we could call big data, and we’re seeing the exact same thing now with data science.

There are a lot of CDOs who are really concerned about data scientists going off to the lab and having fun—for lack of a better word, fun. But they’re doing these experiments in a vacuum; they’re not tied to customer needs, they’re not tied to value, they’re not solving specific problems, they’re just building models for models’ sake. That’s what we need to avoid. That’s the 80 pieces of spaghetti. To your point, maybe it could go down to 60? Or 50? Or 30? Maybe. I don’t know.

My point is that these things are inherently iterative and inherently experimental, and you need to run multiple models in order to get to the point where you’re confident that one is relatively predictive of the real world. So, the broader point here is just more of a mindset shift. And, you know, maybe these things aren’t necessarily a bad thing, but how do we become a little more product- and customer-driven so that we avoid situations where everybody thinks what we’re doing is a time-waster?

Brian: Yeah. Some closing thoughts here. I want to give you the last word, but do you think this, I’ll call it, our perspective on this, as we tended to agree on a lot of this here, is this still a heavy minority viewpoint?

Malcolm: Absolutely.

Brian: Is any of this changing? Or is it mostly, like, there are just camps and people are in the camps? I mean, I don’t really care about my viewpoint. It’s more that I just want your work to matter, which I think—when your work ships and has impact, it starts to feel better. That’s good for the business, it’s good for Steve in accounting, and it’s good for you, the maker of the things, because now your things matter. I think most people want that. I don’t know.

And to me, these product- and design-driven ways of doing it help all three of those touchpoints. But I don’t know if that’s changing at all. I mean, we have the Data Product Leadership Community, and there’s evidence to me that there are people out there doing this. You are not alone, if you’re listening to this show—

Malcolm: No, you aren’t.

Brian: I’m kind of curious about your take—you’re out a lot with Profisee and stuff, I know you go to a lot of conferences, and you probably have more exposure than I do. I’m just kind of curious: is this changing at all?

Malcolm: It’s changing, but it’s slow because it’s relatively unknown, and it—‘it’ meaning product management—is a unique competency, frankly, that most data people just don’t have, and a lot of them wrongly assume it’s something you can train your way into. Can you train to become a great product manager? Yeah, but there are a lot of really great product managers already out there. Can you train to be a product leader? Yeah, but there are a lot of great product leaders who are already out there.

So, I think we’ve got a lot of work ahead. I’m hearing positive things. I was just in London last week, and I had a lot of people walk up to me and say, “Hey, I appreciate a lot of the things you say related to data products. You’ve got some fresh perspectives here. They seem to make sense to me, and we are investigating hiring a value engineer. We’re investigating hiring a product manager. We are interested to learn more. We want to learn more. We see the promise here.” But I think we’re in very, very early days, and people like you and me need to keep the pressure on. Because there’s goodness here. I know there’s goodness here. The same-old, same-old is not moving the needle, and we need to move the needle.

Brian: Yeah. Amen to that. Where can people talk to you, follow up with you? Also, what the heck is Profisee, for people that don’t know? What is that?

Malcolm: [laugh].

Brian: I know we didn’t really talk about that, but I wanted to give you a chance to say something about what Profisee is because it’s not really a data product thing. It’s a little—it’s kind of ancillary to your—or orthogonal to your work there, but just to give you the last word on that.

Malcolm: It is and it isn’t. So, Profisee does Master Data Management, MDM. So, Profisee solves the single version of the truth problem. If you’ve got data silos around your organization with Joe Smith, Joseph Smith, J.S. Smith, J.N. Smith, and you don’t know if that’s one person or four people, that’s the problem that Profisee solves.
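[Editor’s note: the record-matching problem Malcolm describes—deciding whether name variants refer to one person or several—can be sketched with naive string similarity using Python’s standard-library difflib. This is purely illustrative; real MDM tools like Profisee use far richer matching than string comparison, and the names and threshold here are hypothetical.]

```python
# Illustrative sketch of the "single version of the truth" matching problem:
# are these four records one person or four?
from difflib import SequenceMatcher

records = ["Joe Smith", "Joseph Smith", "J.S. Smith", "J.N. Smith"]

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; higher means the two strings look more alike."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Compare every pair and flag likely duplicates above an arbitrary cutoff.
THRESHOLD = 0.7
for i, a in enumerate(records):
    for b in records[i + 1:]:
        score = similarity(a, b)
        if score >= THRESHOLD:
            print(f"possible match: {a!r} ~ {b!r} ({score:.2f})")
```

Note that “J.S. Smith” and “J.N. Smith” also score highly under naive similarity even though they may be different people—exactly why dedicated master data management goes well beyond string comparison.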

And when we solve that problem, arguably what we create is something we’re calling a Consumable Master Data Product. It’s governed, it’s discoverable, it’s curated, it’s accurate, it’s trustworthy, and it’s something that can drive value for your organization. So, that’s what Profisee does. You can find me on LinkedIn. There are only three Malcolm Hawkers on the planet—I’m one of them—if you spell my first name correctly. But if you search for Hawker, H-A-W-K-E-R, and you’re in the data world, chances are pretty good I’ll come up high on the list.

I would love to connect with you. I’m sharing what I know every day on LinkedIn. I’m a rabid poster. I’m sharing information on YouTube, and I’m sharing it through a podcast, CDO Matters—just like this amazing podcast I’m on today. So, if there’s another podcast you’d want to add to your subscription list: CDO Matters. But I would love to connect with you on LinkedIn. That’s the best way to reach me.

Brian: If you’re a rabid poster, does that mean you have rabies?

Malcolm: No. No, it’s a metaphor.

Brian: Oh okay.

Malcolm: You know, it’s my passion, my fervor.

Brian: So, they won’t get rabies if they follow you, is what you’re saying.

Malcolm: Yeah. Well, no—and if they follow me, I promise you, I’m not in the business of monetizing your community. I’m in the business of helping provide insights and sharing best practices and what I’ve learned over 30 years in business, so—

Brian: Yeah, yeah. No, I enjoy your posts, and I think for this audience that’s listening, if you’re looking for more perspectives on this from someone who has actually been a CPO, a Chief Product Officer, and done this, and is now in a data strategy role—I think you can connect those two worlds together, and maybe use language that… I can’t. So, go check out Malcolm Hawker [laugh] on LinkedIn.

Malcolm: Thank you. Much appreciated, sir.

Brian: Well, it’s great to talk to you, Malcolm, and maybe I’ll see you this summer. Are you going to CDOIQ this summer?

Malcolm: I’ll see you in Boston.

Brian: Oh, cool.

Malcolm: Yep, I’ll be there.

Brian: I’m giving a talk there, so I’m looking forward to catching up, and let’s grab a drink or something, too.

Malcolm: Sounds great. Thanks so much, Brian.

Brian: Take care.
