121 – How Sainsbury’s Head of Data Products for Analytics and ML Designs for User Adoption with Peter Everill

Experiencing Data with Brian O'Neill (Designing for Analytics)

Today I’m chatting with Peter Everill, who is the Head of Data Products for Analytics and ML at the UK grocery brand, Sainsbury’s. Peter is also a founding member of the Data Product Leadership Community. Peter shares insights on why his team spends so much time conducting discovery work with users, and how that leads to higher adoption and, in turn, business value. Peter also gives us his in-depth definition of a data product, including the three components of a data product and the four types of data products he’s encountered. He also shares the 8-step product management methodology that his team uses to develop data products that truly deliver value to end users, as well as the #1 resource he would invest in right now to make things better for his team and their work.

Highlights/ Skip to:

  • I introduce Peter, who I met through the Data Product Leadership Community (00:37)
  • What the data team structure at Sainsbury’s looks like and how Peter wound up working there (01:54)
  • Peter shares the 8-step product management methodology that has been developed by his team and where in that process he spends most of his time (04:54)
  • How involved the users are in Peter’s process when it comes to developing data products (06:13)
  • How Peter was able to ensure that enough time is taken on discovery throughout the design process (10:03)
  • Who on Peter’s team is doing the core user research for product development (14:52)
  • Peter shares the three things that he feels make data product teams successful (17:09)
  • How Peter defines a data product, including the three components of a data product and the four types of data products (18:34)
  • Peter and I discuss the importance of spending time in discovery (24:25)
  • Peter explains why he measures reach and impact as metrics of success when looking at implementation (26:18)
  • How Peter solves for the gap when handing off a product to the end users to implement and adopt (29:20)
  • How Peter hires for data product management roles and what he looks for in a candidate (33:31)
  • Peter talks about what roles or skills he’d be looking for if he was to add a new person to his team (37:26)

Quotes from Today’s Episode

  • “I’m a big believer that the majority of analytics in its simplest form is improving business processes and decisions. A big part of our discovery work is that we align to business areas, business divisions, or business processes, and we spend time in that discovery space actually mapping the business process. What is the goal of this process? Ultimately, how does it support the P&L?” Peter Everill (12:29)

  • “There’s three things that are successful for any organization that will make this work and make it stick. The first is defining what you mean by a data product. The second is the role of a data product manager in the organization and really being clear what it is that they do and what they don’t do. … And the third thing is their methodology, from discovery through to delivery. The more work you put upfront defining those and getting everyone trained and clear on that, I think the quicker you’ll get to an organization that’s really clear about what it’s delivering, how it delivers, and who does what.” – Peter Everill (17:31)

  • “The important way that data and analytics can help an organization firstly is, understanding how that organization is performing. And essentially, performance is how well processes and decisions within the organization are being executed, and the impact that has on the P&L.” – Peter Everill (20:24)

  • “The great majority of organizations don’t allocate that percentage [20-25%] of time to discovery; they are jumping straight into solution. And also, this is where organizations typically then actually just migrate what already exists from, maybe, a legacy service into a shiny new cloud platform, which might be good from a defensive data strategy point of view, but doesn’t offer net new value—apart from the speed, security, et cetera, of the cloud. Ultimately, this is why analytics organizations aren’t generally delivering value to organizations.” – Peter Everill (25:37)

  • “The only time that value is delivered, is from a user taking action. So, the two metrics that we really focus on with all four data products [are] reach [and impact].” – Peter Everill (27:44)

  • “In terms of benefits realization, that is owned by the business unit. Because ultimately, you’re asking them to take the action. And if they do, it’s their part of the P&L that’s improving because they own the business, they own the performance. So, you really need to get them engaged on the release, and for them to have the superusers, the champions of the product, and be driving voice of the release just as much as the product team.” – Peter Everill (30:30)

  • On hiring DPMs: “Are [candidates] showing the aptitude, do they understand what the role is, rather than the experience? I think data and analytics and machine learning product management is a relatively new role. You can’t go on LinkedIn necessarily, and be exhausted with a number of candidates that have got years and years of data and analytics product management.” – Peter Everill (36:40)

Transcript

Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today I have Peter Everill on the line from Sainsbury’s, out of London in the UK, yes?

Peter: Yes, that’s right, Brian.

Brian: Excellent.

Peter: It’s Friday afternoon. The weather’s great. Looking forward to spending an hour with you talking about data product management.

Brian: So, great means gray skies but not raining.

Peter: [laugh].

Brian: For London, is that considered great weather [laugh]?

Peter: It is great weather, so this is fantastic weather.

Brian: Oh, okay. Nice, excellent. You’re Head of Data Products for Analytics and ML. You’re doing real data product work over there, and you’ve been doing this for a while. We met through the Data Product Leadership Community, which is going to be launching. I’m really happy to have you as one of those founding members in this as well.

And you had said that you’ve been doing this work for a while and just through our talks and stuff I thought you probably have some nuggets to share from your experience. So, just for people who don’t know what Sainsbury’s is, maybe you can quickly explain what Sainsbury’s the brand is and start there, and then why are there data products involved? What are you doing? What does that mean? Tell us a little bit about your background.

Peter: Sainsbury’s, for those who don’t know, is a supermarket in the UK. It’s one of the largest supermarkets. Think of Walmart. Yes, I’ve been working at Sainsbury’s for almost six years now in the data product division, building data products.

Brian: Did you emerge into that role? Or you walked into a role that was called data products? I’m kind of curious, did these things grow organically or there was a leadership in place that very specifically chose that language because it means something? Can you tell us a little bit about that history there?

Peter: I’ve been doing data product management for almost a decade now, but it’s only since working at Sainsbury’s that I’ve actually had the data product manager title. Before that, it was working in consultancy, where I went to organizations and was effectively building data products and platforms as a consultant. I got tired of the consultancy lifestyle but absolutely loved the work that we were doing. At that time in the UK, around 2017, 2018, is when large organizations in the UK were starting to build their own internal data and analytics, sort of, center of excellence. And so, I thought that was a great opportunity to carry on doing the work that I loved, but internally for a company rather than being a permanent visitor.

So, at Sainsbury’s—and actually, it was quite early on in my time there—the leadership were ahead of the game, really, in terms of bringing product management thinking to a data and analytics organization. And it’s really where my skill set lies, in product management, so it was an absolutely natural fit for the role. We’ve been growing that role and that competency for the last five years now.

Brian: Are there multiple people that share your role there or are you kind of the primary one? Or, tell me a little bit about the makeup and maybe your management. Is there an overall data product leader above you as well, or how does that look?

Peter: We work [at scale 00:03:29] at Sainsbury’s, possibly a unique situation, due to the scale that we work at and how much we have to sort of divide and conquer in the data world. So, we have a chief data product officer working in the organization, and then, within the—

Brian: That’s one title—sorry to interrupt—that’s one title?

Peter: One title, yeah.

Brian: Chief data and product—wow, I’ve never heard those two things together. That’s fascinating.

Peter: And then we divide, essentially, the data organization into two. So, first of all is data and building out the data platform and the architecture, and effectively the data architecture within it. And the second part is analytics. And obviously those work very closely together, data being a key input into all the analytics that the organization does. I sit within the analytics organization as the head of data product for analytics, responsible for the use cases of analytics.

So, more of an offensive data strategy: what are the problems that the business has got, what are the use cases that will drive financial performance for the business, and defining those with the stakeholders, defining the analytical product, and then working with my colleagues, the respective heads of data products in the data part of the organization, for them to build the data platform that drives those analytical products.

Brian: Got it. They’re almost kind of like thinking of them as almost like the engineering or enablement team for you to be able to deliver on the front-end or the human-facing aspect of whatever the analytics delivery will be, dashboards or software application or whatever. Is that more or less correct?

Peter: Absolutely. And we’ve got an end-to-end product management methodology that’s eight steps, which takes us from problem statement—business problem, user problem statement—through analytical product discovery, into data product discovery, through data build, through analytical build, and release, adoption, and impact into the business.

Brian: Where do you spend most of your time in the eight steps [laugh]?

Peter: So, the first two steps—three steps if we count zero—zero, one, and two are the discovery steps. I would say a great part of my role is in zero to two: the product discovery, understanding, educating where analytics can add value and solve problems, and shaping that into clear analytical products that are designed. And then at the end of the assembly line, as we call it, at steps seven and eight: the analytical build, validating that that’s been built, release and adoption to the users, and then circling back around [through all 00:06:09] iteratively to build on top of that.

Brian: If you’re really interfacing heavily with the end-users of the solutions and the stakeholders—and I’m assuming frequently that’s the same person; I don’t know if that always is, necessarily—but how much are they involved with those eight steps? Are they kind of just in the zero to two and then maybe a little bit at the end before it goes live, or are they being dragged along the entire time through this process? To some degree; obviously, they’re not watching people sit and write Python code, but can you talk to me a little bit about how you involve them?

Peter: Yeah, absolutely. Big believer that we’re building products for users. Our users are very aware of their problems, so we really engage with them around what problems they have in doing their job and their role in the organization, but they’re not paid, and they’re not experts, in how their problems can be shaped into analytical products. So, that’s where I spend a great deal of time validating the user stories, the design, and making sure that what we build is valuable to them, but also viable; it’s something they’ll use if we build it.

And then we don’t involve them in the build process, certainly not for the data build process, but the power that we’ve got [unintelligible 00:07:20] our eight-steps methodology is that they know where things are in that lifecycle, so it’s not disappearing into a black hole for them. When we get into analytical builds, post the data build, we do lots of user acceptance testing with them, with key stakeholders of that group, and then we will do a lot of work with them around the change, the comms, the training, the release management.

Brian: You mentioned the word design in here. So, are you designing at the same time, where you’re doing—where you say, “Analytical build,” is this also where the design of the solution is occurring or did that happen earlier in the process such that it’s straight execution work by the time it gets to analytical build?

Peter: In the design stages, steps one and two, we’re looking at both the analytical design and the data design. The analytical design is looking at the problem that it’s going to solve and its appropriateness to solve that problem. We’re looking at the consumption design of the product, and testing that if we build it, someone’s going to use it. So, we do elements of, obviously, ideation workshops, design workshops, incorporating the user stories out of that. But we also like to do, as part of our discovery process, POCs and prototypes.

And so, a POC might be a one-off: provide a sample, for a limited time, on a subgroup of the problem, and we get them to test that and see if it’s something that answers their problem. The next stage might be to do a prototype. So, for a limited time, on a limited scale, in a not-productionized way, again, to validate that if we were to go to full production and build, it’s something that they would use and would get value from, because the production part is the expensive part. So, that’s what we sort of do in our iterative design process. Then when we’re really comfortable that we’ve validated that, then we go into the build process.

Brian: All that stuff has happened in pre-data build and pre-analytics builds, so it’s kind of in that zero to three stage?

Peter: Yeah.

Brian: So, you don’t jump into implementation, it sounds like, too early [laugh].

Peter: No, no. Lots of war wounds from jumping into implementation and release.

Brian: [laugh].

Peter: So, yeah. Which is a really difficult mindset for the organization—and any organization—because specifically in the Agile world, it’s really seen as: build something, release it as soon as possible. And people naturally want to shave time off that discovery time. We’ve just learnt that, actually, the discovery time is the most valuable part because that bit up front can save you so much time later on, so much rework, and so much cost, that it’s really an incredible investment, actually.

Brian: How did you get to the place where you were able to put those brakes on and kind of enforce your process? I don’t think everybody in this space maybe has that luxury, or there’s just this idea of speed over everything, and if you’re delivering something, you can get feedback on it. But as you know, most of the time, it’s a lie. No one wants to throw out the first version; they want to keep adding to it. So really, we’re doing additive design, we’re just adding more stuff to it. We don’t ever want to throw out what didn’t work.

Peter: Yeah.

Brian: You just want to keep tweaking it, you know? And a lot of times, if it’s fundamentally just off, that’s a hard thing to swallow and the—it’s just the sunk cost bias kicks in and no one wants to own that thing that didn’t work. Was it always in the culture or was this something that had to change? “Just hold on, we’ll get to the engineering. We’re going to talk about implementation, but first, we don’t even know what good looks like yet. We have no [laugh] idea how we’ll know if we did a good job yet.” How did you get to that place?

Peter: I would say we’re still getting to that place, Brian. It’s a continual education, adoption of the methods that we want to use, and our methodology. And I think it started several years ago in terms of developing the methodology and this eight-step, and then, like I say, we’re really lucky in that we’ve got quite a product culture within the organization, anyway. We’ve got big digital and web culture because we have a lot of sales through various websites and apps that we’ve got as an organization. So, there was that starting product element in the organization.

And then we have a collective set of product managers that help because they think in this mindset. We really spent a lot of time developing this methodology. We’re actually on our second iteration of it. Data products is a multidisciplinary team sport, including the stakeholders, the end-users, and the business people, as well as architecture, data governance, all flavors of analytics, data engineering, and data architecture. So, having a common methodology across a 300-person organization in the data world, and then, obviously, across the wider business organization, is really important so that our ways of working operate as a really effective team sport.

Brian: Who does your discovery, the facilitation with users, the design work? Do you have designers doing that? Does that fall on the product people to, kind of de facto, be your designers for what’s going to later be implemented? Tell me a little bit about that, how that works.

Peter: I’m a big believer that the majority of analytics in its simplest form is improving business processes and decisions. So, analytics is about improving processes and the decisions within those processes. So, a big part of our discovery work is that we align to business areas, business divisions, or business processes, and we spend time in that discovery space actually mapping the business process. What is the goal of this process? Ultimately, how does it support the P&L?

So, what do they do in this business process that either directly impacts the customer journey, or supports the customer journey in delivering the P&L? A lot of organizations don’t have documented customer journeys and business processes, and that’s really a starting point for us because we want to understand how this organization works, the decisions that are taken. And that’s when you get into [what’s 00:13:25] mapping then the users of the process or the people in the process, and their roles and responsibilities. We have our process maps, and then we’re layering on to that two things: the metrics for that process, so how do we really understand that process is performing towards a goal? What are the metrics of success that measure each stage of that process or each stage of that customer journey, how it’s performing, and what are all the different ways that process or customer journey could go wrong, and the decisions in that customer journey or business process that are made?

So, we’ve got our metrics to understand processes of the customer journey, and we’ve got an idea of the people in that. So, what are the roles? What actions do they take? Which is really important because if we’re designing analytical products for end-users, we want to know what the end-users are doing in that process, what action they can take with it with any analytics, and how that may or may not help their job. So, we do all that and we largely do that as a product team, working with end-users, workshops, obtaining their knowledge of what they do, and how the processes work, and mapping it and documenting it all into a methodology that we’ve got. And that’s really our starting point to really understand that domain. From there, we can start to identify the problems and what we can potentially do in terms of analytics and analytical products to help support that.

Brian: Who exactly on your team does that work? This is really core user research, journey mapping, these are bread and butter type user experience work. Is this work that you do or do you have a team that does this? Who are the people that go out and learn, how do we buy oats, barley, and bulk grains? And how much do we pay? And when do we buy it? And which stores need it? And maybe someone’s job is to go buy the grains for the entire—every branch? I don’t know. How do you go learn about that? Who goes out and does that?

Peter: Yeah, so I have a team of ten senior product managers. Between us, we divide and conquer the organization. They do that work with business stakeholders and end-users, in a way where we have to be really considerate that we optimize their use of time and aren’t asking for a lot of it. So, trying to do it in a really efficient way where possible. Ahead of workshops, we can document what we know of the process as a starting framework, and then use the workshops to really add the detail on the missing gaps, to really understand: this is the process, but what’s the user’s role in the process and what do people do?

Brian: So, it sounds like the people with product management titles are primarily the ones doing this kind of research work.

Peter: Absolutely. I’d love to have some researchers and designers in the team, but you know, maybe for a future org design.

Brian: I’ll give you a frank example here that I’ve heard before. “Peter, we don’t have time for that. I need a dashboard and I need my oats KPIs on it, and my barley KPIs. I already told you what I needed. We don’t have time for another meeting to talk about what we need. I just need this because Christmas is coming and everyone’s ready to bake cakes. I need my grains and I need to know how much to pay.” How do you navigate that conversation?

Peter: We do both in practical reality, Brian, so we are an organization that has to function and has to deliver quarterly results, half-year results, right? We continue to receive requests for support that are very much like that, and they form a part of our backlog; we execute on them. But increasingly, through educating our stakeholders on how we work (we’ve classified the data products that we build, and we do a lot of communication about what they are and how they can help), we try and do the two things together and say, “Okay, we’ll give you this, which is what you want right now, but how about we engage and actually rethink what we do in your area for you?”

Brian: You’re still evolving through this process it sounds like, but is there anything you’d do differently if you walked in the door today as a new hire? “I’m not going to go that route again. That took way too long.” Or, “Too many bumps.” Any kind of reflections on the journey to date, advice you might give to another team, you know, another grocery chain somewhere else trying to adopt this product mindset?

Peter: What a question. Um—

Brian: [laugh].

Peter: —so there’s three things that are successful for any organization, I think, that will make this work and make it stick. The first is defining what you mean by a data product. Like any business, if you want to sell a product, you’ve got to define what that product is and be able to market it. First thing is defining what you mean by data products. The second thing is the role of a data product manager in the organization and really being clear what it is that they do and what they don’t do versus business stakeholders, the analytics teams, the data scientists, and the engineers and architecture. That would be the second thing.

And then the third thing is that delivery methodology from discovery right through to delivery. This is the way in which we work. And I think the more work you put upfront defining those and being really clear on them and getting everyone trained and educated on that, I think the quicker you’ll get to an organization that’s really clear about what it’s delivering, how it delivers, and who does what.

Brian: So, you said data product definition of ‘it,’ the definition of the role, and the methodology of doing the work of data product management. Do you have a definition that you use for data product, just to help me ground this conversation? I know that’s sometimes a vague [laugh] thing, but is there a framing that you use that helps you when you explain this to somebody?

Peter: There’s four classified types of product that we build. It’s always good to start with this because I know there’s lots of views out there in the world. And a caveat that I don’t think my view is necessarily the right view. It’s a view, but through listening to other people on your podcasts, and certainly yourself, I think there’s a common view of this forming, in maybe different nuanced language, but the same thing. Before I describe the four types of product, let me describe the three components that sit in each product. The first component of a data product is consumption through a device and a user interface. So ultimately, you’re building something for an end-user to take an action. Slight nuance: that might be a machine or it might be human, but there’s certainly an end-user and they take an action.

Then the middle bit is the maths. So, that can be simple arithmetic and metric calculations right through to AI. And then the third component is data and technology. The data layer, data warehouse, data models, and obviously all the technology that supports that. A product isn’t a product for me until it has those three components. And those three components individually do not make a product.

Brian: Talk to me about this nuance of you said, the user could be a machine or a person. What was the nuance there?

Peter: The first product that we talk about is performance diagnostic. The important way that data and analytics can help an organization firstly is, understanding how that organization is performing. And essentially, performance is how well processes and decisions within the organization are being executed, and the impact that has on the P&L. So, that’s where we start from. And performance diagnostic is essentially where we go and understand the processes within the business, the end-to-end processes, and the individual processes, and basically build out performance understanding of that.

And we build out role-based dashboards and give them to consumers, typically via a desktop app or a mobile app, of how the organization is performing. I think the key thing that we do in that is design from end-to-end processes. So, we’re not thinking about performance in silos, so we can truly understand where the biggest opportunity for performance improvement is across the entire process or journey, but then we break that down into a role-based view. The trouble with a lot of organizations is they don’t then design that view from a user perspective. The user experience in most organizations around understanding performance is that they’ve got a number of places to go, they have a huge number of reports to look through, and then it becomes really difficult to understand, okay, which report do I look in first to understand what I need to be doing and what the highest-priority issue is for me?

And that stops a lot of action being taken because working through all that variety of reports is daunting for the user. The design principle we’ve got is that any person in the organization has a single dashboard to understand performance, and that dashboard only contains the metrics and insights they need that are relevant for their job, i.e., the metrics that they can actually take an action on as part of their role. That’s the first product we build.

The second product we build is a bit of an extension on that. It’s intended to be less frequently used, as a sort of exception, but the second product is actionable alerts, otherwise known as notifications, right? And the value proposition with this product is that generally a CEO will want people in their organization spending as little time as possible looking at dashboards because that’s typically not what someone’s paid to do, right? We really want to major on that by implementing guardrails on metrics and basically understanding when performance deteriorates out of a defined tolerance, which then triggers alerts. And so, the user can spend more time getting on with their day-to-day job, knowing that if performance really deteriorates, they’ll be alerted to it; they don’t need to wait until they’ve checked the dashboard the next day or the next week.

And that alerting is really important because obviously, the sooner you can either predict a deterioration in performance, or detect that it has happened and alert someone to it, the sooner they can take action, and the sooner you can sort of prevent further damage to the performance of the business and ultimately the P&L.

Three and four—back to your question—are closely coupled. The third is automated recommendations—otherwise known as human-in-the-loop—where you’re using some maths and algorithm, ML/AI, to make a recommendation on what the decision should be, and therefore, the subsequent action. You serve that up to a human to consume, and they can either take the recommended action that the machine’s made, or they can apply their own judgment and defer to another action. And then obviously, the fourth, human-out-of-the-loop, is where the machine makes a decision. It doesn’t get served up to someone to consume and review; the action will automatically then get taken.

Brian: I love the focus on the humans-in-the-loop, understanding how people do their work such that you can serve them up the relevant stuff that they care about, right? Because people are self-interested. No one wants analytics. It takes a long time though to, like, figure that stuff out. I mean, you must spend a lot of time with stakeholders to understand, like, what’s it like to be a, I don’t know, sourcing person, or what’s it like to be the logistics manager for all the trucks that deliver the goods. Or whatever. I’m just making up roles in my head. The picture you’re painting to me says you spend a lot of time with the business.

Peter: Yeah, the majority of our time, right? And then interfacing into the data teams to align them to what we’re doing. But ultimately, Brian, this is a problem that analytics and data organizations have: they don’t allocate this time for discovery. We talk about the lifecycle of an end-to-end data analytics project or product, those stages from zero to eight, or whatever they might be in any organization, you’re really looking at 20 to 25% of the data organization’s time actually should be spent in that discovery. 20 to 25% in understanding the business, framing the right problem, understanding their needs.

And I would say the great majority of organizations don’t allocate that percentage of time to discovery; they are jumping straight into solution. And also, this is where organizations typically then actually just migrate what already exists from, maybe, a legacy service into a shiny new cloud platform, which might be good from a defensive data strategy point of view, but doesn’t offer net new value—apart from the speed, security, et cetera, of the cloud. And I guess ultimately, this is why analytics organizations aren’t generally delivering value to organizations, and that’s why we hear so much noise about that.

Brian: I hear so much about just these giant platform builds and I’m like, “You do realize that that’s just a cost?” To, like, a business person, all you’re doing is spending money and it’s just this relentless focus on the platform and it’s theoretically going to enable all this future goodness. That’s not a product. That is an enabling technology to make a product. You still have to make something that someone wants or is willing to use that could theoretically then generate value.

And there’s so much less talk about that, to me. I don’t—[laugh] I don’t know, maybe we’re just echoing each other here, but I do hear a lot of talk about platform implementation. That matters, right, but it is enabling. If you want to do it the product way, you need to be thinking about the value to everybody else, not the way the engine works under the hood. What’s the value we’re going to deliver to them that they care about, from their perspective? And most people don’t care about the plumbing, you know—

Peter: Yes.

Brian: —under the covers.

Peter: Yeah, absolutely. I mean, take those three components, right: the consumption device, where a user takes an action; the analytics and data science, the maths; and the data and technology. The maths and the data and technology, those two components are pure cost. Huge enablers, and actually responsible for the growth of data products, because they’re enabling us to build data products where previously they weren’t possible. So, hugely valuable, but they are just two cost lines. The only time value is delivered is when a user takes action.

Brian: Amen. [laugh].

Peter: So, the two metrics that we really focus on across all four data products are reach and impact. What’s the target reach of this product? For a human consuming it: how many people are we expecting to use this dashboard or this alert, and at what sort of frequency? And then post-release, we can monitor: is that reach happening? Are they looking at it the way we want?

What’s the impact, right? Reach can also be decisions: if we’re doing an algorithm, how many decisions get made, and therefore how many decisions do we expect the reach of our algorithm to be? And then the impact is the action. For algorithms, the link to impact should be absolutely clear: the algorithm is making a recommendation or taking an action that you can directly relate to the P&L.

With a performance diagnostic, it’s a bit more difficult because you’re relying on the user to take the action. But if you monetize your metrics of performance and make that really transparent and visible, you can benchmark, when you release your performance diagnostic, the monetized cost of the failure at the point of release. And if a lot of people use the dashboard and take action the way we expect them to, we should see the value of that failure dropping as people act on the insights and take the action we expect them to take.
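The reach and impact metrics Peter describes can be sketched in code. This is a minimal illustration, not Sainsbury’s actual tooling: all names (`AdoptionSnapshot`, `reach`, `impact`) and numbers are hypothetical, assuming a dashboard-style product whose diagnosed failure has been monetized as Peter describes.

```python
from dataclasses import dataclass

@dataclass
class AdoptionSnapshot:
    """Usage of a data product over one reporting period (hypothetical model)."""
    active_users: int    # users who actually opened the dashboard/alert this period
    target_users: int    # users we expected the product to reach
    failure_cost: float  # monetized cost of the diagnosed failure this period

def reach(snapshot: AdoptionSnapshot) -> float:
    """Fraction of the intended audience actually using the product."""
    return snapshot.active_users / snapshot.target_users

def impact(baseline: AdoptionSnapshot, current: AdoptionSnapshot) -> float:
    """Drop in the monetized failure cost since release; positive means value delivered."""
    return baseline.failure_cost - current.failure_cost

# Benchmark the failure at release, then track whether users acting on
# the insights are driving that cost down in later periods.
at_release = AdoptionSnapshot(active_users=12, target_users=80, failure_cost=250_000.0)
this_month = AdoptionSnapshot(active_users=64, target_users=80, failure_cost=190_000.0)

print(round(reach(this_month), 2))     # 0.8
print(impact(at_release, this_month))  # 60000.0
```

The design choice mirrors Peter’s point: reach alone only tells you the product is being looked at; only the monetized delta (impact) ties adoption back to the P&L.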

Brian: I want to relay a question that I got. I just did a talk with The Conference Board’s Data and Analytics Executive Council recently, about 20 leaders from mostly large enterprises, and someone mentioned that one of their biggest challenges to adoption is that they don’t own the final delivery and, I’ll call it, operationalization. You probably know I don’t like this word because it sounds like something we do after we make the thing, and it’s not part of the making of the thing, which is what I think it should be. Is your team in the loop end-to-end, and are you able to push through? When it’s, “We’re buying carrots a new way, not the old way. The team, including the carrot buyers, wants to do it this way, and here’s the tool set that’s going to help them make better buying decisions when it’s time to buy produce for the stores,” is getting that out into production hard for you, or are you in the loop the whole way? And how would you answer someone who says, “Our biggest challenge is we don’t own the final delivery, the training, the operationalization; it’s kind of up to someone else, and they can stop our work from effectively getting out there”?

Peter: You know, I could have talked about this at any point in the last ten years, and what I’m discussing is stuff learned over those ten years. Benefits realization is owned by the business unit and the divisional line, right? Because ultimately, you’re asking them to take the action. And if they do, it’s their part of the P&L that’s improving, because they own the business, they own the performance. So, you really need to get them engaged on the release, and to have the superusers, the champions of the product, driving the voice of the release just as much as the product team.

Because people are more likely to adopt a product if it’s been designed and created by one of their own. That’s a really important part of the release management at our step eight. But actually, during product discovery, one of our checks and balances is user engagement, and if we don’t feel that we’re building something that has that business-unit, that user, buy-in from the start, we won’t go past product discovery; we won’t go into build, because you’re setting yourself up for failure if you don’t have that user buy-in and adoption. Those are painful lessons learned in the past and something we’re keen not to repeat if we can help it.

Brian: Is there a signal or a measurement you’re looking for to know, “You know what, they’re not going to use this. They said they wanted it, but they’re actually not going to use it, and we’re going to hit the pause button now”? How do you know when that’s going to be the case? Is it like, “Well, I’m too busy; I can’t come to that meeting to talk about the new dashboard,” or whatever? Is that kind of what you’re listening for, which means, oh, I guess it’s not that important? How do you know that it needs to be paused?

Peter: Buy-in of people’s time, and the use of POCs and prototypes because we can measure adoption of POCs, we can measure adoption of prototypes. And if they’re not using it at that stage, then there’s nothing to say that they’re going to use it in production, so you need to stop.

Brian: First of all, there’s a political part of being a product manager, and relationship-building and stuff that goes into this. That’s kind of part of it, too. But the other part is this idea that you get the users to want it: you don’t design it for them, you design it with them, so that by the time you’re towards the end, they’re itching to get this thing out. They’re the ones who want to get it out; you’re just there to help it along. But ultimately, it’s their thing. It’s for them, and the champion has been created.

You don’t pull the black drape off in production and say, “Here it is. For the first time, you get to see this new thing,” and they’re like, “What’s that?” Is that kind of what you’re saying: you bring them along the entire time such that the final deployment is not a big deal because they’re already in it?

Peter: Yeah, absolutely. Mock-ups, POCs, prototypes: by the time you reveal the productionized version, they’re like, “Yeah, I mean, that’s what we asked for, like, months ago.” And even with algorithms, they’ve got to understand how the algorithm works to trust it. So, bringing them [in, making 00:33:17] upfront as part of the design process, making it really clear in a non-technical way how the algorithm works—you know, it does this, it takes this information, it considers these constraints—is hugely important for adoption.

Brian: If you’ve got ten now and it’s time to hire number eleven, another product manager to own some other chapter of the business that you work on, how do you screen for product management skills in the data space? What’s the blend of technical versus non-technical skill you’re looking for? Talk to me a little bit about how to hire for this. There’s this general feeling that I hear—if you’ve listened to the show with Kyle Winterbottom, for example—businesses know that they need non-technical skill, but most of the job descriptions are still, “Yes, but can you code Python, and can you build this and that, and what’s your knowledge of platforms?” And obviously, you do need to know enough about the technology to be effective as a PM, but there’s still a heavy skew towards technical implementation skills in these job descriptions. At least that’s what I’m hearing. How do you hire for a DPM? What are you looking for?

Peter: Yeah, I think it’s a real problem. It’s a problem I’ve personally felt in my career. I’ve worked in data and analytics for 17 years: a couple of years as an analyst, but mainly in consultancy and then product. So, even with the success of that, I probably would fail to get a job in most data and analytics organizations because I haven’t got strong R, Python, and SQL coding skills.

It’s been a really interesting experience hiring for the team because I’ve been very open-minded to both ways. So, can we get a product person—and I always talk about will and skill because [then 00:34:52] skill can be taught if someone has the will—who’s got the will to learn the technical domain area enough to survive and build great data analytics products? Or have we got someone from the technical world who is prepared, with the will, to learn the product side of it, take off their technical hat, and put a product hat on? We’ve done it both ways, successfully.

I’m a big believer that it’s more on the individual, and it can be done both ways. I think you’ve got to be really clear on the role. If you’re taking from both sides, but certainly from the technical side, you’ve got to be really clear on the role, and it’s something we discuss in the interview: we will give you the opportunity to carry on coding in your own time and keep your hand in, but you are not expected to code and you are not expected to solutionize. We really boil it down to the what and the why versus the how and the when. The what and the why is the product manager; the how and the when is the technical person. And you’ve got to be really comfortable that you’re focused on the what and the why. And yeah, prepared to do the discovery, right? Prepared to think in terms of business processes, who the users are, what their pain points are, and be comfortable that that’s where you’re going to spend the great majority of your time.

Brian: So, it sounds like you don’t necessarily screen candidates for product management experience. A data scientist who wants to move into that space and shows the will is, it sounds like, an equally good candidate. Do you favor someone who has some PM background from a software space or something?

Peter: A PM background is good. We’ve got our own sort of in-house development, and obviously there are plenty of external training courses you can develop that with. It’s more, are they showing the aptitude, do they understand what the role is, rather than necessarily the experience? I think data, analytics, and machine learning product management is a relatively new role. You can’t go on LinkedIn and find an exhaustive list of candidates with years and years of data and analytics product management experience.

And I’m starting to laugh now when I see job specs that say you’ve got to have eight years of data and analytics product management. And I’m like, “Well, officially the role has only existed for the last couple of years.” So—

Brian: Yeah. [laugh].

Peter: —yeah. So, I don’t know how they’ve got that. What we look at is their will to learn either side. The benefit of having a team of ten is you can people from either side, and they can co-skill each other on the different aspects.

Brian: If you’re going to hire a new person right now for a new skill set or role to add to your team to make you and your team’s life better, what would you bring in right now? What skill would you invest in?

Peter: Probably the designer aspect. That sort of user experience, UX, capability is something we’re exploring ourselves, but someone who’s got it and can bring a stronger understanding would help; that’s a skill we aren’t naturally strong in as a team.

Brian: This has been such a great conversation. I wanted to give you the floor for the last few minutes. Anything you want to share with our audience, or any questions I should have asked you that I didn’t?

Peter: My main challenge is, someone I was speaking to about this podcast said, “Can you make sure it’s practical, and there’s something in it that we can listen to, take away, and implement into what we’re doing as product managers, or quasi-product managers, in day-to-day work?” So, the real goal of what I was trying to do is discuss product management, but in a practical way. And so my ask of the audience is: if they listen to this and don’t feel I’ve given them something practical, respond, and give me the chance to come back and give it another go.

Brian: Okay [laugh]. That sounds good. Where should they send the hate mail? No, just kidding [laugh].

Peter: [laugh]. To you, Brian [unintelligible 00:38:44] to you.

Brian: Oh, okay. So, I get to filter it all. But actually, seriously, where can people connect? Is LinkedIn the best place, or website, Twitter? Where do you hang out?

Peter: Yeah, I’m just in the process of building out a website. But for the moment, yeah, if they can go to LinkedIn, that’d be great.

Brian: Great. Excellent. We’ll definitely link up your information there. Pete Everill, it’s been great to talk to you, from Sainsbury’s out of the UK. We’ll obviously be talking more in the near future, but thanks for coming on the show and talking about data product management with me.

Peter: Brilliant. Cheers, Brian. Thanks for having me.

