Today I’m chatting with Osian Jones, Head of Product for the Data Platform at Stuart. Osian describes how impact and ROI can be difficult metrics to measure in a data platform, and how the team at Stuart has sought to answer this challenge. He also reveals how user experience is intrinsically linked to adoption and the technical problems that data platforms seek to solve. Throughout our conversation, Osian shares a holistic overview of what it was like to design a data platform from scratch, the lessons he’s learned along the way, and the advice he’d give to other data product managers taking on similar projects.
Highlights / Skip to:
- Osian describes his role at Stuart (01:36)
- Brian and Osian explore the importance of creating an intentional user experience strategy (04:29)
- Osian explains how having a clear mission enables him to create parameters to measure product success (11:44)
- How Stuart developed the KPIs for their data platform (17:09)
- Osian gives his take on the pros and cons of how data departments are handled in regards to company oversight (21:23)
- Brian and Osian discuss how vital it is to listen to your end users rather than relying on analytics alone to measure adoption (26:50)
- Osian reveals how he and his team went about designing their platform (31:33)
- What Osian learned from building out the platform and what he would change if he had to tackle a data product like this all over again (36:34)
Quotes from Today’s Episode
- “Analytics has been treated very much as a technical problem, and very much so on the data platform side, which is more on the infrastructure and the tooling to enable analytics to take place. And so, viewing that purely as a technical problem left us at odds in a way, compared to [teams that had] a product leader, where the user was the focus [and] the user experience was very much driving a lot of what was roadmap.” — Osian Jones (03:15)
- “Whenever we get this question of what’s the impact? What’s the value? How does it impact our company top line? How does it impact our company OKRs? This is when we start to panic sometimes, as data platform leaders because that’s an answer that’s really challenging for us, simply because we are mostly enablers for analytics teams who are themselves enablers. It’s almost like there’s two different degrees away from the direct impact that your team can have.” — Osian Jones (12:45)
- “We have to start with a very clear mission. And our mission is to empower everyone to make the best data-driven decisions as fast as possible. And so, hidden within there, that’s a function of reducing time to insight, it’s also about maximizing trust and obviously minimizing costs.” — Osian Jones (13:48)
- “We can track [metrics like reliability, incidents, time to resolution, etc.], but also there is a perception aspect to that as well. We can’t underestimate the importance of listening to our users and qualitative data.” — Osian Jones (30:16)
- “These were questions that I felt that I naturally had to ask myself as a product manager. … Understanding who our users are, what they are trying to do with data and what is the current state of our data platform—so those were the three main things that I really wanted to get to the heart of, and connecting those three things together.” – Osian Jones (35:29)
- “The advice that I would give to anyone who is taking on the role of a leader of a data platform or a similar role is, you can easily get overwhelmed by just so many different use cases. And so, I would really encourage [leaders] to avoid that.” – Osian Jones (37:57)
- “Really look at your data platform from an end-user perspective and almost think of it as if you were to put the data platform on a supermarket shelf, what would that look like? And so, for each of the different components, how would you market that in a single one-liner in terms of what can this do for me?” – Osian Jones (39:22)
- Stuart: https://stuart.com/
- Article on IIA: https://iianalytics.com/community/blog/how-to-build-a-data-platform-as-a-product-a-retrospective
- Experiencing Data Episode 80 with Doug Hubbard: https://designingforanalytics.com/resources/episodes/080-how-to-measure-the-impact-of-data-productsand-anything-else-with-forecasting-and-measurement-expert-doug-hubbard/
- LinkedIn: https://www.linkedin.com/in/osianllwydjones/
- Medium: https://medium.com/@osianllwyd
Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today I have Osian Jones, the head of the data platform at Stuart. You have the word product in your title as well, which excited me. I don’t see this too much; it’s still kind of new.
I wanted to have you on to dissect this article that you had written for IIA, the International Institute for Analytics, about a retrospective that you had building your current data platform as a product, or thinking about it as a product: something that has humans in the loop, something that has a user experience to it. So yeah, welcome to the show. I’m excited to dig into this.
Osian: Thank you so much.
Brian: Yeah, yeah. First of all, what is Stuart? For those of us who don’t know, what is Stuart?
Osian: Sure. So, Stuart is a European tech logistics company that is responsible for the last-mile delivery across several European cities, mostly in the UK, France, and Spain.
Brian: And so, as the head of product for the data platform there, like, what does that mean exactly? Like, tell me what that means. We don’t always think of, like, the head of product for a data platform. I don’t think that’s a standard title. That’s not a—I think people are going to wonder, like, well, what exactly is your responsibility there? How is that different than just being the head of the data platform?
Osian: Sure. So, I think before we get into the product, I think it’s important to talk a little bit about the data platform. And so, what the data platform is, is essentially, we are a team that supports and enables analytics to take place across the company. Obviously, analytics, as we know, has grown and grown and keeps on growing and therefore there is a need to enable all those different teams to work in a way that’s autonomous, that adapts fast, and also in a way that the end-users trust the data as well. And I think I mentioned the word end-user there, I think that’s where we really kind of segue into product because in our view, at the end of all of our efforts in whatever we do with analytics is an end-user and therefore that means that there’s always an experience at the end of all of our efforts that we’re doing in both data platform and in analytics.
And that is really where product comes into the picture because until now—this is maybe giving a little bit of history of my career and how I got into product in the first place—analytics has been treated very much as a technical problem, and very much so on the data platform side, which is more on the infrastructure and the tooling to enable analytics to take place. And so, viewing that purely as a technical problem left us at odds in a way, compared to our friends working on more, kind of, software engineering teams where they had a product leader, where the user was the focus, the user experience was very much driving a lot of what was roadmap. And that was very much missing, not only I felt in Stuart but in the companies I’ve worked at over the years. And that’s really how I started to move into product. And it was this idea that, yes, lots of what we do are very technical problems, but at the end of everything that we do is a user who is trying to work with data. And that experience needs to be just as good as the experience that we have whenever we use our apps or websites for doing all sorts of things.
Brian: Thanks for sharing that. I’m curious, you make a good point here and I—correct me if I’m wrong and this is something I talked about on the show all the time, which is you can’t not have a user experience. Like, whether you intentionally created one or not, if someone’s going to use your service, there’s going to be an experience. So, the question is, how much intent was behind that experience? How much intent was put into the design of that experience?
Did it just emerge from a bunch of technical decisions that were made, or did we actually sit down and think about it? Did we go out and watch what people are trying to do? How are they doing it today? How did they do it yesterday? Where’s the friction in that? How would they like it to be easier? What will empower them to be more successful? These kinds of things.
It sounds kind of obvious, but the evidence, if you look at most teams, is that this is not treated as important. Because you don’t see staffing accordingly for these roles. You don’t see designers, you don’t see user experience professionals, you don’t see product managers, or if you do, they’re not really fulfilling the role of product management; they’re doing more project management. They’re running an Agile team and they’re running the process, which is not the same thing as running a product. So, I’m kind of curious if you have any perspectives on why this is, when we consistently see these same problems with low adoption, low trust, it takes too long, all these things that have to do with the human-in-the-loop part, but we don’t resource that accordingly. Do you have any thoughts about that?
Osian: Absolutely. So, I think what’s happened in a lot of analytics teams is that we have historically, and we still do, buy a lot of the tooling that our users need. And that makes complete sense because why would we build an entire data visualization tool, for example, from scratch? It just simply doesn’t make sense. It’s the same kind of problem that we’re trying to solve in a tech logistics company as we would in a bank, as we would in a video games company, or whatever.
But what’s happened is because we buy these tools from elsewhere, we’ve kind of started to confuse user interfaces with user experiences. And so, what’s happened in a lot of analytics teams, I feel, is that we simply feel that buying a visualization tool and just adding it to our stack will suddenly make all of our problems go away, and that it will take care of most of the user experience by simply providing whatever visualization tool—I mean, visualization is the obvious example; that is how the majority of users interact with the data platform, certainly within Stuart and I expect in most other companies as well. But what we’ve seen is that a user interface cannot really replace the need for a good user experience. And so, in most organizations, with thousands of users interacting with data on a daily basis, not dedicating time to investigating the needs of those users in a more proactive way, understanding their pain points and desires, and using that to shape our roadmap really puts us in quite a weak position, in a way, because we’re essentially perhaps building things in the right way, but maybe we’re not building the right thing. And that’s one of the key benefits that product thinking has brought to us at Stuart.
Brian: Mm-hm. So, it sounds like to summarize what you’re saying, there’s a perception in senior leadership that we can buy our way out of the interface problem with tools. And you’re saying, “Well, maybe you can buy the interface design components, but you can’t buy the user experience components.” [laugh]. You can’t delegate that to technology. Yeah, you know, it’s funny you say that. There’s some debates, I don’t know if they’re active debates, but I’ve seen different opinions about whether or not we should even talk about whether we design a user experience. There’s this argument that no, we design interfaces, and from the interfaces emerges an experience, with the idea that, ultimately, we don’t control the user experience; we can nudge it in the directions we want through our choice of interfaces and steps and how you move through a sequence and all of this, but we don’t actually literally design the experience.
So—and I like this idea a little bit, only because it suggests you’re not in full control, but you can nudge it in the right directions. But if you’re not even aware of it, then you’re really just rolling the dice when we go and buy a bunch of tools and then hope that the tools will give us GUIs, and then hope that the GUIs will naturally line people up in the right order of their workflow, the right way that they want to do things. It’s leaving a lot of stuff to chance, especially when, as you said, it could be the same tool that a bank uses, that a logistics company uses, that a video game company uses. I think it’s a stretch to think that you can buy yourself out of that problem. But that’s an interesting, [laugh]—I was actually just talking to a prospective client yesterday about this, how senior management, they were saying things like, “Oh, we have this dashboard problem, and so we’re going to buy another tool that will allow us to, like, localize and serve up just the required parts of the dashboard to each person based on their profile, and that will solve our usability and our utility issues and our complaint issues.”
And it’s like, well, maybe that will solve part of it, but—again, I literally heard this yesterday, just validating what you just [laugh] said—this manager, this director, is trying to shift that a little bit, which is, like, no, we actually need some human attention put on this problem as well, because what we’re doing is we’re also adding a tax. When we add more of these tools, we’re usually adding a tax: another piece of software to learn, another integration point. There’s a tax that comes with some of those benefits as well. I don’t know. That’s my take at least. Do you agree?
Osian: Absolutely. And another example, a recent example for us, would be on the data discovery side. So, this is all about being able to find, understand, and locate the data that you need to kick off a piece of analysis. And had we not done the right research, perhaps we might have known that this is a bit of a problem, but it was only from doing research, interviews, and surveys with our users that this turned out to be the absolute number one problem that was affecting most of our analysts in terms of [unintelligible 00:11:01], obviously, the negative trait of not having some kind of a search mechanism in place. And so, this was the reason why we decided, well, we really should prioritize this, because it’s adding so much time to the day-to-day life of our analysts.
And so again, that was just the first phase of identifying well, we should probably do something about data discovery. The next step is then well, what about all of the different tools and all of the different options that are out there in the market? And what features are the things that really matter to our users? And so, this is where product thinking becomes really useful.
Brian: And I want to dig into some more of, kind of, these examples and stuff about how you actually went about doing this, but going back to a little higher level here and reflecting now, because it’s been a couple of months since you wrote this, one thing I wanted to ask: have you been able to quantify the value of any of this work in some way, such that you can say, “Hey, this is working well for us. We have some evidence that this is a better way to do this and it makes sense to keep doing things this way”? How do you track that, or how do you know it’s working?
Osian: That’s the million-dollar question, right? It’s a question that we in data platform often try to avoid, I suppose, just because it’s so, so hard, because at the end of the day, nobody is paying for our data platform. Not in every case, but in our case, it’s, by and large, an internal data platform. There is some data that we share with some clients, but it’s mostly for an internal audience. And so, whenever we get this question of what’s the impact? What’s the value? How does it impact our company top line? How does it impact our company OKRs?
This is when we start to panic sometimes as data platform leaders, because that’s an answer that’s really challenging for us, simply because we are enablers, mostly enablers for analytics teams who are themselves enablers. It’s almost like there’s two different degrees away from the direct impact that your team can have. That said, it isn’t something that we should be shying away from, and it’s actually something that’s very much in my goals right now, as we speak: to actually define this. What is the value? What is the ROI of a data platform? If, you know, we are spending on data [unintelligible 00:13:42] costs, for example, what is the value that that is giving us?
And I think we have to start with a very clear mission. And our mission is to empower everyone to make the best data-driven decisions as fast as possible. And so, hidden within there, you know, that’s a function of reducing time to insight, it’s also about maximizing trust, and it’s also then about, obviously, minimizing costs. And so, those two value parameters in there, minimizing time to insight and maximizing trust, are themselves functions of many other things. So, if we looked at something like wanting to reduce the time that analysts spend doing their work, we want to reduce inefficiencies on [such 00:14:35] things like the discovery tool that I mentioned, you know, where analysts used to spend many, many hours trying to find the right table; with our data discovery project, that time is reduced.
And we have some metrics from things like Slack to be able to measure the number of requests that come in from analysts searching for a particular data point, and we’ve seen a reduction in the number of those requests in our Slack channel over the year since the data discovery [unintelligible 00:15:06] tool was launched, which is one measure. But obviously, time to insight is not only made up of that. It is also made up of the actual speed of data processing as well. So, those metrics are much more typical, around, kind of, performance: average query run time, and perceived query speed from the users as well, a slightly different metric. And so, all of these things combine into an overall time to insight, which is one of our key metrics that we want to reduce as much as possible so that we as a business can move as fast as possible.
There’s a whole bunch of other KPIs that we are looking into: so obviously, adoption of our toolkit is really important as a data platform team because, obviously, if we’re going to have any indicator of success of the choices that we make around tooling, then, of course, seeing that users actually adopt those tools is something we should probably have a look at. And then as part of that, and also going into trust, we need to obviously measure KPIs around reliability, incidents, time to resolution, [you need 00:16:23] the percentage of datasets that have SLOs in place, these kinds of things. And so, there isn’t a short answer to this because it is a combination of various factors. But the way I see data platform ROI is, it’s a combination of how fast analytics can happen and how trustworthy it is, measured against the cost that we spend in order to make that happen.
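The Slack-based measure Osian mentions, counting data-discovery requests in a channel over time, can be sketched roughly. This is a hypothetical illustration, not Stuart’s actual implementation: the keywords, sample messages, and monthly bucketing are all assumptions, and in practice the messages would come from Slack’s API rather than a hard-coded list.

```python
from collections import Counter
from datetime import datetime

# Hypothetical sample of (timestamp, text) messages from a data-help
# Slack channel; in reality these would be pulled via the Slack API.
messages = [
    ("2023-01-12T09:30:00", "Where can I find the orders table?"),
    ("2023-01-20T14:05:00", "Which dataset has courier locations?"),
    ("2023-03-02T11:15:00", "Where can I find delivery SLAs?"),
]

# Crude keyword heuristic for "data discovery" requests (an assumption).
KEYWORDS = ("where can i find", "which dataset", "which table")

def discovery_requests_per_month(msgs):
    """Count discovery-style requests per month, so the trend can be
    compared before and after a data discovery tool launches."""
    counts = Counter()
    for ts, text in msgs:
        if any(k in text.lower() for k in KEYWORDS):
            month = datetime.fromisoformat(ts).strftime("%Y-%m")
            counts[month] += 1
    return dict(counts)

print(discovery_requests_per_month(messages))
# {'2023-01': 2, '2023-03': 1}
```

A falling monthly count after a launch would support, though not prove, the adoption story; as the conversation later stresses, it still needs to be paired with qualitative listening.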
Brian: Mm-hm. Yeah, I mean, that’s funny because you’re like, “Oh, yeah, it makes me cringe and it’s really hard to measure.” And then you gave me, like, five minutes of amazing stuff about what you—it seems like you have a very good idea of some of what the indicators are here. And I’m curious with some of these metrics that you talked about, were these given to you? Did you arrive at these as the things to pay attention to with your superiors? Like, where did those come from as the things to track as this is what we’re keeping score with, this is what our benchmark is, this is what we want to watch over time. Where did that come from?
Osian: So, it’s something we call a KPI tree. And, as I mentioned at the beginning, we always start with the mission: what is it that we are trying to achieve as a data platform team? And so, as enablers, you know, that mission is very clear, as I mentioned, which is to empower everyone at Stuart to make the best data-driven decisions as fast as possible. So, in there, you’ve got trust and you have speed. And both are equally important to us.
And so, we started with the mission, kind of, at the top. And then going one level down, we have our three KPIs, which are speed, trust, and cost; cost is there as well because, obviously, we do need to measure it against the value KPIs. The level beneath that is what we call the strategic levers. So, these are the things that we have the most control over. So, these are things like, what is the speed of doing analysis? What is the speed of data processing? What is the level of adoption of our tooling? How is our reliability? What are our third-party costs?
And that kind of middle layer is the things that we can make decisions on when we come to, kind of, [quotes 00:18:50] and planning and that kind of thing. But of course, beneath all of that is the lowest level of all, which is the KPIs themselves that feed into those strategic levers. For example, for speed of data processing, two obvious KPIs there would be the average query run time in the data lake and the average query run time in the data warehouse. And also, there is this, what I mentioned before, this perceived query speed. And so, it kind of breaks down in that manner. So, we start off at the very top with a high-level mission and then we go down in, kind of, levels of detail until we hit those indicators, essentially.
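The KPI tree Osian walks through (mission at the root, speed, trust, and cost beneath it, strategic levers in the middle, measurable indicators at the leaves) can be modeled as a simple recursive structure. Here is a minimal sketch in Python; the node names below the three top-level KPIs are illustrative guesses based on the examples he gives, not Stuart’s actual tree.

```python
from dataclasses import dataclass, field

@dataclass
class KPINode:
    """One node in a KPI tree: mission at the root, strategic levers
    in the middle, measurable indicators at the leaves."""
    name: str
    children: list["KPINode"] = field(default_factory=list)

    def leaves(self) -> list[str]:
        """Collect the measurable indicators beneath this node."""
        if not self.children:
            return [self.name]
        out: list[str] = []
        for child in self.children:
            out.extend(child.leaves())
        return out

# Illustrative tree; leaf names are assumptions drawn from the episode.
tree = KPINode("Empower everyone to make the best data-driven decisions, fast", [
    KPINode("Speed", [
        KPINode("Speed of data processing", [
            KPINode("Avg query run time (data lake)"),
            KPINode("Avg query run time (data warehouse)"),
            KPINode("Perceived query speed"),
        ]),
    ]),
    KPINode("Trust", [
        KPINode("Reliability", [
            KPINode("Incidents"),
            KPINode("Time to resolution"),
            KPINode("% of datasets with SLOs"),
        ]),
    ]),
    KPINode("Cost", [KPINode("Third-party tooling costs")]),
])

print(tree.leaves())  # the indicators that actually get measured
```

Walking from the root down to `leaves()` mirrors the process Osian describes: start from the mission and descend in levels of detail until you hit concrete indicators.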
Brian: Did those come from you though? I’m curious about the origin of this KPI tree and how you arrived at those as the scoring rubric. Like, did your management give this to you? Did you work with them to get it? Did you work with your team and then you said, “Hey, this is the score by which I think our team should play. Do you agree?” Tell me how it emerged.
Osian: Yeah. It was very much from within the data platform team itself. So, we’re a team of 15, mostly data engineers and two product managers, of which I’m one. And so, this was five of us, five of the team leads, essentially, getting together over an interactive whiteboard over a couple of sessions. And this is still in progress as I speak, by the way, so we have definitely not finished with this, by a long way.
But this is certainly something that we came up with mostly within our team. That said, the data platform is part of the wider tech organization, and there is generally a drive to be able to measure value across all the different teams of tech [unintelligible 00:20:46]. That’s actually a goal for us as a wider organization. And I’m actually very excited about it. One of the good things about data platform is that you do feel… you have a little bit more freedom with this kind of thing because it isn’t so tied to specific company OKRs, just because it has much more of a widespread impact across all of the company. And so, this kind of thing is something that we can mostly do by ourselves.
Brian: Have you gotten the sense though, that you’re—I would imagine this comes up at some point, maybe it hasn’t, I don’t want to lead the witness—but has your management responded to your metrics about how you’re tracking quality here? Do they say, “Yeah, that sounds right”? Like, looking at query run time or, you know, the other ones that you mentioned, do they know about these, or are these just internal? Like, at some point, someone has to decide, “Hmm, do I give you a million dollars for your team? Do I give you a hundred million dollars for your team? Do I give you a hundred thousand dollars for your team?”
Someone made a decision that, okay, we need this many staff to do this much kind of work, and there has to be some computation of cost versus that. And so, I’m curious if they’re part of that, or how are they in the loop, I guess?
Osian: For sure. I mean, this is something that will come up, of course, because, you know, there always has to be scrutiny over ROI across all of the organization. It just makes sense. We are just one part of the organization; we should be treated the same way.
I do feel, though, that over the years, what’s happened is that because data is something that people feel is essential to the running of most organizations like ours, there is almost a pro and a con to that. The pro being that data platform teams and analytics teams are usually left alone; there is generally a sense of trust that, “Well, yes, we need data. Data is important, and therefore, for the sake of business continuity, we don’t really need to touch that, and they should just kind of get on with what they’re doing because it’s probably important.”
But there is a flip side to that, which is that we can just become complacent and assume that, well, as data is important, there is no need to, kind of, demonstrate the value or the ROI that we bring. And that’s the thing I really am starting to challenge now because, obviously, the economic landscape is challenging for so many industries at the moment, and I think it’s really important to be able to demonstrate that value; let’s see how far we get with this exercise. I’m confident that we will definitely have something to show. But you know, we do need to be asking ourselves these questions. And I think I mentioned in the article this uncomfortable question of, you know, do we actually deliver in terms of the investment that we make in our data platform in the first place?
Brian: Thank you for sharing that. I don’t think you’re alone in some of the feelings that you’re having around this, quantifying the value of the work and the state of, in your case, a platform that’s there. I guess I would just share that, if you go back and listen to my episode with Doug Hubbard, there’s this idea of measurement versus accuracy, and I think we as data people have a bias towards being highly accurate. And the truth may be that we used to have an angry finance team and now we have a happy finance team.
And our metric for the next quarter, or even the next year, is that I want the finance team to be really happy when they’re running end-of-quarter reports. I don’t want to hear any more complaints, I don’t want them telling me that they can’t render their charts, I don’t want them telling me that it takes six hours to run a query only to find out it’s wrong. If you can make that go away, the platform is doing its job. And the great thing about that: those are all qualitative things. And what it would force the team to do is go spend time with the finance department to understand what the heck is going on that this is such a problem over there.
And you may find out you don’t need some giant changes. Maybe the changes are fairly localized; they’re local problems that we can attack with a squad and put a couple of months of effort on, or whatever it may be. So, don’t get too lost in the “how can we quantify this” at the giant level with management. They may be looking at it at a much more basic level. And I think bringing in the quant side that data people have is helpful, too, in terms of, like, well, we should also be scored on our query time or some of these other metrics that you talked about, like time to task or whatever it may be.
I think it’s great to bring those in, but it sounds like you have a pretty good running start here and you’re looking at this as organic and evolving. It’s an evolving question as opposed to a static thing that’s never going to change and I think that’s smart, too, especially as business cycles change and the landscape is changing, as you said. Priorities may shift. What’s important for your team to deliver may change as the landscape changes. So, it sounds like you’d have a good start here.
Osian: Absolutely. And yeah, just to add to that, I think you’re right. It’s not just about these indicators; [unintelligible 00:26:08] people could very well get lost in the numbers and forget to take that step back. So, you mentioned something around a happy team, and that’s also part of this, obviously: you know, what are the satisfaction levels? You know, we talk about time of doing analysis, we talk about adoption of tooling, we talk about reliability, but it could be something else that’s actually making [unintelligible 00:26:33] unhappy. And so, absolutely, we should be careful not to get, kind of, drowned in all of these different KPIs. But at the same time, it’s good to have them as a basis.
Brian: Sure. I mean, if anything, it tells you where you might want to dig in further. The example I always give is, because I hear adoption is often a struggle, the jump is, like, let’s make sure there’s tracking code in the dashboard so that we can count adoption, and then we can see how many IP addresses or how many users were logged in and used it. And the problem with that—or what I would say is—what if you find out the finance person absolutely hates this tool? They have regret every time they open up this dashboard or whatever the heck it is.
Or this data scientist needs to build a model. It’s like, “Aw, God, I hate this.” “Oop, they logged in. Check. We got adoption. The data science team is happy.” No, you measured an analytic, which is a computer-trackable event, but you’re not actually measuring an outcome; you’re not measuring an experience. So, it’s really important to not just count the numbers, but to understand: is that usage the kind of usage we want? Are they actually happy? Are they getting work done? Are they feeling empowered? All those kinds of things.
And maybe it’s like, zero is bad. Okay, let’s start with that. Yes, zero would be really bad. But pretty soon after we get to some, whatever that number is, it’s important to start going and doing that work to say, is this the right kind of adoption? Are they getting the value? And that requires us to go spend time. The analytics on the analytics are never going to tell you why people are doing what they’re doing, they’re not going to tell you about their satisfaction, so we can’t not do that work.
And if you just rely on that stuff, get ready for a big surprise. And it’s going to come out of nowhere and you’re going to feel like you’ve been blindsided, because you’re going to think everything was just fine, and then you’re going to find out two really important people in the company can’t stand this, and they have clout, and they bring it up the chain, and the next thing you know, it’s coming down on your team. Not you, Osian, per se, but you know, just kind of talking in generalities here. The closer you can get to those end-users, understanding how value is created, the better off your team is going to be, and the more successful at staying afloat, creating value, getting to work on the stuff that really matters, you know, all those good things. So.
Osian: Yeah. I mean, a hundred percent agree. I don’t think we would be doing our job as product managers if we were not listening to our users and talking to our users, because that’s something we really are doing on a very regular basis, and it’s almost at the core of being a product manager. And I think you mentioned something really interesting around the power of just, you know, listening, versus just tracking some numbers. And the same thing can be said about what I mentioned earlier around reliability, even things like speed of doing analysis.
Yes, sure, we can measure that maybe in some sort of quantitative way, but really, it’s the perception of analysts’ time that we really care about, because, yes, it could be that analysts are finding data much quicker because of the nice new data discovery tool that we’ve created, but if the perception is that, nope, analysis still takes way longer than it should, then that’s not good. And it’s the same for perception of things around reliability, incidents, time to resolution, these kinds of reliability metrics. We can track those, but there is also a perception aspect to them as well. So yeah, we can’t underestimate the importance of listening to our users and qualitative data.
Brian: I think that's a very mature perspective on it. I would almost argue the only thing that matters is the perception. The absolutes don't matter because they're only true within this abstract world; if no one's playing the game by that scoring mechanism, it doesn't matter. They're all playing this other game, and all they care about is: does it feel fast? And what does it mean to feel fast? It doesn't matter whether it's seven seconds or seven minutes; it's how it felt to them.
And sometimes there are things we can do to change the experience so that it feels faster, by designing a better experience around it. So, I think it's really wise of you to bring that point up: the perception is really the only thing that actually matters here. You're in a service business, right? Your whole platform is about service to others. No one's paying us directly for this; it's an internal data product, so it's all about service. We have to measure it from their perspective. I think that's a super sharp point to bring up.
Brian: Going a little into the weeds here—you talked a lot about how you broke down some of the tasks and workflows, and about some of the components in it. How did you learn how to do this? I'm curious: did you have designers helping you with this, or user experience professionals, or are you self-taught? Someone doesn't just come up with what you did, at least not the very first time. It sounds like you had some experience with this. Can you talk a little bit about how you went about designing this platform?
Osian: Yeah. I think it was mostly from experience and, I'd say, from mistakes—things that I've seen that haven't really worked in the past at previous companies. The timing was also very good because I had just started at Stuart; it was about a year and a half ago now. And when you approach a problem, you always need to start from first principles, and that's where most of this started.
So, starting with this idea that at the end of our effort is a user experience, the first natural question for me was: who are these users? Coming into a company like ours for the first time, who are the potential users of a data platform? The first step, which felt quite natural, was to come up with a set of user personas. Having come from a product management background, there is some guidance around how you create persona mappings, and that certainly came in useful. For example, one of the things I didn't want to do was create personas directly along job titles—the data scientist persona, or the data analyst persona.
And the reason why is that between different teams and different geographies, a data analyst could be very, very different from one country to another, and we were also trying to size the different personas. So that was a natural first step: ask ourselves who the different users are, and try to map that out in a simple way. And also—perhaps this is quite hard for someone coming from a data background like me—being okay with estimating and roughly guessing. For example, one of the personas that we have is what we call the consumer: the kind of user that is usually non-technical, does not write SQL, and usually accesses the data platform almost exclusively via the data visualization layer. In terms of actually sizing that persona, you do a little bit of checking in our internal HR directory—we have internal software for that, like most companies do—and you match that against the visualization usage stats.
So, looking at people who are purely viewing the dashboards and not actually creating or writing any scripts or developing anything. Between those two sources, you get to a rough figure: okay, this is the size of this particular persona. We then went on to the next steps, but that's just to give you a flavor of how we approached this problem.
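[Editor's note: the persona-sizing approach Osian describes—cross-referencing an HR directory against BI usage stats to count "consumer" users who only view dashboards—can be sketched roughly as follows. This is an illustrative sketch only; all names, fields, and thresholds are hypothetical, not Stuart's actual tooling.]

```python
# Hypothetical sketch: size the "consumer" persona by joining an HR
# directory export against BI tool usage stats, counting people who
# view dashboards but never author queries or dashboards.
from dataclasses import dataclass

@dataclass
class UsageRecord:
    user: str
    views: int      # dashboard view events
    creates: int    # dashboards/queries authored

def size_consumer_persona(employees, usage):
    """Count employees with view activity but zero authoring activity."""
    by_user = {u.user: u for u in usage}
    consumers = [
        e for e in employees
        if (r := by_user.get(e)) and r.views > 0 and r.creates == 0
    ]
    return len(consumers)

employees = ["ana", "ben", "cai", "dee"]
usage = [
    UsageRecord("ana", views=40, creates=0),  # pure viewer -> consumer
    UsageRecord("ben", views=12, creates=5),  # also authors -> not consumer
    UsageRecord("cai", views=0, creates=0),   # no activity -> not counted
]

print(size_consumer_persona(employees, usage))  # -> 1
```

As Osian notes, the point is a rough estimate to guide prioritization, not a precise census.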
Brian: Who did that work, though? Was this all you in isolation, or—tell me about the staff on your team. Did you have user experience people involved? This is all core research activity that product management and UX often do together. Or did you do this on your own? Tell me a little bit about the team that did this work.
Osian: So, it was me driving it. It wasn't really a project where someone said, "Hey, Osian, you should do this." Like I said, it was me coming into a new company, and these were questions I felt I naturally had to ask myself as a product manager: who are our users, what are they trying to do with data, and what is the current state of our data platform? Those were the three main things I really wanted to get to the heart of and connect together. It was mostly done between myself, a business intelligence leader at the time, and the head of data engineering at the time as well.
So, it was something that I drove myself, but it really was way more than only me: I counted on so many other people to help me build this picture. And it's not a static picture—it's constantly evolving, and I still make tweaks to it, even to this day.
Brian: When you look back on this process and going through all of this, I'm sure you learned some stuff. You made some comments about making mistakes along the way—"Amen. That's how we grow," right? Tell me: if you were to start this over at Stuart or another company—a greenfield environment, or maybe not greenfield; there's already something in place and they're bringing you in—where would you get started? Is there anything you would do differently in hindsight, now that you've done this a few times? Tell me a little bit about what you've learned, and maybe what you wish you had known a year or two ago.
Osian: Sure. So, I mentioned the personas to start with—that didn't turn out to be a really heavy piece of work. It was something where we could create estimates, validate them among ourselves within the data platform team, and it made sense. The bit that got really tricky was looking at all the different use cases of the data platform, just because there are so many. And I think we were—or I was—a little bit too ambitious with this, in really listing out all of the atomic use cases that so many of our hundreds of users had in using our data platform.
If I was to do that again, I would take a much more high-level approach: take more of a step back and just summarize the very high-level needs or use cases of a data platform. And this is the advice I would give to anyone taking on the role of a data platform leader, or a similar role: you can easily get overwhelmed by so many different use cases, and I would really encourage you to avoid that. That's probably one of the main learning points from my perspective. And when we then went to look more inward at what the data platform offers to serve those different use cases, what I thought was really important was to look at the data platform from an end-user perspective. Because what often happens—and I can bet this happens in most companies, because I've moved around quite a lot over the years—is that when someone asks, "What is the data platform? What does it look like? What does it do for me?" almost 90% of the time, you're going to get some kind of technical architecture diagram.
And what I was trying to do was—of course, we need that, but that is not what the business and the end user need in terms of what the data platform can deliver to them. So what was important at that stage was to really look at your data platform from an end-user perspective, and almost think of it as if you were to put the data platform on a supermarket shelf: what would that look like? For each of the different components, how would you market it in a single one-liner—what can this do for me? That really helped us communicate what we're doing to the outside world, beyond data engineering and analytics engineers. That was a really important step.
And that also kicked off this mindset of thinking like a product, because for each of those products on the supermarket shelf, you want to measure success in a different way. For the data warehouse and the data lake, what we care about is mostly performance. There isn't really a UI for most of those, and certainly not in our case—you just plug in whatever SQL editor you want and do as you please. Whereas for something like the data visualization layer, we cared a lot about adoption, satisfaction, all of these user experience kinds of metrics.
And so, all of this started to really shift us toward looking at our data platform as a product—one that was serving, and is still serving, hundreds of users every day.
Brian: I like that framing: "What can it do?" versus "What can it do for me?" I think that's at the core of product, right? Where 'me' is actually not Osian, it's not Brian—it's what the people we serve are saying to themselves. They don't really care what it can do; they just care what it can do for them. If you can get your head around that, put it at the center of your work, and work backwards from it, I think you're probably on the right track. I don't know, do you agree?
Osian: Yeah. And I think that also helped us identify the really obvious gaps. This is a bit more of a recent development, but we are moving toward a data mesh setup, where we want to be a truly self-service data platform, because we feel that providing our different analytics teams with as much autonomy as we can—albeit in an aligned way—is the only way we can keep up with the analytical needs of such a fascinating business as ours. So this idea of an increasingly self-service data platform is super important to us, and so is really thinking about it as a family, or a collection, of different components. When we did that, it really highlighted that there was a missing piece—it's still missing at the moment—which is data ingestion. We currently don't have a standard product component within our data platform ecosystem that allows anyone in any analytics team to ingest whatever data they need into the data warehouse.
At the moment, that still rests in the hands of the central team—I don't like to use the word central, because that's not our setup anymore—or it relies on the more technical users, with teams being left to their own devices, which obviously creates problems of its own because there's no alignment. So, simply mapping out those different components helped us identify that there's a clear gap here. And that's actually a focus for us at the moment: to develop a product that enables anyone, even those without engineering skills, to ingest data. Because ingesting data is something where you simply want to take data from one place to another, scheduled however you want. Why should you have to rely on a data engineer who might take two months to get back to you, because they're so busy working on other stuff?
Brian: Osian, it has been great to talk to you. You're the Head of Product for the Data Platform at Stuart. Two last questions: any closing thoughts or anything you'd like to share with the audience? And secondly, how can the audience find you—where's a good place to connect?
Osian: I guess my final thought would be, as I mentioned at the beginning: don't forget that all of our efforts in analytics and data platforms have a user at the end. I would encourage anyone who is on a data platform team not to forget that. And in terms of finding me, I'm on LinkedIn. I'm on Medium.
I have quite an unusual name, so I'm generally quite easy to find. My name's Osian Jones. And, as you mentioned, I'm the Head of Product for the Data Platform at Stuart. If this has been interesting to listeners, I'm more than happy to jump on a call to answer questions and discuss some of the topics we've talked about in more detail.
Brian: Great. Well, thank you so much. And for those listening who maybe forget to go back and Google it, it's O-S-I-A-N—it sounds close to 'ocean.' Osian Jones, thank you so much for coming on the show, and I wish you all the best at Stuart and beyond.
Osian: Thanks to you.
Brian: Thank you.