137 – Immature Data, Immature Clients: When Are Data Products the Right Approach? feat. Data Product Architect, Karen Meppen

Experiencing Data with Brian O'Neill (Designing for Analytics)

This week, I'm chatting with Karen Meppen, a founding member of the Data Product Leadership Community and a Data Product Architect and Client Services Director at Hakkoda. Today, we're tackling the difficult topic of developing data products in situations where a product-oriented culture and data infrastructures may still be emerging or “at odds” with a human-centered approach. Karen brings extensive experience and a strong belief in how to effectively navigate the early stages of data maturity. Together, we look at the major hurdles that businesses encounter when trying to get real value from data products, as well as the necessity of leadership support and strategy alignment in these initiatives. Karen's insights offer a roadmap for those seeking to adopt a product- and UX-driven methodology when significant tech or cultural hurdles may exist.

Highlights / Skip to:

  • I introduce Karen Meppen and the challenges of dealing with data products in places where the data and tech aren't quite there yet (00:00)
  • Karen shares her thoughts on what it's like working with "immature data" (02:27)
  • Karen breaks down what a data product actually is (04:20)
  • Karen and I discuss why having executive buy-in is crucial for moving forward with data products (07:48)
  • The sometimes fuzzy definition of “data products” (12:09)
  • Karen defines “shadow data teams” and explains how they sometimes conflict with tech teams (17:35)
  • How Karen identifies the nature of each team to overcome common hurdles of connecting tech teams with business units (18:47)
  • How she navigates conversations with tech leaders who think they already understand the requirements of business users (22:48)
  • Using design prototypes and design reviews with different teams to make sure everyone is on the same page about UX (24:00)
  • Karen shares stories from earlier in her career that led her to embrace human-centered design to ensure data products actually meet user needs (28:29)
  • We reflect on our chat about UX, data products, and the “producty” approach to ML and analytics solutions (42:11) 

Quotes from Today’s Episode

  • "It’s not really fair to get really excited about what we hear about or see on LinkedIn, at conferences, etc. We get excited about the shiny things, and then want to go straight to it when [our] organization [may not be] ready to do that, for a lot of reasons." – Karen Meppen (03:00)
  • "If you do not have support from leadership and this is not something [they are] passionate about, you probably aren’t a great candidate for pursuing data products as a way of working." – Karen Meppen (08:30)
  • "Requirements are just friendly lies." – Karen, quoting Brian about how data teams need to interpret stakeholder requests  (13:27)
  • "The greatest challenge that we have in technology is not technology, it’s the people, and understanding how we’re using the technology to meet our needs." – Karen Meppen (24:04)
  • "You can’t automate something that you haven’t defined. For example, if you don’t have clarity on your tagging approach for your PII, or just the nature of all the metadata that you’re capturing for your data assets and what it means or how it’s handled—to make it good, then how could you possibly automate any of this that hasn’t been defined?" – Karen Meppen (38:35)
  • "Nothing upsets an end-user more than lifting-and-shifting an existing report with the same problems it had in a new solution that now they’ve never used before." – Karen Meppen (40:13)
  • “Early maturity may look different in many ways depending upon the nature of business you’re doing, the structure of your data team, and how it interacts with folks.” – Karen Meppen (42:46)



Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill, and today I have one of the Data Product Leadership Community’s founding members on the line, Karen Meppen. And how are you, Karen?

Karen: Hey, I’m excited to be here.

Brian: Yeah, yeah, we’ve been talking about doing this for a while, and we had some of those sicknesses and reschedules and all that, so I’m really, like, stoked that we’re finally here, and I get a chance to, like, put your brain out there on display, or into the ears of listeners here. You’ve been a great contributor to the community, and I like that you have strong opinions and you’re willing to share those, because I think sometimes people in this, kind of, new space are afraid to share that. That’s what the community is all about, like, opening that up. So, I really appreciate that you throw out your ideas unapologetically [laugh].

Karen: Absolutely. Isn’t that the point, to speak from what you know, and strong feelings of—becau—[laugh] I fail often because I’ve been there, and I half joke, you know, I have the scar tissue to show it.

Brian: [laugh] Yeah, yeah. So, coming up in February, you’re going to be doing one of our monthly webinars and giving a talk, and then we’re going to have a group discussion. And the topic was really interesting to me, and has to do with this idea of immature data, referring to immature technology infrastructure, and immature clients, and should we be taking this data products approach, or should we be trying to build data products? So, right there, I’m assuming that it means that’s not always the right thing to do, but I don’t know if that’s the case, so I just want to unpack the title because I’m already curious, just hearing about it. What’s behind the title? Is there some clear parameters for you where that’s not the way to go?

Karen: I mean, yes, absolutely. And that’s what we’re going to talk about more. I already have the framework going of what I want to talk about, and I’m definitely leaning into, hey, if you’re going to call out that I’m known for strong opinions, lean into that [laugh]. And so, the first implication is that immature then would qualify that there’s some type of maturity scale, and what does that mean? And that’s something that I wanted to unpack of what immature looks like, what does that feel like, and then translate that to the reality of how do you know when you’re ready, as an organization, as a team to transition to pursuing data products, or getting started doing it.

And that’s really the point of making sure you’re positioned for success, which is something I’m pretty passionate about, which is, it’s not really fair to get really excited about, often, what we hear about or see on LinkedIn, or hear about at conferences, get excited about the shiny things, and then want to go straight to it when your organization is not ready to do that, for a lot of reasons. In the context of talking about people, everything from leadership to those that are hands-on working with the data, to your processes, and then the supporting technology, and how that all fits together to make sure you and your team are successful to serve your goals, which is usually some form of your strategic goals to save money, increase revenue to some capacity, or the most fun one is to open up new lines of business value streams and such, using what you’ve learned from your data.

Brian: Yeah. I think even before we could talk about a maturity curve here, do you want to give some rough definition of what a data product is, just for the context, because obviously, there’s lots—

Karen: Oh boy, that’s [laugh]—

Brian: Of different ones. And I know that’s a tough… [laugh] but just for your worldview for this conversation, when you say it, are we talking about reusable data assets like building bricks that someone else then technically uses to do something, or it’s an end-to-end solution, or it’s an analytics product? Like, tell me just, kind of, your framing?

Karen: I say usually, off the cuff if I’m asked, that a data product is something that can directly improve your P&L or how you do business—and that’s in the context of looking at it from an enterprise standpoint, working with businesses—however, I admit that this is a popular hot topic that we’ve learned in the DPLC, which is everyone seems to have a very distinct understanding of what data product is and what it means to them. And obviously, it’s through the prism of how they work and how they experience data products. So, I’ll admit I’ve learned a lot and didn’t realize there were so many different ways to define data products, so that’s a fair question. And the short answer is that a data product, I think, is the output of what I’ll call a data ops platform, meaning that you can work with agility, and you bring from when data comes into your ecosystem to when it’s output to an analytics product—in my humble opinion—of something that can materially change or affect behavior, often through information that is in a decision, but that’s really analytics outcomes that tell you something you needed to know at the right time, and it’s interacting with somebody in the way that they need to absorb it best. Which hits on something I know very well from all the material that I’ve learned about from you, from user interface, user experience, it’s the totality of an analytics product that affects the business in some way as it relates to a person.

Brian: Got it. Summarizing that, it sounds like it’s a fairly end-to-end thing as opposed to something that stops at a platform or something that’s servicing someone that’s one to two hops or more away from a final consumer of it—a business user, a paying customer, a vendor—who are the final human-in-the-loop, so to speak. Is that more or less correct? I mean, even it’s not a hundred percent of the time correct, is that broadly your framing?

Karen: I mean, the context of how it’s visualized or delivered, we can get pretty pedantic on it—

Brian: Yeah.

Karen: —or we have in the past [laugh], but yeah, it’s presenting analytics in a consumable way. I think the data product is the outcome, which is the information you need to know when you, as a persona, need to know it. I added in because you introduced it as well, of talking about where does technology come into play. What I’ll call the data ops platform, which is kind of the next iteration of the SDLC DevOps type experience, I think that that’s a part of it. I don’t think that that’s necessarily something that you have to have, but it definitely makes it all work and come together when it’s working properly. I won’t go into a discussion about data mesh or anything else, but I think that accelerates your ability, time to value type conversations.

Brian: Got it. Got it. So, now that we have that playground or that, kind of, definition to work with here, can you give me a concrete example of what would be a signal that applying this kind of product mentality to the work does not make sense? Are there some really clear, easy signals to read where—and just to put a framing, too, Hakkōda, you guys are a Snowflake integration partner—like, you do Snowflake integrations for large enterprise businesses, so you’re talking a lot about this from a client services perspective, I’m assuming. Is that correct?

Karen: That’s right. I do work for Hakkōda, and we are a pure play, or work exclusively with the Snowflake cloud platform, and the totality of the ecosystem of other technology partners that work with Snowflake. And so yes, the context of my conversations is typically with larger organizations that have an unmet need, really, as it relates to working with their data and the experience in the organization. And so yes, I feel like I’m about to go into some variation of, I think it’s Jeff Foxworthy: “You might be a redneck when—” of, like, “You might not be ready for data products if—” [laugh]—

Brian: [laugh].

Karen: —like, I’m about to go down this [clears throat] rabbit hole. But uh—but yeah, I mean, if you do not have support from leadership and this is not something that your leadership is passionate about, you probably aren’t a great candidate for pursuing data products as a way of working, really. If you don’t have a clear understanding of what your strategic goals are and being able to pull that through, you may not be a great candidate for a data product. If you have challenges getting to the table for some form of a design-thinking sprint—and just to give context or background, a design sprint is something that—I know you have a lot of great material about it, Brian—but something that, in a structured format, brings people away from asking for requirements head-on and then helps folks walk through, with all the key players, the experience of whatever you’re discussing.

So, for example, talking about improving our interactions with customers, for example. So, having somebody who works with customers, on a day-to-day basis, someone who does the finance and accounting of the output of your client interactions or customer interactions, someone who manages the technology who has depth of knowledge of how that’s influenced, and all of the subject-matter experts involved, giving them a structured approach to get what you need to pull together: first, understand your problem, identify the intersection of the persona with the highest value opportunity and the intersection of the persona or the person who you’re looking to improve interactions with, and that point in the process and focus on understanding what would be truly better, most high value decision or output for data that would improve these interactions. And so, for example, getting everyone to agree to get into a room to go through this exercise, or some variation of it, if you can’t get them in a room to do that, maybe there are greater issues [laugh] that you have to work through—

Brian: Right [laugh].

Karen: To even get that far. And that’s usually a tell that we need to do more of the hearts-and-minds selling or communications about what we’re trying to do and why it matters. And those are usually you know, the people, processes, the technology, I think you can do various forms, you can do it faster and better depending upon how you are working, but that’s not necessarily a blocker to create a data product by any means.

Brian: Yeah. I’m curious to the—your clients that come in, is this a situation where they’re saying, “Hey, we need some help with our Snowflake integration, and we want to build data products,” or it’s more like they’re talking about Snowflake, and you’re maybe bringing in that language and method, if it’s appropriate based on the smoke signals you’re getting in the comms? Like, does it work both ways or—do people coming in asking for this from you all? I’m just kind of curious. Because I’m hearing—I just saw a survey from Tom Davenport, and Randy, I think they said, like, you know, 500 CDOs, they interviewed and, like, 80% of them said they have some level of effort going on with data products, which makes it sound to me like a lot of places. This is, like, very common now. So, I’m kind of curious if you’re getting asks for it or not.

Karen: It’s funny you mentioned that. Just last week, someone actively asked for a data product, which of course, I lit up, like, “Wow, you know enough, you can ask for it,” which is fantastic. Or maybe it speaks to how diluted it’s become because everything seems to be a data product.

Brian: Yeah, yeah.

Karen: So basically, it seems like as everyone uses the word to serve their own needs, the definition itself has become a little unclear, and so any version of data, a data set, and how you get there seems to be called a data product, which is frustrating at some level, but fine. We’re getting closer to understanding that data has value, so cool. I’ll take it. But before that, yes, whenever we have clients that come to the table, they do not ask for data products, and I have to be very intentional to not use it in our initial conversations because just as everyone has that very subjective definition of data products, that’s also how we get lost in the conversation of what the heck I’m talking about in the first place. What normally happens is that we have some technical leader who’s recognized that after doing some preliminary evaluation, that Snowflake is the platform that will best support the problem that they’re solving.

And then as we start going through more questions and understanding the problem, one, I think you’ve noticed—and I think you’re—I’m going to destroy the quote, but, “Requirements are just friendly lies.”

Brian: [laugh]. Yeah.

Karen: Which I love so much. But more often, folks go to a solution and then focus on the solution rather than understanding the problem. And that’s more where, when I’m starting to discuss the problem that they’re solving, if they’re up for it, taking a few steps back to unpack it and understand, are we asking the right questions, first, and then once we solidify, okay, have we framed the question properly or is this a solvable problem? Now that we’ve backed into it, often [laugh] they have [laugh] awareness where, oh, okay, I hadn’t looked at it that way. If we get that far, then it’s more a matter of evaluating a lot of the ‘you might be a good candidate for data product’ approach by seeing what’s going on, and if we do have enough of the org change management levers to allow us to pull together our first data product MVP, something small, high-value, attainable, that everybody can get on board with often, that’s not the case.

And that’s where I realize, hey, let’s take a few steps back and maybe focus on understanding the data better. And often, that’s usually a good entry to further understanding opportunities to improve. For example, I’ll—I use Susan in finance, right?—you know, if we start by taking a frustrating dataset that she works with, usually there’s some form of taking an existing report in whatever your reporting platform is, downloading it to Excel, and then doing all the things you really wanted to do with it. So, once we start unpacking that and understanding the problems of why she got to downloading it in the first place, and all the points before and after what she’s doing to do the thing, whatever it is to create a report or to support some type of ad hoc ask, then you start, one, doing some relationship-building where [laugh] everyone is like, “Wow, all right.” There is opportunity, there’s hope where we can improve my current-state situation where often someone’s checked out or lost confidence in their tech team.

And so, that’s an opener to see what opportunities we have and build advocates for the power of data and taking a more efficient approach, which can be a design-thinking approach to strategically or in a structured format, working through problems in a repeatable way to make sure Susan’s getting her needs met, the reports are going out earlier, or the information that needs to be known is known earlier and in a way that works for said CFO or whatever. So, that’s really what happens. They don’t come in the door saying, “I want to do—start up my data product practice within my organization.”

Brian: Right. This is almost, like, two different questions, and you can pick either one. Do you find that there’s traits that your clients have that are ready to do this kind of work? Or you could also answer if you’re feeling like they might not be ready, do you do some challenging? And have you found a method for you that helps you? For example, how do I get access to Susan? She’s the key. If I can get access to Susan, I can make John or whoever—I know John is going to be happy. The key is Susan. How do we get to her?

Do you even navigate that, or are you just like, “Nope, you’re not ready for this. It’s not worth our time because there’s bigger issues.” Or—so I don’t know either one of those: what’s a trait of a good client look like, or how do you navigate through to make a good client out of one that maybe came in who wasn’t good on the surface from a product perspective?

Karen: Wow, great questions. And yes, I, depending upon the nature of how things are going, I may go one way or the other. Yes. Something that’s not new to everyone is the concept of shadow data teams, so if you haven’t heard the term before, a shadow data team is typically your Excel jockeys, the ones who are like Susan, who downloads from the reports that are available in whatever report platform you’re using, and then does great things just by moving and grooving within Excel.

And what’s common often, for a lot of reasons, is that the shadow data team becomes frustrated and stops interacting with the tech team because in previous instances, they tried to change an existing report, and often there’s a disconnect because the tech team does not understand what the business user is asking for, and so they do their best to deliver the change or deliver exactly as it was requested, and—

Brian: [buzzer sound][laugh].

Karen: —creates more frustration, and it just perpetuates itself, right? And so, typically because of the nature of how I’m working with folks coming in with Hakkōda, we are working with Snowflake, somebody’s decided, “Yes, we want to use Snowflake.” And so, then I’m working with somebody who is, usually, some form of a data platform leader or interacting from that capacity. Not always, but that’s where the discussion originates.

And so, then that’s a part of understanding, often, first, clearly identifying the nature of the tech team. Do they see themselves as a cost center? Do they see themselves as an order taker? And first establishing that. And so, if they only see the levers available to them and the tech team as being improving operational efficiency or saving money, then that’s—well, the first thing to consider in terms of how they see data and how they interact in the organization, but also, if they’re an order taker, meaning they’re receiving tech requests to basically support all the other business units, then that also reflects the nature of how they sit, and interact, and their opportunity to do more and to get buy-in from Susan in finance, for example.

And so first, my checkbox is typically, what kind of tech team is it that I’m working with, and then based upon that understanding, if we do have a little bit of challenge in terms of how the tech team is perceived or how they work with Susan in finance or the shadow data team, then it’s a matter of finding trust and the opportunity to reach out to others in a structured format, ask about said problem, and make sure that Susan is getting what she needs as it relates to interacting with Snowflake. And sometimes, that is all I need to get started as part of our understanding of—typically we’re either improving an existing Snowflake environment or taking somebody net-new and walking them through the experience and making sure it’s a positive experience each step along the way. And as part of that, to make sure we’re engaging the folks who will be on the receiving end to consume whatever is put out by that data that’s been landed into Snowflake, and that’s a great opportunity. What I have observed, when that’s not an option, is when there’s someone on the tech team who is concerned about even being able to get a meeting, and the effect of what that will be as to even asking questions of Susan, and the consequences of that. That’s usually my tell of, hey, it sounds like there’s a little bit more work that needs to be done to build bridges that’s beyond what I’m able to do [laugh] with that team.

And so, that’s more really understanding the nature of how everybody’s working at the moment, and how I can best partner to show—I call it the WIIFM: What’s In It For Me—to help those within the data team to make their lives better as it relates to their current processes and optimizing that through their experience working with us, but then also with, effectively, their customers of what is Susan trying to do, and what are her—it’s your empathy mapping, right?—what are her concerns, what are her frustrations, and how can we coordinate that with what it is that we’re doing with our moving data or optimizing existing Snowflake implementations to help serve her needs where they aren’t being met at the moment?

Brian: Let me play seasoned technology leader on the other side of the conversation. “Yeah, Karen, that’s great. We actually—we already talked to Susan. We already got requirements from her team. We know what they need already, so we don’t really need to go do that. Like, what we really need is, I need it to do X, and it needs to do Y, and latency needs to be this, and it needs to have this kind of SLA, and it needs…”

Where do you go with those conversations? Do you kind of let that go? Do you prod a little bit? It sounds like you want to represent the voice of Susan at that table when she’s not there. How do you dance with that? Like [laugh]. Or do you? Is that—am I making up a fake scenario that doesn’t really happen? I don’t know if you have the loud tech bro or a seasoned tech, like, leader?

Karen: No. I mean, some version of that has happened for me as well, and I’ve had feedback of, “We have business analysts, and that’s their job.” So, I’ve worked with that of, “Okay, great.” I don’t want to keep anybody out of the experience. It’s a party together. Let’s all collaborate and bring them on board. If there’s feedback, which is not unusual, of, hey, there’s the process: we establish what is the problem we’re solving, we collect requirements, we ask them, they tell us, and we deliver it, right?

Brian: Why are we making this complicated [laugh]?

Karen: Exac—“Don’t overthink it, Karen. Come on. I don’t understand what we’re talking about here.” And that’s usually more where… of course, I want to make sure and honor what you’ve already done, all the hard work it took to get to where we are now, and then that’s more where I say, “Great. All right. So, then I’d love to meet Susan and talk through it together, if that’s an option, to first ask that directly of just—to further understand just to make sure that we’re all on the same page, and if possible”—and this is something that I learned from… from you—I learned it from you, Brian—

Brian: [laugh].

Karen: —that I get visual with prototyping. I mean, I think the greatest challenge that we have in technology is not technology, it’s the people, and understanding how we’re using the technology to meet our needs. And so, when you understand that technology is the enabler and not the outcome, then taking two steps back and saying, “Okay, let’s make sure we’re on the same page.” Because communications is a very difficult thing, and you can be in the room talking about even these requirements that you’ve already collected, and everybody in their mind has a different vision that’s playing out of what that looks like, and how do you get some clarity or disagreement on the table of, like, that’s not what I meant at all, so you can move forward and unpack that in whatever way—usually, I try to understand, like, hey, do you use Miro? If not, okay, well, what else do you guys use?

Another fun one is, like, “What do you mean, what do we use for communication? We use our agendas.” It’s like, “Great, okay,” for communicating back and forth, and try to work with existing tools in their environment where the data team and Susan are, and figure out a quick way to get visual. And maybe ask Susan—and you do this all the time—hey, can you walk me through said situation when this is happening, instead of asking a direct question, to say, “Hey, let’s do a ride-along and just talk through what your experience is.” And to do the validation, if they’re up for it, with some type of visual of—and I’ve done this, too: “This is what I’m thinking, and I may be very wrong. Can you tell me?”

Brian: [laugh].

Karen: “What do you see here?” I mean, and that’s really the point more is that the earlier you get in the conversation to validate what we’re trying to do, why we’re trying to do it, it’s the five questions, you know, of, what is it going to look like when we’re done? What will be different that you weren’t able to do before? Things like that, and get into those questions. And getting back to your original point, yeah, it’s not uncommon to say, “Yeah, we already have requirements, let’s go.”

If that’s not an option, then, hey, I’ll ask and yield to whatever response I get. If I don’t get the go-ahead, I’ve learned also, there may be many reasons well beyond what I’m trying to clarify that could be going on that I just don’t know about, so that’s okay. Keep working and see if other opportunities present themselves the more I get to know the environment, the people, the culture, if you will, to see how to better interact with folks. I’m also prone—[laugh] as you mentioned, strong opinions—to step right in it when I ask questions of, whatever the taboo or off-limits thing is, I just go straight for it [laugh] and bring that up as well, as part of asking questions, and learn that way too.

Brian: Isn’t that what they say in, like, therapy? The thing you don’t feel like talking about is probably the one you need to be talking about? [laugh].

Karen: Right. I have good spidey senses, and sometimes they’re good ones [laugh] that I go straight for it when I start discussing, and then not realizing that it’s a bigger issue than at least what I thought it was when I asked the question.

Brian: Yeah. If you believe in all this approach, is it because you’ve seen it, it just makes sense to you, or you have some, like, before/after story, or you’ve had, like, a recent—you can recount a recent, call it a win, or some experience where a client maybe was resistant to this, you got Susan into the fold, and somehow light bulbs went off? Like, I don’t know, and maybe it’s not just linear and, like, oh, there’s bad and there’s good and this was good.

But can you recount any story that might be useful for our listeners to know about in terms of dancing with this, and maybe getting access to this person or getting the team to even begin to give access to someone like Susan because they realize, like, at the end of the day, if Susan doesn’t give a shit, frankly, then it doesn’t really matter. We just spent a ton of money on Snowflake and it’s going to sit in the cloud. [laugh] literally in the clouds.

Karen: Yeah, I mean, wow, there’s so many stories. I’ll preface that I’ve been working with data for over 20 years, and I started off working with a team of developers to create a custom application for hard money lenders in Las Vegas—you can Google hard money lender on your own there, but just to give context of a very, very niche type of application—we didn’t finish the project, ever. And what happened was, I didn’t know enough in my first experience of overseeing delivery, it being more of collecting and validating requirements, what it looks like for delivery, things that we know now—oh, everybody knows what Agile and Scrum are. That wasn’t so much of a thing—I’m old enough to say—when I first started. And then since then, I’ve gone through variations of delivering technology solutions, and have learned every single time, the issue comes when it’s either not communicating well what the end will look like, and further, the perception of the people that are offering requirements of what will happen to their jobs afterwards. All of it always comes down to the people.

And I have lots of scar tissue by following many other approaches that were just, no, this is how we do it. For example, asking for requirements. I have my PMP and followed a traditional PMP approach, which I know can be a controversial topic. I have background as a scrum master as well. And understanding all the component parts of what you need to do to come across the finish line, I’ve learned a lot of the ways that don’t work [laugh] to arrive at where I am so passionate about data product, which is the holistic approach to quickly sew together what needs to be done.

So, the example I'll bring up is a client I worked with who had agreed on Snowflake. They came to the table wanting to start with a sample report that was causing operational issues—latency on their existing platform. I don't want to name-drop all sorts of tech examples, but I'll say it's—

Brian: That’s okay.

Karen: But they came from another platform. The problem was that it was a report used at month end, and everyone used it as the metric to see whether they were hitting their success metrics for forecasts. So, the solution was to optimize the platform to address the latency—and we did. The report needed over five minutes to run, and there had to be a lot of juggling of resources—scaling out or repointing whatever was needed to run the queries and the data, et cetera. So, that was done. The goal was for that to be the start of pursuing, I'll say, a version of data products: working with greater agility, and pursuing the concept of a DataOps platform so that there could be more of a relationship, or more value, within the technology team.

What happened prior to our arrival was that these shadow data teams went and hired their own data scientists [laugh], started working on their own, asked for access to the data, and then just started creating their own intelligence and data products and agents. And so, there was a disconnect as the tech team tried to catch up with the different departments and bring them back to the table after the horse had left the barn. So, [laugh] after all that access was granted to these different groups, the question became: how do you rein everybody in? Take Susan, for example. Susan was trying to qualify what needed to happen, or why we even wanted to talk to her team, because they were already doing what they wanted to do. And separately, the data team was very clear on what they were there to do, and there was a hesitation to reach out without clarity on what we were asking for from these folks who had already taken off—because they were already aware, effectively, of the challenges.

And I think that's where, in this case, I wanted to have a conversation about doing a design sprint. I went to the tech leader and said, "Hey, let's bring them back to the table to understand what's going on. We already know the strategic goals for the year. Is there a big problem on your list that we can work through together?" And so, I painted the picture—get visual, right?—of who, based on what I knew, would be part of the design sprint. And their feedback was, "We have people who collect requirements already. If we're not getting what we need, you need to tighten up your project management approach."

And so, I think there was greater hesitation to ask all the folks to come together to have the conversation. And that was a learning experience for me. I have my own—I call them my "Karen retros"—of, like, man, what would I have done differently [laugh] there?

Brian: Karen retros. That’s awesome.

Karen: In that instance, I could see the opportunity. They were clearly bought into the concept, in some way, of getting to the data. But how do we bridge—or bypass, really—the chasm between the tech team and their perception? In this case, they spent more time explaining the data to the folks who had access to it than actually facilitating the capture of their analytics requirements. And that's something I'm seeing a lot: on one side, a lot of folks who have worked with data platforms their entire careers—and achieved success, obviously, if they're in leadership roles—get their feathers ruffled [laugh] when asked directly how what they and their group do ties to the strategic goals. I think that's a common, reasonable question, and, to your point, it goes straight to the heart of how you are providing value. But if you can't explain it—for someone who has been working with technology, that question is often foreign, just not something they do on a day-to-day basis. And if you've been in more traditional tech environments, you don't see that as your role either.

Brian: Yeah. I wanted to ask you kind of a closing thought on platforms. I have this theory that if you think of data products as platforms and reusable assets, that the chance of delivering outcomes, like, improving people’s lives and fiscal value is less; it’s harder. And the reason is, if you’re one or more hops away, then you live in the world of classes of problems that you need a platform to solve, not instantiations of specific problems. So, because it’s classes of problems, it’s easier to imagine, well, someone might like the calculation monthly, so we should include that, too. And with this kind of thinking, it’s very ea—

Karen: Build it, and they will come, type thing [laugh].

Brian: Well, right, it’s very easy to imagine that. And in some ways, you kind of have to do a little of that, but validating any one choice is hard, and it doesn’t account for the fact that, like, we can make things worse as we keep adding stuff. We can actually make things harder or worse. So, I don’t know. It’s just a theory I have, but I was curious to throw that at you.

And I—just, kind of, closing thought—any closing thoughts on that? I’m wondering how much you all work backwards from instantiations of problems, or you get requests to build Snowflake implementations to solve classes of problems, like, “Well, we have these finance things that need to happen at the end of the month, and finance people need customer data, product data, and whatever data, and it’s really slow, so can you make it faster?” Like, but we really haven’t talked about what Susan is doing. We don’t really know whether or not we made an impact on Susan’s life, which will then turn into either cost savings or maybe additional revenue or whatever because we’re talking about classes of finance department problems. I don’t know. Any thoughts on that, being someone that works in platforms?

Karen: It's a great question. And a problem I often have is that after talking to folks, I sometimes arrive at seeing or understanding the underlying problem, and then take the folks we're working with in a direction that solves a problem they haven't recognized yet. And if they don't recognize it as a problem, then obviously that's a pretty [laugh] difficult conversation. So, then you take a few steps back, and often it is that we're trying to improve the latency of the month-end report. When you talk about classes of issues, we're solving the problem they've identified—the report isn't moving very quickly when they try to run it, and people get frustrated waiting to see those results.

However, yes, to your point, the underlying question would be: what are you doing with that report, and how is it answering a question tied to your goals? That's really more where you want to go. But when you talk about a platform, what I often identify is that you can't automate something that you haven't defined. For example, if you don't have clarity on your tagging approach for your PII, or on the nature of all the metadata you're capturing for your data assets and what it means or how it's handled—I'm going to hand-wave here—to make it good, then how could you possibly automate any of this that hasn't been defined? And that's usually where I get interesting responses. If you're a technologist, you're often looking for solutions, and so there's the joke—one I've heard many times—about data contracts. So—

Brian: Mm-hm.

Karen: Data contracts are there to programmatically capture all the touchpoints: where's the data coming from? Where's it going? What does the data need to do, or what state does it need to be in, to make it usable? When often, you can work a lot more efficiently by just talking to the end-user [laugh]. And so then the question is, okay—well, I think it's a start, and everybody's trying their best.
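A data contract, in the sense Karen describes, records where data comes from, where it goes, and the state it must be in to be usable. The sketch below is purely illustrative—the field names, the `DataContract` class, and the validation logic are hypothetical, not drawn from any specific data-contract framework or from Hakkoda's tooling:

```python
from dataclasses import dataclass


@dataclass
class DataContract:
    """Illustrative data contract capturing the touchpoints described above."""
    source: str                 # where the data comes from
    destination: str            # where the data is going
    required_columns: dict      # column name -> expected type (the state it must be in)

    def validate(self, rows: list) -> list:
        """Return a list of violations; an empty list means the batch honors the contract."""
        violations = []
        for i, row in enumerate(rows):
            for col, typ in self.required_columns.items():
                if col not in row:
                    violations.append(f"row {i}: missing column '{col}'")
                elif not isinstance(row[col], typ):
                    violations.append(f"row {i}: column '{col}' is not {typ.__name__}")
        return violations


# Hypothetical example: a month-end revenue feed into a reporting warehouse
contract = DataContract(
    source="erp.sales_orders",
    destination="warehouse.month_end_revenue",
    required_columns={"order_id": str, "amount": float},
)
good = [{"order_id": "A-1", "amount": 125.0}]
bad = [{"order_id": "A-2"}]  # missing 'amount'
print(contract.validate(good))  # []
print(contract.validate(bad))   # one violation for the missing column
```

The point of encoding the contract is that the same checks can run automatically on every load, which is exactly why undefined expectations (like an unclear PII tagging approach) block automation: there is nothing concrete to validate against.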

And so yes, you're solving problems—in this case, there was a material improvement on month-end close reporting. And where I ended up going with them is: okay, now that we did it, nothing upsets an end-user more than lifting-and-shifting an existing report, with the same problems it had, into a new solution they've never used before [laugh]. So, maybe give them something they can get excited about, and use that as an opportunity to open that conversation with Susan. And in this case, we did make inroads: everybody agreed on broader cross-department, cross-business-unit conversations to bridge that gap. And that was a positive.

However, yes, it is a challenge of just moving beyond your traditional understanding of technology, how I interact with it, how that aligns with my gainful employment, and also helping others in meeting their goals and needs. And so yeah, it’s very much [laugh] a conversation of understanding everyone, and their relationships, and making sure that’s a part of the conversation.

Brian: Yeah, yeah. I really like that, during our chat, this has been a great chat with you, and I’m glad Susan was here with us, and—

Karen: [laugh] Right.

Brian: We got to talk—we got to at least talk about Susan, talk about Susan there [laugh], you know [laugh]? And that you’re having that humans-centered focus to the work you’re doing. I think that’s great. So, February 27, 2024, 10 a.m. Pacific, I think, 1 p.m. Eastern, we’re going to be giving this webinar.

Any closing thoughts on the webinar? Is there anything I didn't ask you that you might want to share? I hope folks listening here can join us. Apply for the DPLC if you'd like to come check out Karen's talk, and then we're going to have a group discussion. Really, our talks are fodder for creating a discussion, so it's low on the talking-head part and high on the conversation part—that's kind of what we're going for with those. But any closing thoughts on your talk or this conversation?

Karen: Absolutely. What I plan on discussing in our group is an anecdote I'll walk everybody through—more of the anatomy of an early-maturity data product experience. I have some strong opinions [laugh], and I'll be putting them out there for conversation along the way, challenging things that are said—I won't say common opinions, but things you do hear [laugh] regularly among data folks. And admittedly, I had to pivot quickly [laugh] when you asked your question to find another example, so I'm sorry it wasn't as well articulated as it could have been. But the point is that early maturity may look different in many ways depending on the nature of your business, the structure of your data team, and how it interacts with folks. But there are constants, and that's more where I want to talk through it and challenge a lot of folks: just because it's been done that way, and that's the accepted way, is that the right way? And what would you do differently? We'll work through that experience in the webinar.

Brian: Cool. I’m looking forward to hearing the conversation around that. So, thanks again for coming. Where’s the best place for people to find you? Are you LinkedIn? Twitter? Where do you do your thing [laugh]?

Karen: I do my thing on LinkedIn. I think that’s really what I have time for [laugh].

Brian: Yeah, yeah.

Karen: I will be more active, but that’s really where it’s at. And that’s where you can find me and also share some strong opinions with me along the way.

Brian: Nice. Excellent. Well, Karen from Hakkōda—that's hakkoda.io. If you just type "Hakkōda," you'll find Karen's company's work there. And Karen Meppen—I think you're Karen M on—

Karen: I am.

Brian: LinkedIn, correct? M period. Karen M period. So, not spelled out.

Karen: Due to my privacy rabbit-hole that I’ve gone down [laugh] as well.

Brian: That’s okay. I hope I didn’t break it right now, sharing your LinkedIn secrets [laugh].

Karen: No, no. No, no [laugh].

Brian: Cool. Karen, it’s been great to chat with you, and thanks for coming on Experiencing Data.

Karen: You bet. It was an absolute pleasure, and I look forward to chatting with the DPLC soon.
