101 – Insights on Framing IoT Solutions as Data Products and Lessons Learned from Katy Pusch

Experiencing Data with Brian T. O'Neill

Today I’m chatting with Katy Pusch, Senior Director of Product and Integration for Cox2M. Katy describes the lessons she’s learned around making sure that the “juice is always worth the squeeze” for new users to adopt data solutions into their workflow. She also explains the methodologies she’d recommend to data & analytics professionals to ensure their IoT and data products are widely adopted. Listen in to find out why this former analyst turned data product leader feels it’s crucial to focus on more than just delivering data or AI solutions, and how spending more time upfront performing qualitative research on users can wind up being more efficient in the long run than jumping straight into development.

Highlights/ Skip to:

  • What Katy does at Cox2M, and why the data product manager role is so hard to define (01:07)
  • Defining the value of the data in workflows and how that’s approached at Cox2M (03:13)
  • Who buys from Cox2M and the customer problems that Katy’s product solves (05:57)
  • How Katy approaches the zero-to-one process of taking IoT sensor data and turning it into a customer experience that provides a valuable solution (08:00)
  • What Katy feels best motivates the adoption of a new solution for users (13:21)
  • Katy describes how she spends more time upfront before development to ensure she’s solving the right problems for users (16:13)
  • Katy’s views on the importance of data science & analytics pros being able to communicate in the language of their audience (20:47)
  • The differences Katy sees between designing data products for sophisticated data users vs a broader audience (24:13)
  • The methods Katy uses to effectively perform qualitative research and her triangulation method to surface the real needs of end users (27:29)
  • Katy’s views on the most valuable skills for future data product managers (35:24)

Quotes from Today’s Episode

  • “I’ve had the opportunity to get a little bit closer to our customers than I was in the beginning parts of my tenure here at Cox2M. And it’s just like a SaaS product in the sense that the quality of your data is still dependent on your customers’ workflows and their ability to engage in workflows that supply accurate data. And it’s been a little bit enlightening to realize that the same is true for IoT.” – Katy Pusch (02:11)
  • “Providing insights to executives that are [simply] interesting is not really very impactful. You want to provide things that are actionable and that drive the business forward.” – Katy Pusch (4:43)
  • “So, there’s one side of it, which is [the] happy path: figure out a way to embed your product in the customer’s existing workflow. That’s where the most success happens. But in the situation we find ourselves in right now with [this IoT solution], we do have to ask them to change their workflow.” – Katy Pusch (12:46)
  • “And the way to communicate [the insight to other stakeholders] is not with being more precise with your numbers [or adding] statistics. It’s just to communicate the output of your analysis more clearly to the person who needs to be able to make a decision.” – Katy Pusch (23:15)
  • “You have to define ‘What decision is my user making on a repeated basis that is worth building something that it does automatically?’ And so, you say, ‘What are the questions that my user needs answers to on a repeated basis?’ … At its essence, you’re answering three or four questions for that user [that] have to be the most important [...] questions for your user to add value. And that can be a difficult thing to derive with confidence.” – Katy Pusch (25:55)
  • “The piece of workflow [on the IoT side] that’s really impactful there is we’re asking for an even higher degree of change management in that case because we’re asking them to attach this device to their vehicle, and then detach it at a different point in time and there’s a procedure in the solution to allow for that, but someone at the dealership has to engage in that process. So, there’s a change management in the workflow that the juice has to be worth the squeeze to encourage a customer to embark in that journey with you.” – Katy Pusch (12:08)
  • “Finding people in your organization who have the appetite to be cross-functionally educated, particularly in a data arena, is very important [to] help close some of those communication gaps.” – Katy Pusch (37:03)


LinkedIn: https://www.linkedin.com/in/katypusch/


Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today, I have Katy Pusch on the line from Cox2M. How are you, Katy?

Katy: I am wonderful. How are you, Brian?

Brian: I’m doing great. And we’re going to talk about data products, and you’ve lived and swam through the soup of these things before and have some lessons learned here. Currently, you’re at Cox2M; you’re the Senior Director of Product and Integration. What the heck does that mean?

Katy: I had that same question after I started; the integration part in particular is a little non-standard. At Cox2M we work on IoT products—Internet of Things products—and part of what attracted me to the role is there is a ton of data in IoT. Product, that part is fairly self-explanatory: product management. The integration part, we really emphasize having a holistic solution that incorporates the different pieces of technology that we build at Cox2M. And so, that’s the integration piece, but I oversee product management and user experience. We have five different market verticals at Cox2M that we work in across automotive, and smart cities, and all sorts of fun and intriguing spaces. So, that’s what I do at the moment.

Brian: In the past, you know, before I worked on anything that involved hardware-software—what we call IoT—my brain as a designer would immediately jump to, like, the hardware, and, fast forward, when I hear IoT, I’m just thinking, we’re talking about data products here. Like, the sensors are just, like, the infrastructure to get access to data that helps us do stuff. Is that the framing you think about as well when you hang in the IoT space?

Katy: When I first joined Cox2M, I did think about it that way. Especially in the past several months, I’ve had the opportunity to get a little bit closer to our customers than I was in the beginning parts of my tenure here at Cox2M. And it’s just a SaaS product in the sense that the quality of your data is still dependent on your customers’ workflows and their ability to engage in workflows that supply accurate data. And it’s been a little bit enlightening to realize that the same is true for IoT, right? It’s not, like, a fully foolproof interaction that you’re engaging with.

And so, the user experience in that hardware piece is still actually very, very wholly important in ensuring that the devices are being used and the processes support getting clean data from IoT devices. So, it is ultimately all about the data, but there’s really a focus on the workflow on the front end to make sure that the user experience of your product is facilitating getting the data that you then need to create the insights to facilitate the other aspect of the user experience. So, it’s all very interrelated in a very interesting way.

Brian: What you just said about workflow, this is—for me, I want to ask you if you kind of have a definition of data product, but when I think about it, this end-to-end concept where we’re thinking about workflow, and in your case, we’re thinking about data collection as well, the hardware that’s involved—and maybe you can tell us a little bit about what the sensor—what is the hardware involved in the IoT piece and whether or not you’re designing those interfaces and/or the hardware itself so that you can facilitate the downstream objectives that you have, or whether it’s third-party. So, I’d love you to talk about that, especially with the metrics toilet syndrome that I’ve seen before, which is like, because this device captures all this telemetry, it must be useful and we should have charting to display it. And we don’t know how anyone would need it, but we’re so afraid of taking away stuff because they put it in there, so the temperature outside must be relevant to the sensor for somebody, but no one can explain why they need to know the temperature outside to do X, which seems totally unrelated. Tell me, I don’t—[laugh] is this familiar to you? Or I don’t know. [laugh].

Katy: Well, I am, I am, I’m smiling. For a specific reason. I want to back up and give your listeners the context of earlier in my career before I made it into product, and you know this, but your listeners may not. I started out as a data analyst. I came up through data analytics, and so very familiar with this conundrum from an analyst perspective of, you know, the answer is 42. Why does that matter?

Like, providing insights to executives that are interesting is not really very impactful. You want to provide things that are actionable and that drive the business forward. And so, coming into product from that background, yes, sometimes it makes me smile and reflect and try to engage in a constructive, educating kind of a conversation about, like, “Well, these conversations we hear about data monetization, like, just because data exists does not make it valuable. We need to understand what it’s driving at and what problem it’s solving.” And actually, there’s some very real pragmatic constraints in IoT devices about the size of the data packets that are being sent through, so I do think it’s important to have that prioritized discussion about how often you need certain types of data being sent through and what insights they’re driving, rather than the blanket approach that’s so often touted of just, “Well, it’s data and therefore, we must need it for something,” [laugh] “Whether we figured out what or not.” And similarly, the data that we collect from these sensors is all for a purpose. It’s all for a specific insight that we want to derive and send through ultimately to the software user interface of our products to make use of. So.

Brian: I want to take this out of the abstract and make it physical and visual for people. What is the sensors that you’re working with Cox2M? Like, what’s the business or user workflow problems that you’re solving? Kind of what does that end-to-end, from hardware to software? Tell us, like, why would I buy whatever you’re selling at Cox2M?

Katy: We are primarily a B2B service provider, and so some people may not be familiar with us. But we do have one vertical called Kayo that is sold on Amazon. I’m going to focus on the automotive side of our business because there’s some nice, you know, just congruency there. So, for instance, the hardware that we sell, we call it an OBD connector internally—there’s probably something more marketing-friendly that I should be calling it—but it’s a piece of hardware that plugs into the onboard diagnostic port of a vehicle. And so, if you apply this in an automotive sales perspective, for instance, with car dealerships, what that does is it helps them track the location of the vehicle, you know, check engine lights, or things that they might need to be aware of about the vehicle’s health while it’s sitting on their lot, which we call ‘voice of the car.’

So, there’s some things that a car dealership wants to be aware of about the vehicles, the assets, in this case, and its care. From a small business fleet perspective, we also created a product called Kayo which small business owners can purchase on Amazon, actually, and there’s an app experience where you can track your small business fleet and understand where it’s been and where it’s going, send communications to the driver, you get that same sort of notification, voice of the cars, is there a piece of maintenance that needs to be done on this car, so that you encourage the drivers who are caring for your vehicles of your small business to make sure that they’re maintaining your vehicles and prolonging the life of your asset. That’s a lot of what we do. And it sounds very simple at the highest level, but there are a ton of very interesting algorithms and technical considerations when it comes into something that sounds very simple, like just keeping an eye on the location of a vehicle. [laugh]. So.

Brian: So, accuracy of that information.

Katy: Yes. Yeah.

Brian: So, you mentioned workflow earlier. So, if this is something new, help my audience understand, like, how do you start what I call the zero-to-one? How do you get to a design, if you walked in the door as a user experience person or product person, it’s like, oh, we have all these sensors that track vehicle locations and there’s supposed to be a digital product at the end that’s going to solve some business problem, and then there’s all this stuff in between that. How does one approach turning this into an experience that solves a user’s pain that they’re willing to pay money for? How do you approach that work, especially in those first iterations, in the early stages?

Katy: I wasn’t here for Cox2M’s zero-to-one. I came in a little bit later to help start to scale it. So, I want to use an example that I was actually part of, [laugh] which isn’t IoT, but still data product-centric. So, for me, the zero-to-one comes from ensuring that we deeply understand either the decisions—and again, my background is B2B, so I’m going to talk in terms of business decision-makers and things of that nature—but the decision that a business user needs to make or the action that they need to take to improve their business outcomes. So, the first thing that I always look at is actually a holistic kind of market strategy of here’s the decision maker who’s going to decide whether it’s worth spending money on my product or not. What strategically is keeping them up at night?

If I can answer that question, and then I layer—I’m making a Venn diagram, right? So, one side of it is what’s the strategic problem that’s keeping them up at night? The other side of the Venn diagram is what are our technical capabilities or ability to gather data about that could help alleviate one or more of those strategic concerns? And then the overlapping space of the Venn diagram is where I focus my efforts on creating prototypes and taking it back to those business leaders and saying, “Does something like this help you move forward? Would you spend money on this? How much money would you spend on this?”

So, and then build that MVP as expediently and inexpensively as possible to prove out your hypothesis that this is a market worth capturing, which has taken some trial and error, right? The first times I worked through this cycle, it was painful and long, right, longer than it should have been and had some stumbles, so I certainly haven’t had perfect adoption from some of my earliest products. But over time, that’s the playbook that I’ve developed, right? Understand what people will pay money for and how that overlaps with our ability to solve problems for them. Build that as quickly and cheaply as possible, prove it out, and then gain market traction in the way that industry supports, right?

I’ve worked in some industries where you needed, like, a big bang industry event launch kind of a thing in order to gain market traction; I’ve worked in others that were more wholly based in word of mouth, so you needed a center of just really fanatical customers who wanted to talk to their peers about you. So, the market dynamics of how word spreads about a product’s usefulness are also really important to understand. And I’ve been lucky that I’ve had some really smart marketing partners in my past who have helped me figure out that side of the equation, but I don’t want to leave it out. So, the workflow aspect of that—sorry, to bring it all back to the question that you actually asked is, if you understand the decision that needs to be made or the action that needs to happen, you also have to understand that that’s not happening in a vacuum. So, what are they doing before or after they go in search of this information?

And how can you embed your data product in the most optimal way to ensure that the data is not just actionable, but it’s also timely and it’s embedded in something that they’re already doing? Because people are busy, and they’re thinking about a million things, and they weren’t using your product before, so it’s a lot to ask them to completely change the way that they’re doing things to, like, stop and be like, “Oh, there’s a dashboard for this someplace that I should—what’s the URL for that again?” Like, “What should I be—” So, embedding your data product within a user’s existing workflow, in some way, finding a way to anchor it there has been really important. And then similarly, the workflow that I was alluding to on the IoT side, you know, again, I wasn’t here for the zero-to-one, but the piece of workflow that’s really impactful there is we’re asking for an even higher degree of change management in that case because we’re asking them to attach this device to their vehicle at some point, and then detach it at a different point in time and, right, there’s a procedure in the solution to allow for that, but someone at the dealership, for instance, has to engage in that process. So, there’s a change management in the workflow that the juice has to be worth the squeeze to encourage a customer to embark in that journey with you.

So, there’s one side of it, which is happy path, figure out a way to embed your product in the customer’s existing workflow. That’s where the most success happens. But in the situation we find ourselves in right now with IoT, we do have to ask them to change their workflow. But that’s a very, very intentional aspect of our product adoption that we have to be mindful about, providing best practices about who’s the role in your organization who we’ve seen this be most successful when it sits with, and you know, what should the ideal operation be, so that we’re not asking a customer to start from scratch and figure it all out themselves?


Brian: When you’re asking these customers—and I might call them users and just use those interchangeably, but you stop me if you don’t want to do it that way—when you’re asking them to use something new, such as, “Yes, I do need you to depart and go to this URL because that’s where all this insight lives,” or whatever, is the correlation of adoption and usage purely a function of how big of a pain does this relieve for me, what the reward is? Or is there more to it? Yeah, it’s a giant pain and, like, there’s no more thefts of test drive vehicles, or whatever the reason is, like, a dealership wants this, is that still not enough? Or do you find it is, if you simply—if you can relieve a giant pain that, like you said, keeping people up at night. If you can take that away, they’re going to use it?

Katy: I do agree with that framing of, you know, there’s two sides of the equation and the benefit side has to be greater than the cost side for someone to change their behavior to do it. And I think that’s why the starting out point of what are the strategic priorities, what keeps you up at night are so important is because the reality is, you try to minimize it as much as possible, but there’s some impact to workflow, there’s some new stuff that a user needs to engage in. So, you try to minimize that, but you also need to maximize the value of the pain that you’re alleviating in order to encourage that behavior change, in my experience. And so, my very first data product that I launched was—I’m biased; I would like to say it was impressive from, like, from a data capability standpoint, but it was absolutely not embedded in anyone’s workflow. It had really poor user adoption and, you know, was a great learning experience of this particular company on our path to, you know, more advanced data products, but certainly, I wouldn’t call it a success in its own right.

And that was really the key lesson learned from that was that the dashboards were too dense, it wasn’t clear what someone was supposed to get out of it, the problem that we were solving was actually not the biggest strategic problem for the decision makers. And so, we had one very small, very devoted set of users who really valued the data that was contained in this analytics product, but to a broader extent, no one could be bothered to care that it existed, right? [laugh]. It was too great a change, it was too difficult to consume, and the juice wasn’t worth the squeeze.

And compare that to a later product, where we did the work on the front-end to understand how do people want to consume this information and what are the key decisions that they’re making, at what point in time, and really put forward an effort to having it be that point-in-time delivery, right? And that product achieved rather rapid market traction and was a great success. And that’s really the key difference.

Brian: When you talked about spending the time upfront, on the front-end, are you talking about user research and design and getting all the different technical people in the same room together with product management? Is it that kind of thing or something else?

Katy: That is a piece of it. That part’s necessary. The user experience partners that I was paired up with were also different in those two journeys. And so, there’s a lot of variables that changed, but keep—so keep in mind, I was younger in my career as well, had some lessons to learn, and I also didn’t have this focus on user experience. So bleh. Kind of bleh, dense dashboards.

And the second time, being paired with someone who is very mature in their perspective on user experience and being able to translate this silly little Excel product—because I think in data—so the silly little Excel prototype that I provided of, like, this is what I’m trying to get across with this Excel prototype, and then converting it into this beautiful prototype that we were able to then go and test with customers and change before we even hit production, or you know, development of what would be our production product. Because we needed to ensure—we happened to partner with a BI platform to build this on top of to speed our time-to-market, right, but we needed to make sure that the underlying partner that we worked with was able to meet the output that we knew was going to be successful in the market. So, we actually started with the user experience and then worked backwards to make sure—there’s technical constraints always, right? But we needed to make sure that we were accepting the technical constraints that we were comfortable with and that we were optimizing for the best user experience possible so that we would get the adoption and the value.

Brian: I’m always interested in, how do we bring design and user experience into places where the primary makers of the solutions—I will call them designers because you can’t not make a design choice; you can just make unintentional ones—so if your team is made up primarily of data scientists and analysts, how and when do you know when to bring this in? And can you paint a clearer picture of what the difference is or what you’ve seen change as you’ve changed the way you do this, now? If you were to change jobs right now, go into a similar IoT space, just different company, you have a methodology that you use that’s not the old one.

Katy: Right.

Brian: What was it that changed that made you value that and how you just kind of approach it?

Katy: I hadn’t ever put it in those words, but I think I realized what you just said is there’s no such thing as, you know, not having a design choice; there’s just unintentional ones. I think the really key thing that is always true, it’s just I think everyone goes on their own path to discovering that it’s true, it’s that it is much faster to design and prototype and find out what you don’t know that you don’t know upfront than it is to develop a thing and find it out later. You might, early on, feel this, like, rush of yes, we’re moving faster because we’re developing faster. But actually, that’s a false sense of winning.

Brian: Race to the outputs.

Katy: Yes. Yeah.

Brian: [laugh].

Katy: And so, I think that’s really one of the really key lessons that I learned through experience. And if there is a way to teach people that lesson without them experiencing it themselves, then I would love to know what that is because I think we’d save people a lot of pain, but I wonder if people just kind of have to [laugh] go through their own journeys to learning it. But if you build and launch a product that doesn’t have success in the market, then I think you—that’s not what anybody wants. That’s not what the engineers who contributed to that product want. That’s not what the designers want. That’s not what product wants. That’s not what the company wants. That’s not what anybody wants.

You realize, oh, I could have figured out a better way before we started development. So, I really think it was that lesson learned of seeing, you know, I’m proud that we got the product to market, but it wasn’t successful. And I think that unsuccessful products come with some very important lessons so that we carry with us to the next one. That one was just as a scar [laugh], you know, that I learned from. So, I think that’s really what changed.

And I think that has led to a series of just how do I optimize to get even better outcomes? How do I make sure that users are going to adopt this, that it adds value for the users, and therefore that it adds value for the company because it’s being adopted and people are willing to pay money for it. But I hate to say it’s as simple as that because it’s a painful lesson to learn, but it really is just seeing the outcome that you don’t want to happen and realizing that you have the power to prevent that.

Brian: Is there a different way that we need to work when we bring user experience and design work into the data space that’s not like it is in, say, traditional software or something like that? Are there things designers need to learn from data science? Are there things data science and analytics people need to learn from design? Is the dance different or is the dance mostly the same at the highest level?

Katy: I think one really important lesson that I had to learn, and was a painful lesson to learn, and I see almost every data analyst I’ve ever talked to goes through a similar kind of painful lesson of, we have this data skill set and we want to apply these sophisticated techniques, and we want to deliver all of these pieces of value to our decision-makers—whether that’s a user, whether that’s an executive sponsor, whatever—we want to deliver this value to the decision-makers, these insights. We speak a language of data and we sometimes take for granted, like, that we think everyone speaks that language. But not everyone speaks that language. And so, one of my earliest career experiences was being told by my manager who wasn’t a data person, she was a marketing person, but had very sage advice for me that I’ve kept [laugh] with me throughout my career, I was responsible for communicating these insights to our executive stakeholders, and she would say, “You need to add a key takeaway to that slide.” And I would say, “But it’s redundant. The chart already says that. Like, that’s clearly what the chart says. Why do I need to write it in English?”

Brian: Right. Write it in words? [laugh].

Katy: Yeah. And she tried to gently make me aware that that’s what the chart said to me, but that’s not what the chart said to everyone. I think there’s this language barrier that happens between data analysts and everybody else because we speak this language that we feel very fluent in and we’re like, “What is it that you’re not picking up on? It says it very clearly in this chart.” And that’s just not the way that everyone sees the world.

And I think that’s helpful and I think that’s a gift, but I think it can be a barrier to communication when you assume that everyone else can communicate the way that you communicate or see things the way that you see things. So, that’s the main thing I’m aware of, is that limitation on the data analyst side that I want more data analysts to be comfortable with what they often perceive to be dumbing it down, but I’ve come to not see it that way. It’s really just an effort to communicate more clearly with your audience. Because what does the insight matter if no one else grasped it, right? And the way to communicate it is not with being more precise with your numbers and adding an extra couple of fancy, you know, statistics. It’s just to communicate the output of your analysis more clearly to the person who needs to be able to make a decision.

And so, that intentionality with communicating, I think, is something that I learned over time which enabled me to work more successfully with that user experience designer later on in my career, who was able to translate what I was trying to communicate into something even better because I was able to be like, “I know this is the essence of what the decision maker needs to know. And since I can communicate that to you, can you help me translate it into a design that’s going to communicate that with the least effort possible for a user?” So, communication is what I’m going to boil that down to. Communication is difficult when you speak a language that you think everyone else speaks, but they don’t.

Brian: I think the challenge is even higher when we compare doing an ad hoc analysis on something, presenting a PowerPoint deck that’s probably been narrated and has plenty of space to, like, draw conclusions and then provide evidence, versus a self-service tool where there’s no human being telling you how it works. And so, we’re designing this for people that we may never meet one day and obviously, you can’t design for everyone equally. The challenge is even harder with self-service software tools doing this. Is that your perspective as well? To me the game is much harder. [laugh].

Katy: I agree. I have come to think that there are two main camps of data products—and I’m open to learning from experience to broaden this horizon as well—but my experience so far has taught me there are two main kinds of data products. There’s one that is designed for a more sophisticated data user that is really self-serve and I can make my own charts and graphs, and I can fully explore and click around and do what I need to do. But designing a tool that way in itself implies certain things about your user, which is that they understand how to use a business intelligence tool, and how to build charts and graphs, and how to interpret the data that lies within those charts and graphs. Those have an audience, in my experience, that needs to be targeted at a data user, at someone who has some level of sophisticated knowledge about how to interact with those things or is at least willing to put in the time to learn how to interact with one of those things and has more experienced knowledge about the data that’s being explored so that they can draw conclusions from that angle.

The other side, which is the side that I’ve ended up focusing on more as I progressed in my career because I found them to be more successful with a broader audience—but they take more work upfront—is that you have to define what decision is my user making on a repeated basis that is worth building something that it does automatically. And so, you say, “What are the questions that my user needs answers to on a repeated basis?” And then you probably have some filters or some slicing and dicing, based on their most common needs to see that, but at its essence, you’re answering three or four questions for that user. And they have to be the most important three or four questions for your user to add value. And that can be a difficult thing to derive with confidence.

And I think that’s the other reason that embracing the prototyping early on has been really important. Because if you spend six months building a product that automates the wrong three or four questions, and you don’t realize it till you hit the market, then everything about your underlying product might need to change, the way that you’ve designed your data pipelines might not facilitate answering the other three or four questions that you’ve now come to realize. And it’s not that that’s an insurmountable problem, but it’s certainly a problem that you don’t want to be dealing with post-launch.

Brian: Yeah, yeah. Well, not to mention just the technical baggage, the emotional—

Katy: Yes.

Brian: Baggage of, “Are you kidding me? We just—”

Katy: Yes.

Brian: “—like, spent all this time, and now we’re going to undo it?” Like [laugh].

Katy: And now there’s a lack of confidence in the organization. “Well, how do we know you’re right this time? Why should we build this one? Why won’t it just be the same story?” So, you know, building that conviction early on, that you’re solving the right problem, that you’re answering the most important question is really key for me.

Brian: A challenge I know for a lot of data product leaders is that a lot of times they’re asked for something, and the need is expressed as a solution: “I need a model that will do X.” They’ve already assumed that they need a machine learning model, and they already assume they know what telemetry or data they need to answer the question, but the question itself has not actually been stated yet. They just gave an answer. And then someone sees that as, “If I just give them exactly what they asked for, they should use it, right?” And you probably know, that’s not how it always works.

And we have to dig, we have to surface this, and you said it’s not always easy to figure out what the three or four things are. Is this where research comes in? Tell me, how do you get to the confidence level that you’ve figured out what those magic three or four things are? That if our data product answers these things routinely, we’re probably going to have a win, or at least we’ve really de-risked the chance that we’re going to build the wrong thing here. Is there a signal you feel, a spidey sense that goes off when you’ve hit on that? Or is there a method you use, like, “I never take anything at face value; I’m going to dig, dig, dig, dig, and then when I hit X, I’ll be convinced that we should make that”?

Katy: I don’t have any hard thresholds. And I think that’s the hard thing: early in my career, market research was coupled with data analysis for me. With market research being a part of it, I’ve developed a bit of intuition in the way that I navigate this, and that’s made it difficult to really prescribe to someone else what makes it successful when I do it.

I will say that I’m a big fan of triangulating. So, I’m a big fan of exploratory, qualitative interviews. I’m a big fan of NIHITO—Nothing Important Happens In The Office—and going out to a customer’s location to observe. There’s so much context you can pick up from the way the team talks to each other in their office, especially in a B2B context, right? How do they work together? That gets at a lot of the workflow stuff that’s really difficult to just ask questions about. You can go and see that, oh, this piece of paper gets transferred from this person’s desk to that person’s desk, and that never would have come up in an interview, right? Just little things like that.

Brian: Or they’re using paper? Like [laugh] like—

Katy: So much more common than you might expect.

Brian: Right.

Katy: Like, so many of—

Brian: Wait, there’s a fax machine involved here? [laugh].

Katy: Yes. Yeah. Yeah.

Brian: You did say car dealership. There’s probably a fax machine involved. [laugh].

Katy: There’s so much paper in the world still. You know, I’m a big fan of those things. I’m a big fan of going out and having qualitative exploratory interviews that get at, like, what’s the most strategic problem you need to solve? If they bring a solution to me, which is indeed common, I try to dig at, like, “Okay, what are you going to do with that? How are you going to engage with that solution? How does it change your behavior? How does it change what you do next?”

Because the output they’re really looking for is the way that it changes what happens next. And so, that’s been a helpful way to approach that specific topic. But I’m a fan of the qualitative interviews, and I’m also a fan of triangulating. So, I don’t just do interviews; there are also industry trends, trade shows, and subject matter expertise about what’s going on in the industry, all of which will tend to triangulate with the top strategic problems that your customers are telling you about. They don’t always fit as a one-to-one thing; they’re not going to be exact mirror images of each other, and I think that’s positive, because if they’re not exact, if you’re finding a little bit of space where your company can differentiate itself, then that’s actually ideal.

But you should start to find some puzzle pieces that fit together in your triangulation, right? So, if you have three or four different pieces of information (competitive, industry, customer, technology) that are pointing in the same cohesive direction, and you can find a differentiated value-add there, then that’s really the ideal space to be in. So, I don’t put all my eggs in one basket, but also because of that, if I have six customer conversations and I’m hearing basically the same thing, and I’ve got those other triangulating pieces of information, then I feel more confident moving forward, because it’s not really just those six conversations. And then the development process itself gives you opportunities to continue to challenge your own beliefs and tweak and adjust as you go, right? And I think that iterative feedback loop in decision-making has also been really key.

Brian: It can sound so fluffy and vague, what you get for doing this: “I went out and talked to somebody.” Is there something that surfaced in your automotive product, or one of these other products, as a learning that came out of going into the field and doing this kind of research or conversation, where it’s like, we never would have built this until we found out X? Something that’s just not going to come out otherwise, because they’re not going to write it down; it’s just their lived experience, and they’re not omniscient about their own life. The paper being passed, or, “Wait a second. What happened? You carried a paper?” “Yeah, I always do that, every morning at nine.” “What?” You know? Is there something like that that became part of a solution, or just an insight that sticks out, that came from doing that kind of work?

Katy: I don’t have a precise example like that, but the example I want to share that’s tangential is my first data product—the one that wasn’t all that successful in the market, as we’ve discussed. I was very proud of the methodology, which was that we figured out what decisions the users needed to make, and it was this top-line outcome that they were trying to improve. And then we applied what I started calling a driver model: what’s driving that output, so that you can do some data analysis to figure it out and fix the inputs to get a better output. It was just a regression model; we looked at all the data, ran a regression, and here are the features that feed into the output. Awesome.
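The “driver model” Katy describes is, at its core, an ordinary least squares regression: fit the top-line outcome against candidate input features and read the coefficients as the strength of each driver. The episode doesn’t specify the actual model, features, or data, so the following is only a minimal sketch of that idea, with invented feature names and numbers:

```python
# Toy "driver model": ordinary least squares fit of an outcome
# (e.g. monthly revenue) against a few candidate input features.
# All feature names and data below are hypothetical, for illustration only.

def fit_ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y via Gaussian elimination."""
    n = len(X[0])
    # Build X^T X and X^T y.
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    # Forward elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    beta = [0.0] * n
    for r in range(n - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, n))) / A[r][r]
    return beta

# Hypothetical rows: [intercept, ad_spend, units_listed] -> revenue
X = [[1, 2.0, 10], [1, 3.0, 12], [1, 5.0, 20], [1, 4.0, 15], [1, 6.0, 25]]
y = [30.0, 38.0, 62.0, 48.0, 76.0]
coefs = fit_ols(X, y)

# The fitted coefficients rank how strongly each input "drives" the outcome.
print({name: round(c, 2) for name, c in zip(["intercept", "ad_spend", "units_listed"], coefs)})
```

The point of the pattern is the interpretation step: once fit, each coefficient is read as "improving this input by one unit moves the outcome by this much," which is what makes it a driver model rather than just a predictor.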

And so, I was proud of that methodology. I thought that was pretty cool, and when I went on to a different company to build a data product there, I thought, “Yeah, that’s basically my framework for how we should help users make decisions with data: we figure out the output they’re looking for, we evaluate the data to find the inputs that lead to it, we figure out how they optimize, and those are the answers we give them.” And so, when I joined this company, that was basically the framework I was going to execute for them. I needed to execute quickly. That was my framework. Awesome. That’s what I thought I was going to come in and do; that’s what I thought I knew how to do.

This was a real estate company, so I went out to real estate brokerages, talked to a handful of them in their offices, and asked, “What do you really need to know? What do you wish you knew? What would change the outcome for your brokerage?” The answers that I got were so much simpler than what I was going to go out and try to build. And I think that’s also an example of the data analyst wanting to use all these sophisticated tools: I want to use a regression model.

Why can’t I use a regression model? What you’re asking for is basically just addition and subtraction; I really wanted to do something fancier for you. But the problem they needed solved did not require a driver model, did not require anything fancy. It just required being more aware of some of the financial happenings in their brokerage, and being able to analyze that with a couple of key filters so that they could spot things like which agents needed coaching. Which agents need coaching? Which agents should I be retaining because they’re contributing most of the bottom-line revenue for my company?

It was a simple problem, and I would not have figured that out if I had approached the conversations in a different way, or if I hadn’t gone out to talk to them at all. If I had just said, yeah, we’re going to try to help them improve revenue, obviously, so what’s driving revenue? Let’s go look at the data. No, I needed to talk to the users and find out that what they needed was much simpler than a regression model. They just needed visibility so that they could drive their coaching and retention behaviors. And that’s ultimately what the tool became: a facilitator of coaching and retention of agents. [laugh].
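The simpler tool Katy ended up building amounts to aggregation and ranking rather than modeling. A hypothetical sketch of that kind of "visibility" view follows; the agent names, revenue figures, and the 40% retention threshold are all invented for illustration:

```python
# Hypothetical sketch of the "visibility" tool: no driver model,
# just summing deal revenue per agent and ranking the results.
from collections import defaultdict

deals = [  # (agent, deal revenue): invented data
    ("Ana", 12000.0), ("Ben", 3000.0), ("Ana", 8000.0),
    ("Cho", 1500.0), ("Ben", 2500.0), ("Cho", 1000.0),
]

# Aggregate revenue per agent (addition, as Katy says).
revenue = defaultdict(float)
for agent, amount in deals:
    revenue[agent] += amount

ranked = sorted(revenue.items(), key=lambda kv: kv[1], reverse=True)
total = sum(revenue.values())

# Flag top contributors for retention and the rest for coaching.
# The 40% revenue-share threshold is arbitrary, for illustration only.
actions = {agent: ("retain" if rev / total >= 0.40 else "coach")
           for agent, rev in ranked}

for agent, rev in ranked:
    print(f"{agent}: ${rev:,.0f} ({rev / total:.0%}) -> {actions[agent]}")
```

The contrast with the regression sketch is the point: a sorted aggregate with a simple flag answers the brokerage’s actual questions (who to coach, who to retain) without any modeling at all.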

Brian: So often there is something like that: an insight that comes out of something you didn’t go there to ask about at all, but it surfaces as a more acute pain, as something that would be a game changer. And I think it can be hard if we think, “But my identity is wrapped up in the fact that I know what a regression model is and how to make one, so I need to deploy this hammer on this job, because that’s what I do. I went to school for it.” My identity is wrapped up in that. And I think product people have to be really good at letting go of some of this: we have all these skills, but my job is to deliver an outcome for the users or the business stakeholders that we serve, whatever the tooling required is, right?

And at some point, you might need something like that, but that might be about getting from, say, 82 to 90% accuracy, some lift in quality, not about that zero-to-one, or even just knowing whether we’re in the same solar system about what we’re trying to do. I could pick your brain and ask you a million other questions, but I want to respect your time here. Thank you for coming on. Do you have any closing thoughts or advice for people trying to get into data product management, maybe things they might need to let go of if they’re coming out of a data science or analyst role? There are also leaders listening to the show who may find it a hard sell to get headcount for a role that’s hard to define, so they might have to start with upskilling or changing the people they have: John or Mary kind of shows some affinity for this, quote, “stuff.” What are the things they need to be aware of, skills, any of that, that you might leave us with?

Katy: I am a bit biased because of my background, but I think that finding people in your organization who have the appetite to be cross-functionally educated, particularly in a data arena, is very important. For instance, a lot of data people, like I mentioned, have communication hangups because they communicate in a certain way, and it’s hard to accept that everybody else doesn’t. So, taking someone with a data background and giving them a little more exposure to user experience or product management, to how to connect on a deeper level with the problems the user is trying to solve, can be extremely impactful. And vice versa: if you’ve got a product manager or user experience designer who’s trying to figure out how to design data products, then get them some data analytics training so that they can understand on a deeper level what their data analysts are trying to communicate about what needs to be articulated to the user. Neither one needs to become an expert in the other’s field, but there’s a communication gap between the disciplines sometimes, and the reason I would recommend that cross-pollination is that I think it can help close some of those gaps.

Brian: Great advice; super helpful. Thank you so much for coming on the show. But, Katy, before I let you go, where can people follow you? I know we connected on LinkedIn, and I’m sure we might have some questions from listeners. If they want to reach out to you, is there a good place to do that?

Katy: LinkedIn probably is the best place. I’m fairly active on LinkedIn. The community is growing on LinkedIn, and I’d be excited to engage with people there.

Brian: Cool. Well, again, this has been Katy Pusch, Senior Director of Product and Integration at Cox2M. Thank you for coming on Experiencing Data.

Katy: Thank you. This was so much fun. I appreciate it.

