
072 – How to Get Stakeholders to Reveal What They Really Need From a Data Product with Cindy Dishmey Montgomery


Episode Description

How do you extract the real, unarticulated needs from a stakeholder or user who comes to you asking for AI, a specific app feature, or a dashboard? 

 On this episode of Experiencing Data, Cindy Dishmey Montgomery, Head of Data Strategy for Global Real Assets at Morgan Stanley, was gracious enough to let me put her on the spot and simulate a conversation between a data product leader and customer.

I played the customer, and she did a great job helping me think differently about what I was asking her to produce for me — so that I would be getting an outcome in the end, and not just an output. We didn’t practice or plan this exercise; it just happened — and she handled it like a pro! I wasn’t surprised; her product- and user-first approach told me that she had a lot to share with you, and indeed she did!

A computer scientist by training, Cindy has worked in data, analytics and BI roles at other major companies, such as Revantage, a Blackstone real estate portfolio company, and Goldman Sachs. Cindy was also named one of the 2021 Notable Women on Wall Street by Crain’s New York Business.

Cindy and I also talked about the “T” framework she uses to achieve high-level business goals, as well as the importance for data teams to build trust with end-users.

In total, we covered:

  • Bringing product management strategies to the creation of data products to build adoption and drive value. (0:56)
  • Why the first data hire when building an internal data product should be a senior leader who is comfortable with pushing back. (3:54)
  • The "T" Framework: How Cindy, as Head of Data Strategy, Global Real Assets at Morgan Stanley, works to achieve high-level business goals. (8:48)
  • How building trust with internal stakeholders by creating valuable and smaller data products is key to eventually working on bigger data projects. (12:38)
  • How data's role in business is still not fully understood. (18:17)
  • The importance for data teams to understand a stakeholder's business problem and also design a data product solution in collaboration with them. (24:13)
  • 'Where's the why': Cindy and Brian roleplay as a data product manager and a customer, respectively, and simulate how to successfully identify a customer’s problem and also open them up to new solutions. (28:01)
  • The benefits of a data product management role — and why 'everyone should understand product.' (33:49)

Quotes from Today’s Episode

“There’s just so many good constructs in the product management world that we have not yet really brought very close to the data world. We tend to start with the skill sets, and the tools, and the ML/AI … all the buzzwords. [...] But brass tacks: when you have a happy set of consumers of your data products, you’re creating real value.” - Cindy Dishmey Montgomery (1:55)

“The path to value lies through adoption and adoption lies through giving people something that actually helps them do their work, which means you need to understand what the problem space is, and that may not be written down anywhere because they’re voicing the need as a solution.” - Brian O’Neill (@rhythmspice) (4:07)

“I think our data community tends to over-promise and under-deliver as a way to get the interest, which it’s actually quite successful when you have this notion of, ‘If you build AI, profit will come.’ But that is a really, really hard promise to make and keep.” - Cindy Dishmey Montgomery (12:14)

“[Creating a data product for a stakeholder is] definitely something where you have to be close to the business problem and design it together. … The struggle is making sure organizations know when the right time and what the right first hire is to start that process.” - Cindy Dishmey Montgomery (23:58)

“The temporal aspect of design is something that’s often missing. We talk a lot about the artifacts: the Excel sheet, the dashboard, the thing, and not always about when the thing is used.” - Brian O’Neill (@rhythmspice) (27:27)

“Everyone should understand product. And even just creating the language of product is very helpful in creating a center of gravity for everyone. It’s where we invest time, it’s how it’s meant to connect to a certain piece of value in the business strategy. It’s a really great forcing mechanism to create an environment where everyone thinks in terms of value. And the thing that helps us get to value, that’s the data product.” - Cindy Dishmey Montgomery (34:22)

Links

LinkedIn: https://www.linkedin.com/in/cindy-dishmey/

Transcript

Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill, and today I have the Head of Data Strategy, Global Real Assets at Morgan Stanley on the line; that’s Cindy Dishmey Montgomery. Welcome, Cindy, how are you?

Cindy: I’m doing well. How are you?

Brian: I’m doing great. And why don’t I have you on the show? Well, I—at some point, you had joined my mailing list and you had replied to the little welcome autoresponder email that goes out, and you had said, “I think this work’s important that we’re doing,” and you said, “I’ve developed my data and analytics teams to focus on design and product, including the data engineers and data scientists.” And I was like, “Yay.” And then I was like, “Well, what does that mean to you? And what exactly is going on over there?” So, that’s what I want to hear about.

Cindy: Awesome. Awesome. Yes, I am super excited to have this conversation because whoever hears it, I feel like you’re creating this community of people who will be product-centric with all things data, and I think that’s where we need to go. So, what does that mean? That means—my role, if I describe it in very simple terms, I’m here to monetize and commercialize data in whatever form.

So, in order to do that, you really need to start with, “Where’s value?” And there’s just so many good constructs in the product management world that we have not yet really brought very close to the data world. We tend to start with the skill sets, and the tools, and the ML/AI, you know, all the buzzwords. And they’re important, and I love them, and I wouldn’t be a proper data geek if I wasn’t extolling the virtues of all these data concepts. But brass tacks: when you have a happy set of consumers of your data products, you’re creating real value.

So, I focus on refining whatever methodologies, or techniques, or leadership methods, or just bread-and-butter designing of tools and processes to create value.

Brian: What’s the ‘not’ way of that so that I can understand what the—you called it, like, there’s a lot from product management, from—I assume you mean software product management—that we can bring into the data product world. What’s the ‘not’ version of that? Like, if you’re not doing it that way, what does that look like? And tell me how it—because I want to understand the difference between what you’re doing and what you—and, probably, what I perceive a lot of other organizations doing as well.

Cindy: If you’re talking a business case with a tool or a technique, that’s the ‘not’ way. If you’re starting with, “We need a data lake,” and that’s the business case and that’s how you’re evaluating success—so you’re evaluating success from the activity of creating a data lake or a data warehouse, or I created a predictive analytic check—if you as a data person are selling that business case and then you’re expecting everyone to be happy with the fact that you checked the box on that technique, you’re creating a scenario for both sides to be very unhappy with what happens.

Brian: How do you get to the point where—I often tell my seminar students, we’re not order fulfillment. This is not McDonald’s, and they order what they want, and you hand it through the drive-thru window, and off they go. Like, the path to value lies through adoption and adoption lies through giving people something that actually helps them do their work, which means you need to understand what the problem space is, and that may not be written down anywhere because they’re voicing the need as a solution. Like, I need a cast for my arm. No doctor would just give you a cast and not ask any questions about what’s wrong.

That’s not how it works. So, talk to me about how your team, with the kind of minds that you have, how do they unpack what’s actually needed and get out of the implementation-tactic-first approach and get into the problem space approach first? How do they do that? What’s involved?

Cindy: I love that analogy of the, you don’t walk into a doctor’s office asking for a cast. Although some people may, you know, go and they look on WebMD and they come to the doctor and say, “I want that.” You know, like, “Nope, nope, nope. Back up.” And it—

Brian: Well, they may be right. Some of the time they may be right, but we have to have a conversation to know because they are self-diagnosing. We are probably not—like, I’m not an arm specialist or bone specialist, so I don’t know if that’s the right solution. Maybe sometimes they are, but I think a conversation is required, possibly some research, to really understand if that’s the right technique. And is it the right thing right now?

Maybe it should be AI over the next two years, but next quarter, does it need to be that? No, maybe a simpler analytic approach might be more—just a simple dashboard or a tool or something would be fine. How do you guys unpack that, and how do you have that conversation?

Cindy: Yeah, I find it… what you’re saying is absolutely true. You have to have that conversation. I think that the ten steps leading up to having that conversation are also quite critical. Because it’s usually someone from a business perspective—and I say business in that sense because I do consider data a business function. Let’s say I’m in a marketing department, and you realize, “I can’t make sense of whether the channels we’re spending marketing dollars on are actually giving us the right return. Are they really helping our sales funnel?”

Let’s say someone identifies that problem and they realize it’s a really large problem, and we need a specialist. So, they normally go out, either seek a consultant or read blogs or something and say, “Aha. I need a data scientist.” Now, here’s where this can go a bit wrong. I think that whoever is the first person in the data role needs to be senior enough to feel comfortable with pushback.

Now, I’ve seen people early in their career just be completely value-centered and have no problem saying, “Wait a minute, wait a minute. Maybe I don’t need to use this tool.” But it’s quite a human element to feel uncomfortable; if you’re the first person in a data role, you have the burden of being the first person to get something right. And if they’re very clear, with, “You know what? I need a dashboard that does this.”

I find folks that are fairly early in their careers will say, “Okay, I’m going to build this dashboard.” And if it’s not the right thing, it creates a dynamic where both sides are very—you know, the person who created it is demoralized, and the person who asked for it realized that wasn’t quite it, and you know, “You didn’t get it right, so we’re all unhappy.” So, a bit of a theory I have is, your first hire should be a more senior one. And they may not be the all-singing, all-dancing, they can do TensorFlow and Tableau and all these things, but they can actually help identify what the right problem is and the right solution for it and then coordinate who needs to get in there. Now, I say that with a degree of skepticism even in my own advice [laugh] because it’s more about personality, and probably it’s a bit of an uncomfortable way of starting the conversation because it’s not a technical skill set.

It’s definitely a soft skill set that either you develop over time, or you just have organically. And I think that once you have the right two people: the person with the problem, the person that knows how to elicit what the actual—like, do the ‘Five Whys’ or whatever technique to figure out why did you need that? Then there’s a lot of comfort with, “Okay, I don’t have to oversell doing a regression analysis or building a dashboard. Maybe all we need to do is a one-time analysis, and then figure it out from there.”

Brian: Is this way of thinking tied to strategic work? And does this mean that part of your role is to figure out what the strategy is on all these, kind of, various projects? Or is your role and strategy more about an overarching vision that’s always evolving, but it’s setting a general path for the ship to travel towards? How does that play into this?

Cindy: It’s kind of a T. The top of the T is, I spend my time with, let’s say, senior business leadership, understanding their business goals in business terms only. What does growth look like? Who are our customers? What is success for them?

What is the footprint of the organization? I’m very big on if you’re coming into an organization, you have to know what the departments are. How do they work together? What’s the value chain from the moment we say, “We want to have this business goal,” to the moment that value is created for our end customers. And for the company as well.

So, once you can map all that out, then the long part of the T is what skills and resources do I need to execute on that? I firmly believe there’s only ever two things that can go wrong in any data initiative: not having the right executive alignment on what success is in business terms, and not having the right skill at the right time. And that goes for the tools and the techniques as well. So, sometimes when you start, the skill set you need is just about getting good data quality: having the data and understanding where the issues are and giving people time back because they’re spending time manually trying to get the data right, and not actually analyzing the data and finding new insights in it. That may be where the first step to value is.

And then if that’s the first step, then the next successive step should still be very much in line with how do you help each department that is in the value chain create whatever incremental value they need to create for the end customer. In the banking world, it’ll be an investor or maybe an institutional client. But the same applies; I don’t think these techniques are unique to my world. But I think the challenge of being comfortable that you have to be uncomfortable for a minute, right? Kind of step out of your comfort zone, and if you come from more, let’s say, like, a data governance role, and really comfortable with, all right, we need to write our policies and procedures, or if you’re a data scientist, and you’re all about, all right we need to adopt Python notebooks, you have to set that aside a little bit for a moment and be humble and say, “I don’t understand how your business works. Let’s start there. Can you teach me on that?”

And then you ask the ‘Five Whys’ everywhere you go, to say, “Okay, before you start prescribing to me,”—though they will always prescribe because everyone gets excited about, “All right, we’re going to fix this problem. Can I get this?” You can compare the value that you believe they need in their own words, to where we start, and you can have that conversation. And then you can manage expectations, which is great. So, you can under-promise, over-deliver, all of those techniques.

But it’s really hard when you’re—I think our data community tends to over-promise and under-deliver as a way to get the interest, which it’s actually quite successful when you have this notion of if you build AI, profit will come. But that is a really, really hard promise to make and keep.

Brian: Yeah. I interpreted—and this might be wrong—I interpreted that you’re kind of slightly under the covers, just hitting base hit after base hit after base hit and you’re not going for grand slams out the gate every single time you come up. Not to say that a base hit isn’t good, but you’re consistently not striking out; you’d rather play it—not safe, but I’m looking to have an inc—and the analogy is coming from you said incremental value, so I interpreted that as being, let’s show some value in most of the work that we’re doing. Always try, but let’s have a high consistency of some value that’s measurable because it feeds us doing the next thing and the next thing and the next thing. Is that a safe thing or is that a misread?

Cindy: It’s the first part, let’s say. It’s not a misread, but it’s the first part. Because if you want to go for—and I’m really not great with sports analogies, so I hope I get this right—so if you want to swing for the fences—that’s a baseball—

Brian: Yeah.

Cindy: —analogy, right?

Brian: That works. Yep, yep.

Cindy: If you want to swing for the fences, you have to develop two things: trust, and a track record. Well, that’s still part of trust. Trust and a track record, and then, two, you need to really understand the business well enough to swing for the fences. So yes, I am very risk-averse, if that’s what you’re alluding to, and I think people should be. It may not be in the first 6 months, it may not be in the first 12 months, may not be in the first 18 months, or it could be earlier, but after you get these hits and these wins, you’re learning how to win in this environment. Like, if you know what a win looks like, and both sides know what a win looks like, then you can make bigger and bigger wins.

Brian: Yeah. And I didn’t mean that as risk-averse. I meant it as focusing on consistently delivering value instead of these, kind of Hail Mary projects that so often do not work out. You put all your eggs in there, and I’m wondering, is it—do you see it as a portfolio? It’s like, “Yeah, I have a couple grand slam plays that we’re working on, and these are two three-year projects, and then I have nine Tableau dashboard projects, and like, not super crazy, but those will provide us with good leverage and measurable value.” And something along those lines was kind of the—

Cindy: Perfect.

Brian: —vibe that I got.

Cindy: Yeah. No, totally. And I think that it creates such a positive, productive, fulfilling environment for everyone. And then value to boot, right? So, your smaller wins, maybe they’re not as sexy, they’re not, you know… not something you would cry home about, but you make someone’s life better. And that creates a lot of goodwill, that creates a lot of ideas. You just freed up the subject matter expert to think more with you, and they’re now your friends.

Brian: You just said, “Make someone feel good.” That sounds really mushy and emotional to—

Cindy: [laugh].

Brian: —some data people. But I’m serious. I think it’s like, there’s this allergic reaction, like, I’m not supposed to care about that, I’m supposed to build this technical thing and then whether you want to use it or not, that’s kind of your business. And I very much think it is about, yeah, it’s making accounting’s job easier, and it’s making marketing people feel like, “I know how I spent my budget, I know the returns I got. And I know where to spend next quarter’s budget because I have this insight that I didn’t have,” and you’re literally making that person look better, feel better to their boss.

And if you take that approach, it really is a win, but I don’t think that’s how a lot of people see it. And do you think it really does come down to, like, that level of empathy of just caring about people, and we’re kind of like in this service—we’re, like, in a hotel or something. We’re here to serve, you know?

Cindy: Absolutely. Yeah, it is a dirty word: service? But it is a service role. It’s a service business role, which I think is the best of both worlds, in my view because you get the benefit of being technical and still learning all the cool techniques, but you also get to be a party to value being created. So, let’s say, yes, you’re working with accounting, and you’ve spent months with them just getting their data right so they can close the books faster; let’s say that was the initial goal.

But in that process, you understand how their account structures work, you understand how they budget better, and then you start doing this analysis and blending other data sets, and you realize, “You know, this particular line item, we keep missing budget by a pretty large margin and I’m realizing the driver to that is this thing no one was thinking about? Would you like to continue, like, a bigger project to see if we can get better tracking to budget, based off of this information?” The answer is going to be yes. After you’ve spent the past couple of months helping them close the books faster versus promising that you can find something that they wouldn’t otherwise believe you could because obviously, you don’t know their business as well, you don’t know the accounting structures as well. You just don’t have that insight for them to trust you to take that big risk.

It is a very safe way of getting to the point where you can have these much bigger wins in the portfolio. And yes, I truly believe the sum of all parts is greater than the whole, so yeah, maybe the split is nine to two [laugh]; nine small things that are [unintelligible 00:18:06] to a pretty big goal, and then two massive, you know, this-might-be-whitepaper-worthy type of projects.

Brian: In my experience, especially with enterprise software applications and we’re building large-scale products, the strategy becomes really, really important, and it can’t be mush. It needs to be what I call design-actionable for a product team to actually take. It can’t be too detailed, but it needs to be enough that we all can envision our own finished thing that supports the overall strategy, even if it’s all different things because the strategy is not supposed to explain the execution of it. When this doesn’t exist, is it the job of the service people—so our data product teams and data scientists, or designers and engineers, is it partly their job to help figure out what that should be with the stakeholder who says, “We need some AI, please. Cindy, can I have some?”

When you get these vague requirements, and, “We need a predictive model.” They’re prescribing an answer to the problem in it. And when it’s not clear, do they need to go back to the drawing board, or do we need to go back to the drawing board? Do you think we, the data people, have a strong role in helping figure out what the problem space is? And maybe even telling them, look, that’s going to be really hard.

We probably could do that; it would take years, but now that you’ve told me about this accounting thing, you have this giant hole over here, which we’ve done ten times. Should we work on that instead? Is that how you see it? Like, we’re there to also figure out the strategy with them, instead of waiting for it to be handed to us, perfectly well defined and, you know [laugh]. I don’t think problems come in well-defined—

Cindy: Yeah, they don’t. I was just—

Brian: —most of the time.

Cindy: —thinking [laugh].

Brian: They just don’t. To the people that need to make them, they’re just not—it’s just not. I hate it when you hire someone to work on your home and they don’t ask you any questions, and you’re kind of like, “I don’t know toilets. I mean, I just know it’s rocking, and it’s leaking here.” But you don’t want them to start hammering right away.

It’s like, “Well, how long has it been going on? And have you ever seen this”—whatever? And you kind of almost want them to ask some questions to make sure that you know that what you don’t know about. [laugh]. And I don’t think that always happens. “Well, that’s what they said they wanted. They wanted a model that did this and it had to be 72% accurate, and I don’t know what they’re going to do with it, but it’s there on the shelf. When someone’s ready to pull it down and use it, it’s there.” Like, “No.” [laugh].

Cindy: Well, here’s my theory. And I don’t know if it’s true or maybe it’s just my bias because I come from a software engineering background. Computer science by trade; I did software for a while before migrating to data. I think when, you know well before our times when technology became a division in enterprises, period, I can only imagine what that process was like. It was probably the same kind of thing, and then eventually, we figured it out.

But what I think was interesting about software, it is very requirements-based. So, many methodologies have come and/or gone, you know, be it waterfall, you know, all of those things that say, “There is the business requirements and then there is the technical requirements. They need to communicate with one another.” But the scope of what happens in, let’s say an app, is very finite. It’s not very dynamic or you’d have really bad software.

So, if I push this button and it tells me what today’s date is, you can create a nice requirement for that, from an apps perspective. Or a payments platform. Like, the role—like, what a payments platform does should be fairly static, [laugh] right? You should do exactly what it is. But data in its introduction into—and I’m using the word enterprise; like company, whatever—we haven’t figured out where it fits.

I still see—and this is my personal belief—data roles, rolling into CTOs or technology, the nature of what we’re talking about right now naturally lends itself to not being in technology because it’s not like you can give the data org a finite set of requirements and say, “Success is that.” Because it’s not. It’s a conversation on how to get to the business goal. And the success is the business goal. It’s the data person’s role to figure out how to bridge the gap between the data and that business goal, in partnership with the person who’s actually ultimately accountable—as well—to it.

So, you’re partly accountable to that business goal. So yes, the process of designing it, I think this is still very new. Everyone’s still trying to figure it out, but where I see it working the best is when you start from the business case, you have lots of conversation with the business subject matter expert—you know, pick a department: operations, sales, marketing, finance—and you’re working with them to design what the solution is. And the solution has no—it shouldn’t be biased by, “It must be a dashboard. It must be a report. It must look like this.” The solution could be, we just need an answer to this question.

Let’s go find it. Or it could be, we actually need a whole software suite to manage this. But that’s the role of the data person. Like, again, I [sigh] I find it difficult because I think the technology side benefits from decades of people knowing, “Okay, we know exactly what the role of someone in a tech or an IT organization does.” I still see job descriptions that aren’t very consistent, let’s say, with getting a data role in that can do that well.

So, I think it’s definitely something where you have to be close to the business problem and design it together. It’s just the struggle is making sure organizations know when the right time and what the right first hire is to start that process.

Brian: When you talk about design, how does that work? Are your data people going out and working in small groups with stakeholders and sketching stuff? Are they prototyping and getting feedback? What’s your definition of design in this context, and what do those activities look like?

Cindy: Yeah. Typically, when—let’s just say, when mature because I like starting things from scratch and sometimes they’re just rough and tumble, and then we get our processes in place and we’re humming. So, when we’re a bit more mature, what does it look like? We have a map, a taxonomy if you will, of what are all the organizations, so that we’re speaking the language of that value chain that I mentioned. So, if you know well—you know, using the marketing example, again—this person in the marketing department, they’re responsible for allocating marketing dollars and making sure that we’re priming—let’s say, in the real estate world, if you’re allocating marketing dollars to multiple locations to find tenants for an apartment, let’s say, if you see one channel is giving you ten applications a month, and another one is giving you two, well, I should do some analysis to see if there’s something wrong with the application or the posting on that channel.

Or maybe it’s not a great channel. So, you know that person is accountable to that value, so when you think back, “Okay, what is our goal, and can we quantify it?” Well, our goal is to make sure that they’re getting as many high-quality tenants as possible. That’s where we start. From there, the brainstorming of the avenues by which we can look at the data that we have available to us, first, to see what we can do to improve.

Which maybe the analysis starts with that discovery phase of, do we have the data to analyze if we’re actually getting a good apples-to-apples comparison? Are we getting the same data points from each channel to actually tell if one is better than the other? And then from there, we look at, “Okay, well, let’s say we find the winner. Let’s say we find, you know what? We really think this has to do with the quality of the channel.”

How do we let the person that’s now accountable to that value creation, how do we give them the right something at the right time of when they need to make that decision? So, we’ll have a conversation with them. And look, they make this decision once a month, not once a week, not once a day. All right, so we need something that person can use once a month, it needs to give them a perspective on what we’ve been evaluating as a high-quality channel that we’ve evaluated with them. Maybe it’s just a report that sends them an email and says, “Hey, by the way, Bob, these are your winners and losers for the month. Right before you make that decision on where to allocate your dollars, take this into consideration.”

Brian: Yeah, yeah, I love that you said it’s right information, but right time. And that’s constant; the temporal aspect of design is something that’s often missing. We talk a lot about the artifacts: the Excel sheet, the dashboard, the thing, and not always about when the thing is used. And is there a data collection period? Is there a honeymoon period where it doesn’t work right for a while?

Or whatever it may—there’s a lot of different variables. Is it learning from last quarter? Does it need to update on a batch cycle? And when do they care? And this when part is also really important within the experience, I think, to think about, so I like that you’re looking at that.

But I wanted to bounce back to the beginning of this conversation. It’s funny because I was literally going to give you a hypothetical example about marketing spend. So, let’s roleplay for a second. Like, you’re in your role, and I’m the CMO or something, and I say, “Hey, Cindy, I need a machine learning algorithm that will help me know where to spend my dollars next quarter. How long will it take for you to build me some ML? Because that’s what everyone else is doing?” What do you start with?

Cindy: It was I need some ML to help me with my—

Brian: To understand where to spend my dollars. How do we perform where else? And I want to be able to predict the right buckets to spend my money on TV, radio, internet ads, whatever. How do I know how to allocate my money? So, can you build me an algorithm that will tell me that?

Cindy: Right. So, can you tell me what is success for you when this is all said and done? What does it look like? What more do you have at the end of it?

Brian: Well, I want to put as much money as possible into the right channels and as little into the wrong channels as possible.

Cindy: And how do you know you’ve put it into the wrong channels?

Brian: Low clickthrough rate, and if there’s some type of opt-in or something like that, we see a low level of—in the funnel somewhere we track—I don’t know—clicks, we track clicks and signups or something, mailing list sign ups or something like that, let’s say.

Cindy: Okay. So, you’re saying that you want to improve the amount of subscriptions to our mailing service as a result of this? Is that what you’re measuring as success?

Brian: Yes because in this case, all of our ads go to a place where they have to put an email address in to get more information. So yeah, you can say that. That’s true. All routes eventually lead there.

Cindy: What if—I mean, would you be open to the notion that maybe there’s something else that may be driving click-through rates not being high enough with your current allocation?

Brian: Such as?

Cindy: What if maybe the information that we’re sending in these channels, what if there’s variety in it? Maybe we’re not exactly sending consistent information across that may be impacting clickthrough rates? Would you want to see that before we make [crosstalk 00:30:16]—

Brian: I don’t care. If you could help me raise my subscriptions, that’s fine. Whatever your magic data science people do, that sounds good. What would you suggest?

Cindy: Wonderful. How about this? Well, let me take away this information, I need all the current data sets that your team is currently looking at, give us a week to analyze it and then I’ll come back with a proposal on how we can improve your clickthrough rates.

Brian: Awesome. That sounds great.

Cindy: Notice how I kept, like, where’s the why. Let’s focus on clickthrough rates, clickthrough rates, clickthrough rates. So, that leading with the solution, that really finding what the problem is, we’ve peeled the onion. And then opening the door to would you be open to other ways of doing this? Sure. Right? So—it’s important to ask question—

Brian: I never mentioned clickthrough rates in that—in my question to you, right? I just mentioned machine learning and then I want to increase—I want to spend money the right way. But that’s not maybe exactly what I wanted; I really just wanted to get more signups and spend it correctly here. And maybe it’s taking away something instead of adding something, maybe it’s [laugh] change the creative, maybe it’s—we don’t know what it is, but that’s the point is let’s isolate the goal. And just for the listeners, we didn’t plan to do this. I didn’t tell you—.

Cindy: No, no. [laugh].

Brian: —I was going to make you do that, but I figured we would arrive—you would interrogate me really efficiently from our conversation, and this is the kind of conversation I’m always hoping teams are having more of: really getting at that root problem and opening the doors to other things, and not just—it’s not McDonald’s, right? It’s [laugh] it’s more of the place where there’s no menu, and it’s like, “What do you guys like to drink?” And some of the cocktail bars in New York used to be this way: there’s no menu, and they just talk to you about what you like, and they’re like, “Okay, I have a couple options. This one’s more fruity; this one’s more bitter. How does that sound?” And, “Yeah, let’s try that.” And now you’re onto something new, and it’s a totally different experience than it is, like—[laugh].

Cindy: Absolutely. And—

Brian: Give me the model and [laugh]—

Cindy: You know, it’s interesting, though, sometimes—not all the times—but these conversations, the person was thinking, “All right, this is going to be huge, it’s going to take a long time, let’s just get ahead of it, let’s just do it.” And the solution after you do the due diligence may be very simple. So, you’ve created goodwill. You’ve created a—you know, they’ve, kind of like, decompressed, and said, “Oof, this is great.” Maybe there was a filter somewhere that you can remove and then off to value creation.

And to the idea of having that portfolio, you also want to be quick to grab the extra capacity that you just created for something a little bit more interesting. So, you may want to take that opportunity and say, “Okay, great. The solution was really simple, and then we’re going to give you some reporting to see how the subscription rates are going up.” But we want to take a little bit more of the capacity that we saved to then maybe work on another project to see if there’s something else we can do with sentiment data that may be useful for the marketing team. That’s how you buy your time for the more interesting progressive—well, that’s how I buy my time, rather [laugh]—

Brian: Yeah, yeah.

Cindy: —for the more interesting and progressive ideas because it’s a dialogue. Versus like the Big Bang, if we do a sentiment analysis, we will understand what all of our customers think and we’ll give them exactly what we need. Very hard promises to make, if what they’re still focused on, yet and still is, “I just need better clickthrough rates.”

Brian: Right. Is this skill something that you train people in? Do you hire a special person to only do that? Do you hire data people that you think have that? How do you approach that, having more people than just Cindy because Cindy can’t scale. How do you approach that?

Cindy: In a mature state, I have a dedicated data product manager role. That is something that—but in a rough and tumble, we’re starting one person on the team kind of thing, or I have a data engineer; that’s all I got. Everyone should understand product. And even just creating the language of product is, I find, very helpful in creating a center of gravity for everyone. Even, like, your software engineers, your business teams, the fact that we can just put a circle around here’s what the data product is.

And then putting it in equal footing from, like, a legacy report, and a snazzy dashboard, or this model that we’re creating, they’re all products. So, it’s where we invest time, it’s how it’s meant to connect to a certain piece of value in the business strategy. It’s a really great forcing mechanism to create an environment where everyone thinks in terms of value. And the thing that helps us get to value, that’s the data product. I think it’s valuable; like, data scientists, data engineers, that you know what’s your role relative to that data product?

Brian: I totally agree, too. This data product management role is really important. I also like that you didn’t call it an AI product management role because that, to me, suggests the solution. Like, “I’m going to use this hammer on every project.” But joking aside, I wanted to ask you a little bit, kind of close this out about strategy again because that is—you’re head of strategy.

And Samir Sharma has a show—I think you know him—that’s how you found me actually, was through the episode that I guested on his show, and if I recall correctly—and I think it’s Samir—his focus is that there really is no data strategy; there’s a business strategy that includes data components. And in the beginning of this episode, you mentioned I think data is a business unit. So, I’m curious your take; do you agree with him? How do you reconcile whether or not data is actually a business function or data is a supplement to the business? Is it kind of subordinate or it’s its own thing? It sounds like it’s its own thing in your world.

Cindy: Yeah. Again, this is Cindy’s view of the world. I see—I agree with Samir, but how I agree with him is, yes, there is no independent data strategy. It is a function of the business strategy, and in execution, you create a data function that is a partner to a sales function, or a marketing function. These are all value creators to the business goal.

So, that’s how I reconcile that statement. Where I still think that maybe what I’m saying still sounds odd or confusing is, because an advanced practitioner of data is very technical, right? In the sense of like, can code, knows statistics, can do these things, but I think the taboo around that will fade over time and we’ll just look at it as a business function, no different than any of the others that I just mentioned.

Brian: Mm-hm. Well, this has been a really great conversation. Tell me what’s coming up for you. Where can people find out about your work and follow you and all of that? And you mentioned a white paper at some point; do you publish some things once in a while? But tell me about that?

Cindy: Well, no. [laugh].

Brian: [laugh].

Cindy: I haven't but, you know—

Brian: But something might be coming, is on the way.

Cindy: Something might be coming. Look out. Well, I want to get better at posting on LinkedIn, but people can find me on LinkedIn. What I—look, I personally, aside from my responsibilities or my day-to-day, I’m really enthusiastic about helping the next generation of data professionals. Like I said, I think our role is still evolving, and people are still learning, so the more people that are pushing in the right direction, the better. So, what’s coming next for me, both growing my team where I am, and also happy to help folks who are still trying to figure it out.

Brian: Yeah. This is great. I will be happy to link up your LinkedIn there. And it’s been a really nice conversation, so thanks for coming on the show.

Cindy: Absolutely. Thank you so much for having me, Brian. It’s been great.

