144 – The Data Product Debate: Essential Tech or Excessive Effort? with Shashank Garg, CEO of Infocepts (Promoted Episode)

Experiencing Data with Brian O'Neill (Designing for Analytics)

Welcome to another curated, Promoted Episode of Experiencing Data!

In episode 144, Shashank Garg, Co-Founder and CEO of Infocepts, joins me to explore whether all this discussion of data products out on the web actually has substance and is worth the perceived extra effort. Do we always need to take a product approach for ML and analytics initiatives? Shashank dives into how Infocepts approaches the creation of data solutions that are designed to be actionable within specific business workflows—and as I often do, I started out by asking Shashank how he and Infocepts define the term “data product.” We discuss a few real-world applications Infocepts has built, and the measurable impact of these data products—as well as some of the challenges they’ve faced that your team might face as well. Skill sets also came up: who does design? Who takes ownership of the product/value side? And of course, we touch a bit on GenAI.

Highlights / Skip to

  • Shashank gives his definition of data products  (01:24)
  • We tackle the challenges of user adoption in data products (04:29)
  • We discuss the crucial role of integrating actionable insights into data products for enhanced decision-making (05:47)
  • Shashank shares insights on the evolution of data products from concept to practical integration (10:35)
  • We explore the challenges and strategies in designing user-centric data products (12:30)
  • I ask Shashank about typical environments and challenges when starting new data product consultations (15:57)
  • Shashank explains how Infocepts incorporates AI into their data solutions (18:55)
  • We discuss the importance of understanding user personas and engaging with actual users (25:06)
  • Shashank describes the roles involved in data product development’s ideation and brainstorming stages (32:20)
  • The issue of proxy users not truly representing end-users in data product design is examined (35:47)
  • We consider how organizations are adopting a product-oriented approach to their data strategies (39:48)
  • Shashank and I delve into the implications of GenAI and other AI technologies on product orientation and user adoption (43:47)
  • Closing thoughts (51:00)

Quotes from Today’s Episode

  • “Data products, at least to us at Infocepts, refers to a way of thinking about and organizing your data in a way so that it drives consumption, and most importantly, actions.” - Shashank Garg (1:44)
  • “The way I see it is [that] the role of a DPM (data product manager)—whether they have the title or not—is benefits creation. You need to be responsible for benefits, not for outputs. The outputs have to create benefits or it doesn’t count. Game over.” - Brian O’Neill (10:07)
  • “We talk about bridging the gap between the worlds of business and analytics… There’s a huge gap between the perception of users and the tech leaders who are producing it.” - Shashank Garg (17:37)
  • “IT leaders often limit their roles to provisioning their secure data, and then they rely on businesses to be able to generate insights and take actions. Sometimes this handoff works, and sometimes it doesn’t because of quality governance.” - Shashank Garg  (23:02)
  • “Data is the kind of field where people can react very, very quickly to what’s wrong.”  - Shashank Garg (29:44)
  • “It’s much easier to get to a good prototype if we know what the inputs to a prototype are, which include data about the people who are going to use the solution, their usage scenarios, use cases, attitudes, beliefs…all these kinds of things.” - Brian O’Neill (31:49)
  • “For data, you need a separate person, and then for designing, you need a separate person, and for analysis, you need a separate person—the more you can combine, I don’t think you can create super-humans who can do all three, four disciplines, but at least two disciplines and can appreciate the third one that makes it easier.” - Shashank Garg (39:20)
  • “When we think of AI, we’re all talking about multiple different delivery methods here. I think AI is starting to become GenAI to a lot of non-data people. It’s like their—everything is GenAI.” -  Brian O'Neill (43:48)



Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today, I have Shashank Garg from Infocepts on the line with me. How are you?

Shashank: I’m doing great, Brian, and it’s a pleasure to be on the show.

Brian: Yeah, yeah. You all have a data and AI consulting firm, you do a lot of work with large enterprises, and we’re going to talk about data products today. And so, when I get into this topic, the first question I always ask is, like, let’s start with a definition so that we all know what [laugh] we’re talking about here because there’s a lot of different definitions. So, I want to challenge you, like, right, from the start, can you give me a really brief, like, ideally, like, one sentence or something short, so we kind of have your context, as we listen to your ideas and thoughts today? What is a data product?

Shashank: Trick question.

Brian: [laugh].

Shashank: I was going to give you, like, a one-pager, but actually, the simplest way to put it: data products, at least to us at Infocepts, refers to a way of thinking about and organizing your data in a way so that it drives consumption, and most importantly, actions.

Brian: Got it. So, are we in the vein of decision-making? Is this kind of what we’re thinking about here?

Shashank: Yeah. And all along, we’ve done work and helped our clients do this, and that’s why this field of data and BI existed. But what we’re seeing increasingly is, not being able to track and report on actions that were taken—especially if you’re going to invest in building a data product—is not good. So, decisions and the actions you’re going to take, find a way to integrate that into the data product itself.

Brian: Got it. So, are you talking about the literal tracking of decisions made such that you can then figure out if you’re making better decisions?

Shashank: In some cases. I don’t think it’s possible everywhere, but in some cases where it is, and if it’s going to spawn off new initiatives that you want to do, I think the best example to explain is, we have a product that we’ve built called Employee360, right, where you make certain hypotheses and you analyze data, and you look at trends, and then you say that, for better retention of my employees, or better engagement, I’m going to do X, Y, and Z. And generally, in the old mindset, it will just be a dashboard or an Excel or a report. And then somebody would [hive 00:03:07] off the workflow from there, get into a PowerPoint and then get into some monthly reporting. And then the underlying data or the hypotheses, which are continuously changing, there is no linkage, right?

So, in the same data product, you just create some basic workflow initiatives, action, and then every month when you go and revisit, see progress, and the underlying data itself is changing. So, you have the ability to change those things, and over a period of time, you can track actions against the success of the actions that you said you were going to do.
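The loop Shashank describes—recording the hypothesis and planned action alongside the data, then revisiting each month to compare progress against what you said you would do—can be sketched in a few lines. This is purely illustrative: the class and field names below are hypothetical and are not from Infocepts’ actual Employee360 product.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Action:
    """One planned action, linked to the hypothesis that motivated it."""
    hypothesis: str          # e.g. "flexible hours improve retention"
    planned_action: str      # e.g. "pilot flexible scheduling in support org"
    opened: date
    checkins: list = field(default_factory=list)  # (date, note, metric) tuples

    def record_checkin(self, when: date, note: str, metric: float) -> None:
        # Each monthly revisit appends a snapshot of the tracked metric.
        self.checkins.append((when, note, metric))

    def latest_metric(self):
        return self.checkins[-1][2] if self.checkins else None

def review(actions, targets):
    """Monthly review: return actions whose latest metric still misses its target."""
    return [a for a in actions
            if a.latest_metric() is not None
            and a.latest_metric() < targets[a.planned_action]]
```

The point of keeping actions and check-ins in the same structure as the data product, rather than hiving them off into a PowerPoint, is exactly this: when the underlying data changes, the same review loop picks it up.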

Brian: I have some members in my community, this Data Product Leadership Community, who might say that the customers are part of the problem here [laugh]. The business doesn’t really want to get better at this. I mean, they say they want to get better at it, but like, “Oh, so now you’re going to track what I did, huh, and you’re going to let me know whether or not I did a good job?” And I’m being a little facetious here, but this idea of, even for example, just getting into, like, well, how do you make a decision today such that we could make it better with a data—we could create a data product that would either make that easier, more enjoyable, reducing cognitive load, whatever it may be? Right there, there’s friction because now we’re challenging the way this person used to do things.

And this seems to be a recurring hurdle, and it’s a big theme in the podcast here, right? Like the user adoption piece becomes a challenge because the incentives are all misaligned sometimes. And it’s not that we can’t make the technology part of it, but the adoption piece is a challenge. So, what’s your opinion on this?

Shashank: Yeah. I wouldn’t sit here and claim that it’s not challenging. It of course is. And that’s why you’re running a podcast [laugh] on this topic.

Brian: Yeah.

Shashank: But you know, I’ve shared examples of where it really worked for us, and examples—I’ll share two examples, right: one example where this really worked well, and one example where it didn’t, and we should have known upfront, okay? And one example, I’ll pick from a media client of ours. And the whole inception of the data product happened when the teams who were providing snippets of data using, like, maybe a BI tool at that point, and they were getting this continuous [unintelligible 00:05:17] requests, right, from the sales operations, or sales planning teams. And you see this continuous questioning, right, and sometimes you wonder why. “I just gave you this, and then you’re asking me to different [unintelligible 00:05:26]. Like, why?” Why, right? What’s really happening?

And you quickly realize that the whole ad spend planning, and attribution of ROI—so ad effectiveness—is a very, very complex topic. And no matter how much you model it today, and how much time you spend in architecting it, it sort of changes. And then the users are still wondering, no, you didn’t really answer my question, right? So there, you know, a few years ago, that started as a journey, and then they started crafting what would eventually become a data product, but at that point, very much a pilot POC, hackathon origins, right? Put something together really quick and dirty, and the operations team started using it for better tracking and better planning.

And the next thing you know, over the years, done right, it became a full-fledged ad analytics platform with multiple data products, heavy usage, unlike what we see in the field of business intelligence and analytics. This is sort of bread and butter, I can’t live without it kind of adoption, right? And as you can see, right, the reason for that is because the whole inception was done jointly by the users. They didn’t know what a data product would mean, and their terminologies, and what it takes to really build a product. That’s the tech team’s job.

But defining the use cases, the highest value use cases, defining what’s going to change rapidly, and what’s going to be our change mechanism, right, to the whole thing, there was a lot of good alignment. Combined with some really smart product management, product marketing-type leadership provided by certain technology leaders, right? And if you look at the technology leadership, product management and product marketing doesn’t come naturally to them, but you know, you’ll often find some of these people who think like that. And I’m sure there are people in this community who think like that. So, combined with one of those leaders who went beyond give it a name, you know, give it sort of a marketing spin, what it stood for, and then the whole product kept evolving, right? That’s sort of one story.

Brian: Okay, actually, before you go into the second one, were these product people part of your team and your services to this client, or were they people inside the organization who were already titled and/or made responsible for the success of the solution? Like, how premeditated was it? And on which side—I’ll say, which side of the fence: the consulting side, or were they in-house? Where were those bodies located?

Shashank: Actually, great question, and I must say that even today, they are not titled. Actually, we don’t call it a data product, even today. [unintelligible 00:08:11] into the terms that are used internally there is ad analytics platform with multiple products on it. And no, all of them are like any other data organization: they’re roles like head of analytics, head of data, head of data engineering, platform support, there are these people, both on the client side and our side, who understand that this is much more than a one-off project or a dashboard or a report that I’m going to service. And this checks all the boxes that are the minimum conditions required to have a product type [play 00:08:52], which is multiple business use cases, multiple personas of users, requires a highly intuitive application interface, or just nobody’s going to use it because they are that demanding, clearly calls out the actions this analytics supports, and doesn’t really stop at just providing insights, but just makes recommendations, and huge investment in building a proper unified data layer with model data, supporting metadata, all that, right?

And that’s why I’m calling it a platform, so a data platform. Because we’re talking about lots and lots of data. We’re talking about petabyte-scale data, we’re talking about 100-plus sources. This is really complex stuff, right? But all of this is, I don’t think there are any formal data product management roles here yet, but the concepts have been sort of intermingled with how we approach this, both on our side and the client’s side.

Brian: Sure. So, was this a quote, “Exceptional” client then, in that this culture was just in-house because they just happened to have this assortment of people, or there was a directive to work this way? I would say it’s fairly abnormal, right, to take that responsibility. The way I see it, the role of a DPM, whether they have the title or not, is benefits creation. You need to be responsible for benefits, not for outputs. The outputs have to create benefits or it doesn’t count. Game over.

Now, was that sent down from on high, or these people were just, like, “I’m tired of making stuff no one uses, and I just want to make stuff that works?” Like, or, “My own career, I want to be able to show the value of the work I’m doing, or my team does.” Can you tell me a little the origins of this, and why at this place and not in other places? Or are they all like this? I don’t know.

Shashank: No, they’re certainly not all like this, and I think it requires a certain type of leadership. I would say it’s a combination of both of those two things. I don’t think there was any mandate that you have to create—obviously, every leadership will talk about, hey, you have to deliver value, and you shouldn’t be, you know, measured just based on the initiatives or actions that you take, or the insights you deliver, but the value you’re creating. That is always the case. But most data leadership fails to deliver on that.

In this case, I think it’s a combination of high-level intent very clearly communicated that we have to be—because there are genuine problems which can only be solved using data. And I would say a lot of these decisions were not, like, high-level analysis done every quarter; this is day-to-day. It’s a combination of operational issues that are getting solved, along with clear inputs that go into planning, and segmentation, and creating more inventory, all the way up to the leadership recognizing on public forums that this team added a billion dollars in revenue. You know, that’s the ROI we’re talking about. But in terms of you know, what led to this, I would say that it is the brilliance of the—or the intent of the people involved, both from our side and the client side.

It is taking, learning from whatever is happening on this whole topic of—and at that point, the whole—this is five years ago, this is six years ago, the inception, right—and at that point, even the data product as a term wasn’t that well established, at least in this community that I’m talking about. Borrowing from product management, the principles, and applying them here because it checked all the boxes: it was complex enough, it was multiple personas, it was going to evolve over time, we couldn’t make all the decisions in one go, it wasn’t going to be a done deal, and we knew this will grow. So, unless it is, budgets are set aside for multiple quarters or years, unless it is every quarter, you show the ROI and you get into this cycle, we knew this wouldn’t work. And everybody put in the effort and results are there. It did not start with a multimillion dollar initiative to begin with. It was very much a case of some people got together, looked at real problems, initiated, and then kept building it on. Today, it definitely is a multimillion dollar initiative.

Brian: Mm-hm. So, it sounds like you’re saying a data product team—whether or not they have this label or not—but some team that cares about producing actual benefits—

Shashank: Yes.

Brian: —can make some progress, working this way, without having a mandate or having a huge organizational initiative, or McKinsey comes in and says, “The recommendation is you should build data products instead of the old stuff that you were doing.” And then they walk away and great, what does that mean? So, it sounds like you’re saying it can come bottom up.

Shashank: This is definitely a case of bottom up. But the organizational leadership, recognizing the value of it very quickly—A, the people who were building this articulating very clearly along with business—not just tech, right—jointly, “Hey, this is what we are producing.” I’ll tell you another part of it, right? I think—and this is interesting, right—I think at some point, what happened was that once you taste some success, right, the business stakeholders will build on it, and they will say that, “Hey, look how far”—this is not necessarily in this case, but I’ve seen happen in other cases, right, have the initial success and establishment, right—you go and become—and you make the case that how much more you can do. “Hey, so we can integrate all the external data now, and we can extend that to partners, and we can build clean rooms where this interchange of data will happen above this platform, and hence we will create that much more revenue or margin or efficiencies,” right.

And some of that didn’t work out. So, but I think at that point, when people conceive that some of this was possible. Was it possible? It was possible. But the business assumptions around their ability to onboard new partners, being able to sell that vision, and hence the ROI and hence the investment didn’t play out. Yeah, but definitely not what you are saying, which is, you know, downward mandate, coming right from the top: “Here, go build data products.” No. This is just working through the trenches and building it up.

Brian: Yeah. I’m glad to hear you say that. I think it can feel, especially in large and even more so in very risk-averse… organizations [laugh] that, “I believe in all this, and it makes a lot of sense, but”—you know? And then there’s the culture, and there’s the way we did it yesterday, and there’s all the risk, and like, any type of risk is dangerous, and it’s really hard to push through it. So, I’m glad to hear that there are examples in the wild here that a small team can start to—you don’t need to be management to be a leader. You can start to change that culture. Doesn’t mean it’s easy.

And they might have some luck here. Like, maybe they have the right culture there, or the stars align properly in that case. And I do want to talk about what maybe the average client experience looks like for you. If this is the exceptional one, maybe you can tell me a little bit about what’s the average climate you’re walking into? And I want to start with, do you have a buyer come to you and say, “Hey, dear Infocepts, we want to build some data products around here.”

I kind of don’t care what the labels are here. I mean, we talk about data products on the show because we all get the joke, and we know what we’re talking about here. Are they coming in and asking for this, or are they coming in with a technology solution that they want made? Are they coming to you with an actual problem with no predefined solution baked into that request? I’m kind of interested to hear what the top of the funnel looks like when you get on a call with a new prospect. What’s the language, what are they talking about, and what’s the climate inside their business like?

Shashank: Yeah, so I think for us, at least, right, and for most of our clients, and when we are going and looking at new business and talking to prospects in shows, we rarely use—at least us—we rarely use the data product language and whether it requires a project or one time or a platform or a product. It’s often about what problem are you trying to solve. And it is either inventory optimization, or sales planning, or revenue lift, or on the cost side, right? And as you dig deeper into those things, and you talk to the users, and you talk to their tech teams which are producing this data, in today’s world, if you’re [walking 00:17:18] to a new client, you will have some semblance of a data platform. It may be an old-style warehouse, it could be a lake, but just something will exist, and some sort of cloud pieces will also exist, that’s fairly common.

But it’s often what we realize, when new clients come to us, is that gap, you know? We talk about this bridging the gap between the worlds of business and analytics. And if you surveyed any data leadership, they’ll say, “Oh, 90 percent-plus of our products are yielding business value.” And you will survey the business users, and get a big gap. You’ll see, like, less than 39 percent. And that gap has stayed consistent: a 40-to-50-percentage-point gap between the perception of users versus the tech leaders who are producing it.

And the wider the gap, you know that something is broken: the coordination is broken, the way they look at things is not working out, and that’s your opportunity. And you apply the checkbox that, hey, this is a sustained problem that’s going to require continuous effort and is going to evolve over time. And we see, are these type of users, once they see merit, will continue on and continue to invest in this? Then you, you know, take the data product approach.

But if any—actually, in most of those approaches also is very—because mostly when you’re coming in, that trust is not there, so you want to show some quick success. And it could just be Excels. It could just be, you know, Power BI or some sort of output visualization in the tools that they use, right? And you show that success, and you build the case, and then you—it is often six, nine, twelve months cycle to be able to get [unintelligible 00:18:55] all the support and budgets to ensure that, you know, something like this can be built, you know?

Brian: And typically on the other end of your call, are we talking to someone in IT, or a data team that needs additional resource? Or you talking to a business sponsor who either literally doesn’t have the skill in-house or feels like they don’t have the skill in-house? I’m trying to get the perspective of the buyer that you’re working with. So—just to put this in context a little bit more?

Shashank: Yeah. Actually, we can go both ways. Sometimes we are talking to the data leadership. There, the problems are very well-defined, which is tech debt reduction, consolidations, migrations, more capacity to do stuff that we are already doing because the skills are hard to have. Or in many cases, I have realized that this is a candidate to build a sustained platform product type approach. I don’t necessarily have the skills. Looks like you guys do.

And there, the job is actually easier. I wouldn’t say success is guaranteed because at that point we don’t know how the business is going to respond, but at least they have a [unintelligible 00:20:01] that something like this is going to work. So, that makes our job a little bit easier, at least to start with. But then on the other end of the spectrum is the business side. And more than likely, they are in the bucket of coming directly to us, or we happen to talk to them where they have a problem, but they believe their teams are busy doing things, or they’re not getting the right attention, or they’re spending the money, we are building the stuff, we are putting stuff in Snowflake, Redshift, we have slapped BI tools on top of it, we have custom interfaces, but something about that always falls short of expectation.

That is a harder problem to solve because there’s already stuff that you’re investing in, but something about that is broken, and you can’t articulate what and why. And [originally 00:20:51], the tech skills, the whole technology stack and the process that you’re using, along with the users, right? So, you have to diagnose. And there the approach of, okay, can I solve this problem one time for you, and make it look ideal, right—like the short wins, right—and if I can make that happen, then with all the principles in place, and the approaches in place, right, persona-based approaches, you know, prioritizing by business value, we do all that, and then we should be on a glide path from there.

Brian: I want to zoom out for a second. We started to go down, and I want to go back up for a moment. Why are we talking about this right now in 2024? Like, business value from data. Of course, I want that. And we’ve all been talking about it. And you’re saying there’s a 50 percentage point spread between what the business says the data team is providing and what the data leaders say that they’re providing in terms of value. So, there’s a gap here.

How is this whole data product thing changing any of that? If I was to play devil’s advocate in the role of a skeptic, I would say that that sounds awfully like a new label for an old thing, and it’s kind of like, we’ll just throw Agile on it, and now it sounds like we’re making everything faster. We’re getting better stuff, and we’re getting it faster, just because we put this name on it. So, unpack that for me. Like, what’s different now in 2024? If I was a buyer on the other end of a call, like, what—and I know that you’re not necessarily talking about how the sausage is made, maybe, with them, but for those of us that do understand this, like, what’s different? Like, what’s wrong with the old way?

Shashank: Well, first of all, to answer why we are still talking about it: I’m sure people recognize—a practitioner would know—the gap arises due to the unique nature of data and analytics solutions, right? I mean, it’s a confluence of data, technology, people who are implementing that technology—which is the IT leaders and the business users, right—sort of three or four things that have to come together. And by nature, IT leaders often limit their roles to provisioning their secure data, and then they rely on businesses to be able to generate insights and take actions. And then sometimes this handoff works, and sometimes it doesn’t because of quality governance—you know all the issues that exist in this field, right? So, that’s why the gaps exist, right? And I’m definitely not in the bucket that you sort of slap Agile on everything.

And by the way, if you remember the Agile—I’m sure you do—we didn’t. Actually, there was a phase where we just slapped Agile on everything, and things broke [laugh], right? But now both coexist, right, and we are far more Agile than everything else, right? And it wasn’t the panacea, but people generally believe it made things better. I think this is one of those things that are similar.

In cases where it is not a one-time done deal and fairly static or can be quickly addressed for that, you know, changes are relatively easy to make—which as I say it, I’m realizing may very well apply to many of the cases, right, with the way data is getting used these days—I do think that a data product approach works better because it puts so much emphasis on the user personas, the use cases, or in this case, analytical cases that they value the most, so there’s the relative prioritization, and their involvement just happens to be better. That’s all. That’s what’s really changing here versus the old time of delivering, where, hey, I’m going to put a data platform together, which is all these fields and all the data. I have done a one-time assessment of what my users want, and I’m just going to produce it, and produce a platform, and assume that people will find what they’re looking for on this platform. That’s the old approach. I don’t know if that answers your question [laugh].

Brian: [laugh].

Shashank: Not an easy one.

Brian: No, I—well, that’s why we’re here: getting into the dirty, the difficult ones. You’ve mentioned personas a couple of times. Personas have a defined meaning in the user experience design community and profession. I’m curious: getting to the use cases, people’s attitudes, beliefs, their aspirations, all this stuff requires actual end-user research, which has nothing to do with building anything at all. Yet so often, we hear, “We don’t have time for that, Shashank. We need to get this out tomorrow. Like, we should have been doing this six months ago, and theta.com, they’re already pushing this out now. We know what the ad tech team needs. We know what the marketing team needs. They need CPM over X, every six months, sliced by week, on average. They need cost per lead. Can you do it or not? We already know what they need, and they’re bogged down in the campaign for Christmas coming up, so they don’t have time to talk to you.” Am I playing out a realistic movie that you hear sometimes, or [laugh]?

Shashank: No, I think what you’re saying is very realistic. And what you just said can come from a CIO, or Chief Data Officer, or it can also come from the head of a business at the top, right? But I think there are examples that we are able to show which say that, although you might have that CPM view, it is these 19 other variations of it that you’re missing which are really important to this one user and this other user. And—

Brian: But how do you know that? That presupposes that you had exposure to people to even know that there are 19 other use cases? Right there, that assumes that somebody had an exposure time to an end-user. I already know design—user experience teams who are literally tasked with this job regularly have trouble getting access to customers because the culture is against talking to users. They think they know what everybody needs.

So, for inexperienced teams that don’t do user and customer research regularly, I can only imagine it’s even harder because they don’t have the skills necessarily, or the training to navigate those blockers, and to help the organization learn how to slowly change their perception here. So, I’m just kind of curious, like, how do you navigate this kind of situation here where they think they know what they need? And maybe you’re coming in with an objective, like, maybe they do need CPM, but we need to understand the context in which CPM is used in order to figure out how to productize it properly. How do you navigate that? Or am I just making this up? I don’t want to—

Shashank: No, no, no, I think this is an absolute reality of the whole field of data, I would say. I would say there are two, three things that help you, right? I think it is a generally accepted norm now that whenever you kick off something, there is some sort of whiteboarding or a design thinking kind of session where you engage at least a representative set of users of the application. And then using, sort of, rapid prototyping techniques, the more effective you can make those sessions—nobody’s going to give you unlimited access to all those users who are busy running those campaigns, as you say, right, Christmas campaigns, right—but to be able to bring the right set of users for one, two, or three sort of initial meetings where a lot of those storyboarding sessions are happening, those are the keys to success. One thing that does really help us, Brian, which has become very—not very, but relatively easier than what it used to be ten years ago: the advancements in the tech provided by the hyperscalers, some of the platforms that we use, and some of our own, sort of, rapid prototyping techniques and accelerators, right.

So, our expectation is when we build data products—so what we’ve done, at least at Infocepts, Brian, is we have a productized platform called Decision360. And Decision360 is sort of this tool—not a tool, like, a platform—which has data pipelines built in, it has got some analytical modules in it, it’s got a UX, and the UX has four or five different views with different pages for each persona. So, this whole, sort of, loosely coupled infrastructure already exists. So, to the point you made, right, for us to rapidly prototype it and show it to you in that first storyboarding session, if not the first then the second storyboarding session, is very easy.

And data is the kind of thing, right—I may send you a questionnaire; you’re not going to respond. The moment I show you something, it’s, “No, Shashank, this is wrong. No, Brian, how could you say this negative—how can attrition rate be negative?” Right? So, data is the kind of field where people can react very, very quickly to what’s wrong.

So, the trick is, between the first week, and the second week, and the third week that you’re running this cycle, you’re making very rapid iterations, knowing this is fully wrong because you got no views. And the moment you put something wrong, everything comes out, right? So, you kind of use a combination of those techniques, along with a pre-built—I don’t want to call it an accelerator, but, like, a platform—where your ability to prototype using real data is very, very fast.

Brian: Got it. So, just to play this back, you’re saying that during this, kind of, early discovery phase, you all try to prototype some type of a north star or a vision—

Shashank: Yes.

Brian: Of what it is, get it visual quickly because having it visual is what gets people invested in saying, “No,” or, “Not quite,” or, “No, it needs to be this,” or, “Yes, and.” Is that kind of what you’re saying?

Shashank: Yeah, I mean, just to summarize, right, if you imagine ten, 15 years ago, that would be an Excel file, an Excel wireframe. Then, you know, five, seven years ago, it would be, like, a—and now, Figma. But you know, earlier, whatever the equivalent UX prototyping technologies were, right? Today, for us, it is a Power BI or Tableau-based UX. And they can click and feel it, and they can slice, and then go down within the first two weeks because everything is rapid, and everything is 80, 90% ready.

I mean, you know, in the field of data—like, it doesn’t take too much to get to 80, 90%. But to go from 85 to a 99, or 100—that’s where you put all the rigor, right, the engineering, and the quality, and the governance, right? But just visually seeing where we are headed should be very fast. Like, I don’t want to see any more Excel wireframes. Not needed.

Brian: But even before we get into digital prototypes like with Tableau or Power BI, prior to that, there has to be some kind of disc—well, there doesn’t have to be, but it’s much easier to get to a good prototype if we know what the inputs to a prototype are, which are data about the people who are going to use this: scenarios, use cases, attitudes, beliefs, all these kinds of things. What kinds of roles are in the room doing this ideation work, or brainstorm, or design thinking, or whatever you all call it? Who’s in the room? Like, what roles are kind of like minimum requirements for you all when you do an engagement? Who runs that?

Shashank: So, in our case, generically, we’ll call it a solution team. If you look at the process we follow, it often begins on the other side, the business side, with some frustrated user, or a visionary guy who’s thinking two steps forward—many more frustrated use cases than visionary ones, honestly—because they have a problem. And the more painful the problem is, the easier it is because they are that much more open to giving you all the information that you just talked about. So, that’s from the client side. From our side, it’s going to be somebody who has had a background in just doing all this. And for us, we just call it, generically, a solutions consultant.

But they have had experience in design thinking, they have a good understanding of how data works, and—I’m not getting the right term—they are people who are good at prompting people to speak. I don’t know what you would call it, right, but people with a combination of those skills are the ones who first engage. And what we have seen is, sending long questionnaires and asking people to document things almost never works in this case. Another thing that works for us is, through what I call our Decision360 platform, we do a lot of show and tell. So, even for the users to develop their own thought process, there are a lot of use cases, and over time—because others have done it in the industry, right—the more you’re able to show them, “Hey, when we were working with XYZ company, they did this, this, and this. Do you think something like that? Or what about this? What about this?” Right? So, having them be able to, like, explore a use case library also allows them to develop their thinking on what possibilities they could pursue.

Brian: One of the things I was curious to get at is, like, so if you have, like, a—‘a’ meaning, like, one—or two representatives from the client in the room during these early ideation phases, in my experience, what often happens is, it’s like, we’re going to send Mark—so let’s say it’s your ad tech example; I’m just going to pretend—it’s Mark from marketing. Mark manages the ad campaign managers. And Mark used to run his own campaigns. And he was really good. He got the lowest cost per lead on average, and seems to know something about designing ad creative, and working with the agency, and blah, blah, blah. So, we’re going to put him in charge of management, right? Well, he hasn’t done that for five years, but Kathy is running campaigns all the time.

And I guess one of the things I’m getting at is, like, Mark is a proxy for Kathy, but we don’t know if Mark’s way of working represents Kathy’s way of working. Is it out of date? Is it that Mark actually had a very convoluted process that is just his own thing, and it worked, but it’s not easily repeatable, and it’s really out of touch? You know, and Kathy has different incentives now than Mark had because, for Mark, there was no management team. Mark was just this renegade guy over here who would magically create amazing ad campaigns, and now there’s all this structure and incentives and blah, blah, blah, and this is why Kathy demands CPM—you know, a weekly average CPM—because her incentives are aligned that way, but we have a proxy telling us what Kathy needs.

I’m wondering if you’ve seen this, and how do you navigate it, especially if your consultants’ spidey senses are going off, and they’re saying, “Mark, I hear you. I know that you really want the CPM metric. It sounds like that’s really important.” And in their head, they’re like, “That’s not going to help you, but there’s something behind this request. I know there’s a reason you want this, and you think that this is going to help you, and we could make it, but we don’t want to waste your money.” How do you navigate that to get access to Kathy to do that research or whatever needs to happen there? Am I making this up? I don’t know if I’m making up a realistic scenario. But I have seen this a lot, where you get a proxy, and the proxy really is not indicative of the end-user.

Shashank: I agree that it happens quite a bit, but I also think that it’s getting better. And I almost want to say that it’s a little easier in the field—should be relatively easier, in my experience, at least the experiences we have had in the field of data. I mean, we do two or three things. Obviously, you apply the five whys. It’s very popular at Infocepts. You know, “Okay, you want CPM. Why?” “XYZ.” Okay, you understand why, and then you keep probing. I think that just helps you dig deeper and makes that Mark guy realize that he may be missing something.

But that doesn’t always work. So, you get into something, you build something, and your ability to iterate more and more, and to bake iterations into your initial estimation, is how we handle it most of the time, because regardless of how much you argue and ask for access, if that person is not willing to give it to you, they won’t, and then you will miss the mark. And actually, there’s another thing, Brian, right? We see, “Oh, I want to see that report exactly like that. Remember, that column needs to be here, and then this row has to be here. And of course, you have to click a button and export to Excel,” and they can’t explain why.

I mean, that’s the standard in the field of data delivery, right? But I think some of that has gone down over the years. I don’t get this as much as we used to, even three, four years earlier. And I think a lot of it has to do with just the general awareness of how important this is, and people are giving it a little bit more attention. The prototyping has become really fast, and providers like us have become smart and said, “There’s no point in writing long documents and creating Figmas.” You know, let’s just put this whole thing down in less than 30 days, have something in their hands, knowing it is broken, and set the expectation that, guys, this is going to iterate three times.

And you know, you’re not writing CRs or change requests when the guy changes the whole dashboard on you, or the whole analysis on you. It’s okay; as long as the data set remains the same, you’ll look at it another way, no problem. I think the incremental cost of change in developing stuff has substantially dropped as well. The more multi-skilled people you have—who can do data analysis, who can do data engineering, who can also build reporting, who can also talk about the analytics and the predictive stuff—so, the more cross-skilled your people are, the easier this becomes because there’s less friction. You have to bring the cost of change down because the problem that you’re trying to explain, you don’t have control over that. You can try to explain it to the Mark guy [laugh], but if he’s not going to get it, he’s not going to get it, and you have to find alternate ways to achieve the same thing.

And if I were to summarize: having a pre-built set of accelerators or platforms which allow you to reduce the cost of change, and having multi-skilled people who can span the whole thing—like, for data, you need a separate person, and then for designing, you need a separate person, and for analysis, you need a separate person. The more you can combine those—I don’t think you can create superhumans who can do all three, four disciplines, but at least two disciplines, and can appreciate the third one—that makes it easier. Yeah, and just rapidly iterating is how I would look at it.

Brian: Mm-hm. Are any of the clients that you’re working with—do you find that they have successfully, or are currently, in a measurable way that you could see, adopting a product orientation such that the person that you’re working with has a responsibility to deliver benefits to the org? So, in this case, Mark, the ad tech stakeholder on the other side, is a data product manager or whatever the title is, but this person has responsibility. Let’s say it’s, “I need to reduce the human labor it takes to onboard, launch, and A/B test a marketing campaign. We want to try more campaigns per week. It takes forever to get one out the door. I want my team to do four per week and they can only do two. That way we can better optimize our ad spend because we ran four experiments instead of two, and instead of waiting, you know, eight weeks to test out a bunch of stuff, we cut it to four weeks, so we’re just spending less money.”

So, there’s a financial incentive here. We also make our team happier because it’s less laborious for them to run these experiments, and they feel like they get to the optimal campaign faster. Do you see this product orientation happening at all on the buyer side of your clients, or is this a pretty rare, rare thing?

Shashank: I do see that orientation increasingly, much more than ever before, at least with the types of clients that we are working with. I don’t know how much of that is attributable to our teams, who just work day-in, day-out, hand-in-hand, with this whole, sort of, product management and product marketing kind of approach. But even without that, Brian, the value orientation is definitely there. Everything that is happening in the data and AI community, with the whole GenAI hype, and with everybody pointing to data enablement all the way up to the board—I call it good times for data [laugh] professionals, especially for the data product community, right? I mean, you couldn’t be living in better times than this, where there’s pressure from the board, there are people asking questions: “Are you being data-driven?” “Is this backed up by our own analysis?” And, “What are you predicting?” Right?

I think that level of awareness—that’s a sea change, pre-COVID to post-COVID, at least in the last three, four years. And I do think that a lot of data professionals may not have fully adopted the, sort of, product mindset. They still may relate their jobs to, hey, we’ve created the data platform, and you know, let’s work with the users on what they define. But the orientation toward value delivery, toward not treating data as a pure IT project, is definitely there. And I see that just going up.

In fact, you know, sometimes we internally talk about it, and when we scan all the work that we are doing across all our clients, there is a bucket which says, hey, not everything needs to be built as a product. Because there is overhead. I mean, you were just talking about, you know, the user research and access to those guys, right? I mean, that’s more work, that’s more time spent for everybody. That’s expensive. And then being able to manage and maintain it over time is another line item on top, right? So, I think the more people can establish—and we clearly establish this at all our clients—clear scenarios as to when you would want to invest in a data product-type approach versus cases where you won’t, that is also very useful.

Brian: Mm-hm.

Shashank: And not everything requires a real product.

Brian: I agree with that as well. I think it’s about larger initiatives: the amount of value that’s on the table, the frequency of reuse, how many literal human beings are using this thing. Is this a special thing for the CEO, or is this a [laugh]—the entire ad delivery campaign management team is using this thing? So, I fully agree with that. You mentioned AI here, and just as we start to wrap up: the solution space of AI, particularly GenAI right now, but AI—we’ll say machine learning, because in the analytics world, when we think of AI, we’re talking about multiple different delivery methods here. I think AI is starting to become GenAI to a lot of non-data people. It’s like everything is GenAI [laugh].

But since we know that that’s not the case, is any of that delivery mechanism changing anything about having this product orientation? Are there new risks, particularly on the user adoption piece, or the value creation part for the business, or any of that? Putting aside all the technology issues with clean data, and ethics, and bias, and the pipelines you need—all the input stuff, which I understand is a lot of work—is it creating any new challenges or things that leaders need to be aware of that are different because the solution itself is predictive in nature? It’s guessing. It’s not perfect. It’s giving out answers that are not always the same. Is that changing much, or does it really not matter what the delivery mechanism is?

Shashank: No, I think you bring up a good point. I will just share some of my thoughts on this topic, and it may or may not exactly answer the very question you asked. But first of all, AI has existed in the field of data forever. We call it the ability to predict. So, what is AI? It is the ability to predict, right? You’re predicting a forecast—a financial forecast, a sales forecast, whatever that may be, right—and with GenAI, it is predicting the next word, and hence, summarization.

So, for me, in the simplest form, we internally look at the ability to predict and the ability to summarize as the overall field of AI, and the second as GenAI. With that said, there is a huge demand, or surge in asking, for how this becomes far easier and conversational because of the summarization and the multiple capabilities of GenAI. I have largely seen most clients being very cautious and not going in the direction of slapping a GPT-type, or GenAI, model on top of all their data products. That’s not the approach that I’m seeing. Which is good.

Although you see vendors releasing their conversational interfaces—as a user, you can talk to the analysis now; as a developer, you can develop using GenAI; as a designer, you can design using GenAI. I think we have to tread that cautiously because, as you said, right, data is all about accuracy, in many cases. So, I think we have to be very, very conscious of that, which I think most experienced people in data are. One thing that I would caution business users about is to not fall prey to: hey, I can come in and solve your sales analytics problem—the one that the last $10 million worth of spend over the last ten years has not been able to—because I’ve got this new GenAI chatbot. As long as we stay away from that, I think all of this is good.

It has real uses; it is helping us be more productive. Some of the things that I talked about earlier, Brian, on having the same people do multiple roles—I think copilots can really help there, right, because now I don’t need to wait, you know, and I can quickly mock something up. I may not be the best designer out there, right, but I can create, like, a wireframe, at least, on my own now if I’m a developer. I think some of those things are very, very positive. But yeah, let’s not assume that this helps you fix… let’s just say that, you know, there are no quick answers here. So, it is a complex problem, and you need experts.

Brian: Sure, sure. Well, it’s been great to talk with you. I know you have a resource called “Top 5 Data and AI Initiatives for Business Success.” I’ll drop the [link 00:47:52] to that in there, but can you give listeners a preview of who exactly that is for? Like, what kind of person would best spend their 10, 15 minutes reading that, or however long it takes? Like, who would you say the target is for that?

Shashank: Yeah, it’s just a summary of what we discussed here. Maybe not as much on the data product side, but you know, from our experiences, whatever is going on today—it’s data leadership, data product leadership, the people who are building analytics products, business users who are looking to go after this journey and wanting to revamp, or, you know, just starting on this journey. They will find it really useful.

Brian: Got it, got it. I’m going to give you the last word in case you want to share anything I didn’t ask about today, but in terms of Infocepts and the organization that you run—so, you’re in professional services, you provide analytics and AI consulting services—who’s, like, the primo customer for you? Like, if you could just say, “Oh, I love working with people like this,” what do they look like? Like, who is your ideal customer?

Shashank: I think you kind of covered that when you were talking earlier about the problems that exist, right? I think we all want to work with people who truly believe in the disruptive potential and value that data can bring to their businesses, right? So, someone who is in a competitive space, who’s looking to truly embrace and analyze internal data and external data to drive both their operational—and I use the word operational all the time—and hopefully strategic decisions as well, right? I think the belief has to be there. Alignment obviously helps.

They may or may not understand all the time how to make the data, the technology, the people, the business users—all of them—dance and tango together. That’s okay. As long as the intent is very clear, and they’re willing to walk with us in that direction, those are the kinds of people that we love working with. And fortunately, with everything that is happening in the whole field of data and AI, and thanks to ChatGPT as well, we are finding more and more of those people.

Brian: That’s good. Any particular industry specialization or problem space specialization or anything like that, that you think Infocepts is, like, particularly positioned well at solving for?

Shashank: We thrive on complexity. So, essentially, we’re looking for situations where the problems are complex: where you have to build a unified data product layer, you’re looking at cutting down costs at a massive scale, you are about to do a transformation, you’re looking to rapidly build multiple products for multiple teams with rapid deployments. Like, if pace is really important to you, I think Infocepts would be a really good fit because of the way we’ve organized our knowledge. We call it a set of foundational components on which we build two or three platforms: one for the AI side, one for the [unintelligible 00:50:42]360 side, one for managed services. For those, we would be a great fit.

Brian: Got it. Excellent. Shashank, it was great talking with you. I do have your LinkedIn profile. Is that the best place to link up, or are you on other social networks or anything like that that you want to comment on?

Shashank: No, LinkedIn works best for me.

Brian: Excellent. Excellent. And just like any closing thoughts here? Is there anything I didn’t ask that you think our listeners should hear something about as a closing thought?

Shashank: There are a few things that we talk about a lot—especially because, you know, of the Data Product Leadership Community—a few things that come to mind that we often talk about internally, as well as, you know, with clients, right? A few things, right? So, we say: don’t just build data products; build data experiences. That, sort of, changes the game. And if your data product isn’t driving actions—so you really build in those actions—it’s just another expense. So, probably a reason to [unintelligible 00:51:35]. So, just hyper-focus on those actions as well.

Brian: Excellent. Well, you’re talking to an experience designer [laugh] and consultant here, so I love hearing that you’re focused on the experiential part because obviously, it’s so tied to adoption, and adoption is so tied to business value, at least in my model. You can’t possibly create business value if no one’s using it.

Shashank: Yeah—

Brian: Right [laugh]?

Shashank: —exactly.

Brian: So, that’s great closing thoughts. So, thank you so much for coming on the show. It’s been great to talk with you, and I look forward to linking up all your resources here as well.

Shashank: Thank you, Brian T. O’Neill. It was great talking to you.

