In one of my past memos to my list subscribers, I addressed some questions about agile and data products. Today, I expound on each of these and share some observations from my consulting work. In some enterprise orgs, mostly outside of the software industry, agile is still new and perceived as a panacea. In reality, it can just become a factory for shipping features and outputs faster, with positive outcomes and business value mostly absent. To increase the adoption of enterprise data products that have humans in the loop, it's great to have agility in mind, but poor technology shipped faster isn't going to serve your customers any better than what you're doing now.
Here are the 10 reflections I’ll dive into on this episode:
- You can't project manage your way out of a [data] product problem.
- The more you try to deploy agile at scale, take the trainings, and hire special "agilists", the more you're going to tend to measure success by how well you followed the Agile process.
- Agile is great for software engineering, but nobody really wants "software engineering" given to them. What they do care about is the perceived reality of your data product.
- Run from anyone that tells you that you shouldn't ever do any design, user research, or UX work "up front" because "that is waterfall."
- Everybody else is also doing modified scrum (or modified _______).
- Marty Cagan talks about this a lot, but in short: while the PM (product managers) may own the backlog and priorities, what’s more important is that these PMs “own the problem” space as opposed to owning features or being solution-centered.
- Before Agile can thrive, you will need strong senior leadership buy-in if you're going to do outcome-driven data product work.
- There's a huge promise in the word "agile." You've been warned.
- If you don't have a plan for how you'll do discovery work, defining clear problem sets and success metrics, and understanding customers' feelings, pains, needs, wants, and the like, Agile won't deliver much improvement for data products (probably).
- Getting comfortable with shipping half-right, half-quality, half-done is hard.
Quotes from Today’s Episode
- “You can get lost in following the process and thinking that as long as we do that, we’re going to end up with a great data product at the end.” - Brian (3:16)
- “The other way to define clear success criteria for data products and hold yourself accountable to those on the user and business side is to really understand what does a positive outcome look like? How would we measure it?” - Brian (5:26)
- “The most important thing is to know that the user experience is the perceived reality of the technology that you built. Their experience is the only reality that matters.” - Brian (9:22)
- “Do the right amount of planning work upfront, have a strategy in place, make sure the team understands it collectively, and then you can do the engineering using agile.” - Brian (18:15)
- “If you don’t have a plan for how you’ll do discovery work, defining clear problem sets and success metrics, and understanding customers’ feelings, pains, needs, wants, and all of that, then agile will not deliver increased adoption of your data products.” - Brian (36:07)
Resources and Links:
Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. And today I’m going to talk to you solo again. The topic today is going to be about how enterprise data product leadership does not equate to doing agile.
So, I know agile is still new, and particularly in the non-digital space, large enterprise, traditional companies are sometimes still at the beginning of adopting agile methodologies, and there’s also a lot of association with product management and product ownership and this whole topic of data products. So, I kind of want to uncouple these things and make sure we’re not equating the two because they’re not the same thing. And so, I just want to reflect on some experiences I’ve seen in the wild with clients of my own, and some traps you probably don’t want to fall into, as well. Some of this stuff took years, I think, for software companies to figure out, and a fair number of those bumps have been worked out.
Some of them haven’t, so if you’re towards the earlier part of your agile adoption, hopefully, this episode will help you think about that, particularly if you’re in an enterprise data product leadership type role, you’re leading multiple initiatives, and you’re really in charge of ensuring customer value, business value, usability, utility, all those good things that we talk about on the show all the time. So, here are ten ideas I want to share with you. Let’s dive in. The first one is that you can’t project manage your way out of a product problem. And this also gets into my thing about the initials PM, which get used in business quite a lot, and in some places, that means project management, and in other places, that means product management. And what’s really scary is when organizations or teams sometimes think these are the same thing because the roles are quite different.
So, what do I mean by that? Well, leadership in data products isn’t about following the process and expecting to be lauded for following the process, whatever that form of agile is, or not, that you’re currently using. And I’m not saying project management, or project managers, aren’t sometimes needed, but I’ve seen situations for sure, in large organizations, where program management—which is really the centralized project management hub—has really taken over the entire product initiative. And so effectively, the quality of the product is measured by how well it’s following the agile process.
And that has nothing to do with customer value at the end of the day. I’m not saying it’s bad to have people help you stay true to a process in order to repeatedly push out good iterations of work, but you can get lost in following the process and thinking that as long as we do that, we’re going to end up with a great data product at the end. So, it’s really important that you have a data product worth creating in the first place. Otherwise, it doesn’t matter how well you carve that up into sprints, if you’re using Scrum. It doesn’t matter how well you follow all the process if the product itself doesn’t matter because it’s not aligned with what customers want or need.
So, again, two forms of PM. Just remember project management and product management aren’t the same thing. The product manager’s job is not necessarily to project manage everything from a day-to-day standpoint. They’re obviously heavily involved with setting the trajectory, the strategy, the backlog of work, and all that kind of stuff, but if you’re just doing project management all day, you’re not really doing product management, and you need to find people who really have a vision for the problem space, who love the problem space and want to own it, and who can get a team together that has a shared sense of ownership over that. You don’t want people who just want to follow agile because they went to a boot camp or some training and they think, “Oh, finally there’s this way to build data products and software that’s fast, and if we just follow it, we’re going to end up producing this great stuff.”
It doesn’t work that way, and it’s not a substitute for having good research to inform what’s actually needed, to understand the constraints of what’s actually technically possible, and to understand the design and user experience piece, as well. So, that’s kind of my rant on PMs and PMs. [laugh]. So, number two: the more you try to deploy agile at scale, taking the trainings and hiring what I call the agile-ists, the more you’re probably going to tend to measure success by how well you follow the process. I kind of dove into this one earlier, but I want to carve it out as a specific one here.
So, the other way to define clear success criteria for data products and hold yourself accountable to those on the user and business side is to really understand: what does a positive outcome look like? How would we measure it? What are the criteria by which we’re going to measure whether we did a good job when we look backwards in three months, or three weeks, or whatever the timeline is? How will we know if we did a good job? And that’s both for the customers and users, but also for the business stakeholders, if your stakeholders aren’t actually your users. Sometimes they are, sometimes they aren’t. That’s the bar by which you want to be measuring success.
And it also will help you make sure that you’re not simply measuring agile progress metrics and how well you’re sticking to the process, right? Process people tend to love process, but no customer cares about how good your feature estimates were, or the fact that you did a sprint retro and the retro was really productive and everyone got a lot out of it. That’s all great. But the customers don’t care, the users don’t care. They don’t care how the sausage was made. It’s completely irrelevant.
Typically in the data product space, we’re talking about informing better decisions, right? So ultimately, did we or did we not help someone make better decisions with this iteration of work that we’ve put out? That’s the thing we’re measuring. And if they are making better decisions, then how are we measuring the business impact of those? That’s the business side of it, right? That’s probably more the direction you want to be measuring yourself, not on the sprints and how well you estimated the workload, and the burndown charts, and all that other stuff. The real point of agile was agility, and especially when I’ve seen the SAFe framework and Agile at Scale and all this kind of stuff, it’s really easy to see that agility is lost.
And following a rigid process, which really wasn’t the point of agile, is what overtakes the entire thing. So, sure, get training on agile if you need it. That’s not what I do, and there’s nothing wrong with that, but just remember: simply following the process doesn’t equate to putting out better stuff. It just provides a form of discipline, and really something targeted at software engineering. That was really the focus agile was originally formed around.
So, we have to figure out how design and such fit into that as well, and I’ll talk about that a little bit more later. But just really be careful about what you’re measuring, and make sure you’re not letting agile take over the success criteria of the particular initiative that you’re working on. You’re probably going to hear some rhyming going on here. So, number three is that agile is great for software engineering, but nobody really wants software engineering given to them, right?
They just care about the perceived reality of your data product. So, agile usually assumes that all design could happen within the confines of some implementation cycle, often called a sprint, for example. And this is a trap, in my opinion. You can spend the next decade trying to figure out how to fit user experience design work into agile, which it was never conceived of doing.
And I’m generally going to use Scrum and agile interchangeably. There are other forms of agile out there, but because Scrum—or modified Scrum and customized Scrum—is kind of what most teams are using, I’m going to use those interchangeably for this episode. So, you can probably leapfrog others if you don’t want to spend the same amount of time that many software teams have spent trying—and many of them are still trying—to figure out this dance: where does the design and user experience work, particularly stuff that doesn’t seem like it fits neatly into short-term cycles, fit in? That’s the question they’ve been asking. The most important thing is to know that the user experience piece, while it might sound a little bit hand-wavy, is the perceived reality of the technology that you built for the humans in the loop. That is it. Their experience is reality.
So, it doesn’t really matter what else it could do, or the way the data was modeled, or all this kind of back-end stuff. It doesn’t matter. It’s just their perception; that’s really the truth. And so, no matter how accurate the models are, how fast the engineering is, how solid the calculations and the data standards are, and blah, blah, blah, the outcome is all that matters. It’s not the outputs; it’s the outcome and the experience that people have with it that the customers are going to measure you by, right?
So, at a higher level here, I think the best framing I’ve heard about this is probably from Jared Spool, who talks about how we do agile inside of UX work. And by UX, I mean that UX isn’t something that’s just owned by the design and user experience team; UX is the thing that the entire product team—by which I usually mean anyone that has a voice in making it, so your product management, your technical team, your engineering, your data science analysts, as well as your designers—everybody is contributing to the experience, right? So, if we’re deliberately trying to design the experience, then a better framing is: how do we do agile in our world of doing UX work, fitted in that way, instead of the other way, which is trying to figure out how you do a little bit of design inside each sprint.
Again, this is still a tricky area, though, that even mature software teams are dealing with. It’s definitely gotten better. There’s a lot more awareness now that user experience is integral, especially to commercial software products, because commercial software products know that they’re dependent on the experience, the utility, and the usability in order to stay in business. It’s much harder to stay in business if you ship something that, quite frankly, sucks, or is too hard to use, or whose value you can’t see. So, for them, it’s really integral.
So, leaping to where the software industry is, you can steal that mentality of working if your team is really there to do the same thing, which is to produce data products that actually will get used and be adopted, and hopefully be seen as indispensable—even if, in the enterprise context, you don’t necessarily have paying customers; the marketing team isn’t, quote, “paying” you a monthly fee like in a SaaS business or something like that. There is still an investment that they’re making, and your team’s credibility and perceived value is obviously going to be wrapped up in whether or not your business partners and the end-users on those business teams are actually relying on you and using the solutions that you’re putting out, right? So, just know that the engineering and the data science and the modeling work, that’s not what anybody wants, and agile was really designed to do software engineering. It wasn’t designed to produce outcomes. And this is where things get a little bit misconstrued sometimes.
And you also can see people basically just chopping up large backlogs of work into increments instead of iterations. So, always remember, the end customer just cares about their perceived reality and whether or not you generated an outcome, an improvement in their life somehow, and the business wants to see that as well. That’s ultimately how they’re going to measure the value of this thing, even if they told you what they wanted and they told you exactly, “We want it this way and we want version one to look like this, and this feature can go into version two.” They may have spelled all that stuff out, but ultimately, if the stuff you ship doesn’t produce that change that they’re looking for, it doesn’t matter. So, I don’t think agile as a methodology really framed that concept particularly well.
So, it’s really good at doing software engineering work. If you’re not necessarily responsible for the business impact and the user impact of the work, it’s probably fine. And I’m not a software engineering leader, so I’m probably not the one to judge that. I’ve just seen that it provides a lot of good structure to teams, and the teams get it and seem to like working that way, at a broad level. But as a data product leader, you have a responsibility that goes beyond getting it technically right. Your job is also to get it business right and to get it user right, as well, okay? So, just a reminder that agile is great for software engineering, but that’s not really what customers want from you. Okay?
So, the fourth one here is to run from anyone that tells you that you shouldn’t ever do any design or user research or UX work upfront because that’s waterfall. And I would probably extend this to something like a data scientist who wants to poke around at the datasets and get an idea of the landscape of data that’s even remotely available before answering a question like, “Could I build a model for this?” And I’m not going to talk too much about that because that’s not my particular area of expertise, but my point is that the full summation of all the work that may be required to deliver really innovative data products probably isn’t going to be fully contained within an agile process. Some people would disagree about this, but that’s my take. So, what we call design sprints—if I’m going to speak from my particular area of expertise—have their place, and there are different ways to approach this, but especially with a greenfield project or product, you usually don’t start out by writing code.
There may be some stuff, wiring, and plumbing that you know you’re going to need to build, but you don’t just start building a house with no plan, right? So, using a boating analogy: it’s hard to drive the boat to the right place if you don’t even have a heading in mind. And determining the heading while you’re moving the ship often means, “Well, we’ll steer the boat wherever the motor we’re building takes us.” But the goal isn’t to just move the ship, right? It’s to get people to their desired destinations.
Again, I don’t want to speak too much for the data scientists about this, but from my understanding of talking to them, they probably understand this as well, and I’ve seen some well-written articles talking about how data science, particularly machine learning work, really should not be fully contained within agile sprinting, at least within the Scrum methodology for sure. Yeah, so I know, it sounds like, oh, we’re back to upfront stuff, and we’re supposed to ship a potentially usable increment of work and all this kind of stuff. All I can tell you is, especially if you’re building a really strategic data product that needs to be useful and usable, where there’s a lot of potential work that could be done and serious technical lifts that may be required, you do not want to spend a lot of time and money building the wrong stuff. And this is why having some idea of the design trajectory, the user interface requirements, the user experience requirements—how someone is going to interface with this tool that we’re building at the end of the day—matters. You don’t want to start totally greenfield by writing code and just taking guesses, because as fast as it’s gotten to develop software, it’s still expensive and time-consuming, and you’re also going to build up debt that you may not want to throw away.
So, there is a place for doing some design work upfront. It will set a trajectory for the team, once we have an idea of how we’ll measure success: what do the workflows look like? What does the user journey look like? What would, maybe, a rough prototype look like? There’s definitely a place for doing that kind of work outside of the sprint context.
I have seen people structure the design sprints as sprints that have some kind of defined criteria behind them. It’s usually a specific amount of time and all of that. I don’t necessarily mind that. Again, though, following the process is not necessarily the point, and the idea that “we have to have a workable increment of code at the end of this, it’s the only way we’re doing agile correctly, and therefore the only way we’re building the product correctly”—I just don’t agree with that. To me, you need to have a heading and direction and strategy in place, even if that’s dynamic and it’s going to change as you get feedback from users along the way. You don’t start with nothing.
I mean, otherwise, what you’re doing is the strategy work while you’re building. And no great strategy is going to come out of that, because you’re going to make decisions based on what’s fast and easy to do, and all the biases that come along with that. So, just don’t listen to this stuff. Do the right amount of planning work upfront, have a strategy in place, make sure the team understands it collectively, and then you can do the engineering piece within the constraints of the agile process, if that’s what works well for your team. So, that’s my take on this waterfall topic, or at least the upfront design, user experience, and research work.
So, number five is that everybody else is also doing modified Scrum, or modified fill-in-the-blank, whatever agile process you’re talking about. The main point here is you’re not particularly unique because you’re doing your unique thing. I’m not saying you’re trying to be unique. I’m just saying that everybody is modifying these processes to some degree to make them work for their team. I think that’s okay as well because, again, agility was really the goal here, not the rigid following of processes.
So, are we delivering stuff quickly, getting feedback from users quickly, and then reacting to that feedback quickly? That’s really the spirit, the point of all of this agility, this agile process. So, it’s okay. Find a way that works for your team and culture here. Again, I’m going to say, if you’re sending project managers off to get their agile badge and come back and bring that into the team, it’s just really important to know that you’re going to have to relax some of the purist approach you may learn, just like you do with anything.
You may have learned, like, here’s the way you really should build a model in school or whatever, and then you get to the real world and it’s a total mess. You know, you’re not even doing much data science work, you’re doing data wrangling work, and data engineering work and the actual modeling part is a small part of the job. And designers talk about this as well. You can learn all these different techniques and all these things you’re supposed to, “Do,” quote-unquote, and then you get into a real business context and you find out, “Oh, I’ve got to, like, advocate for the main parts of my job because some of that stuff we’re not going to have time to do on this version,” et cetera. And you’re doing a modified version of your particular skill set.
So, I think it’s just normal to assume that the team and the culture that you’re in—hopefully, you’re steering that as a leader, having some control and helping to steer that vision—but it’s okay to have a unique process. The point is: is it working well for us, right? Are we delivering iterations and getting them in front of users regularly, and are we adapting to the feedback that we’re getting once we put people in front of this potentially shippable code that we’re releasing out there? That’s what matters, and if your version of it is slightly different than company B’s, or your competitor’s, or what you saw at a conference, by all means, if it works, then roll with that.
I think everyone’s always worried about what everybody else is doing out there, and I’m not sure that really matters. It’s fine to study what the market is doing and to learn, but it’s not about copying Google or Amazon or some other big company you heard about because they do it this way and so they must have it all figured out. There are reasons to copy some stuff, and other reasons it might not work out, based on the culture of your organization and the maturity level it has with doing both design and engineering work, as well as, of course, all the data work. So, use those coaches to keep you on the rails, keep you on the right track, but the goal isn’t to become religious zealots about this, right? Again, the customers don’t care about the process that went into the project or the data product that you’re producing for them. It has nothing to do with the perceived value. So, modify at will. Make it work for your team.
Number six here—and Marty Cagan, who has been on the show, from Silicon Valley Product Group, he’s awesome at this stuff, talks about problem-owning product teams versus feature teams—his point, and I agree with it, is that while the PM may own the backlog—and I’m using product management here when I say PM—while they may own the backlog and priorities, the most important thing is an established set of product leaders who are problem-centered versus tool-, tech-, or solution-centered, right? So, when you’re trying to create a product-driven culture within your organization and maybe you’re going to hire some data product managers, they shouldn’t necessarily own a technology, product, or solution. I don’t think that’s the right way to look at it. I think they should own some problem space, and then they should assemble the right teams, technology, and solutions, and set strategies that will help solve those problems.
And not necessarily, like, “I’m the Hammer Product Manager, and so I just hit things with hammers because that’s what my assigned title was.” That is not probably what you need. And if it is, I’m a little worried because if you’re going to try to have a dedicated PM for every tool that you have, good luck making that scale, especially when user problems span multiple different tools, applications, workflows, the workflows are complicated, they may not sit nicely within one tool. Like, I don’t think you want to be structuring the team that way. And actually, I’m going to share an email with you that I got from a product manager at a SaaS company whose tool I use in my music business when running my ensembles.
And this person contacted me via email—this product manager—and said, “Brian, in case we haven’t met yet, my name is so and so, and I’m the product manager in charge of integrations at company name. I noticed that you did not have any CRM integrations connected to your account. I’m planning now the features to be developed in Q3 by my team and I thought I’d ask you just one question to make sure my ideas are aligned with your needs as possible. What native integrations are you missing? What do you lack integration-wise in our product?”
And so, the key words I want to pull out of this email are, quote, “planning features,” and the sense that this person’s work amounts to, “It sounds like I need to give some work to my engineers to do.” And yes, planning the backlog is something a product manager is going to spend a lot of time doing, but planning features really isn’t the core idea of product management, right? And so, I do love the fact that this person is engaging in direct inquiry with a user—which is me in this case—to inform their work. They’re actually going out into the wild and trying, one-on-one, to understand my needs, and so kudos to them for doing that. But the goal isn’t to fill the team’s plate with stuff to do, right?
That’s not really a customer-driven focus. I mean, the assumption here is that I am missing native integrations, and so, tell me what integrations they should build. Well, it’s like, I don’t know. Maybe I could tell you what integrations you need, but I actually don’t need any, and I actually do have CRM integrations hooked up. I just use Zapier to connect those things. I mentioned this to this person and didn’t even get a reply about that.
But the point is, a conversation about why I would even need integrations in the first place might shed more light than simply asking me which integrations I need, because that supposes that I know that an integration will solve the business problems I have running, booking, and selling my music ensembles. And I don’t know, maybe there are some integrations that would help with that. But it’s the product manager’s job to ask good questions that would tease out the need for integrations—or potentially to find out that more integrations aren’t needed, but instead, “Oh, Brian has this much bigger pain over here, and I’ve heard five people now mention this to me on my calls.” But this person said they’re the product manager in charge of integrations. And that’s okay; maybe they would pass that feedback on and say, “You know what, we talked to ten people, and seven of them actually don’t need any integrations right now, but I’m passing this along to Product Manager B, who’s in charge of, say, onboarding—the onboarding sequence is really bad,” or some problem set that they own that’s a bit different.
And maybe that’s what’s happening here. It doesn’t sound like it, and my only fear is that with this kind of approach, the assumption is, “I’m going to build integrations because that’s built into my title.” Maybe that’s the right thing, but I think that’s too narrow a focus for how to staff your product organization within your enterprise. I don’t want you to have too much tooling or assumed solutions built into people’s titles. It’s like having a machine-learning product manager. Well, that assumes that all the solutions will require machine learning.
Well, maybe they do and maybe they don’t. But maybe it’s about owning a customer problem, like churn: “My job is to reduce churn in the organization. That’s what our team works on”—which sometimes requires machine learning, and sometimes it doesn’t. Something like that is probably more the spirit of what I’m talking about in terms of developing good product-centric methodologies and mentalities within the organization, to get them really focused on customer need and not so much on the solution space.
If you have a large enough staff, maybe you can cut it up that way because the product team is so good that you can have the machine-learning product manager because they know when to pass the work off to someone else and say, “Hey, there’s this great problem space here. It doesn’t require machine learning. It really just needs, like, a dashboard, or some traditional analytics or something like that. There’s really no ML to be done here. It’s unnecessary, it’s expensive, and just there’s no reason to do it that way. Pass it on.” Fine.
That takes a pretty mature product person, though. And so, if you don’t have really good product people that have done product before, I would be really careful taking that kind of approach, at least initially, here. So, that’s number six.
Number seven: before agile can thrive, you’re going to need strong senior leadership buy-in if you’re going to do outcome-driven data product work. So, why does this matter? Why am I saying this? Well, because you’re going to be dependent on IT, your customers, your stakeholders. If you’re going to try to really build human-centered data products that actually get used, you’re going to have to go talk to them, and you’re going to need access to them.
And if you don’t, there’s a good chance you’re going to strike out. And the same thing with IT. I’ve been doing interviews with VPs of data science and technical product management, and I was asking them about what some of their biggest challenges are. And the tight dependency on IT, particularly in the data science space, is a huge challenge. It’s, “They work really slowly. They’re very risk-averse.”
And so, even if you can say, “Well, our data team is doing product,” quote-unquote, product is also a lot about forming the relationships necessary to get good data products done and all the dependencies that also go into that, right? So, you need the senior leadership buy-in because you need to create an environment that’s going to be suitable to allowing users and stakeholders and makers to all work together. Sometimes that requires some investment and interaction from the top to say we’re going to be doing things a new way here. We need to make time for these kinds of things. Like, the priorities may need to change.
Maybe the culture needs to start moving more towards a yes culture instead of a no culture. This is, I think, a problem in very large organizations that are really risk-averse: no is a much easier default answer. Let’s not take any possible risks here, the CYA mentality—cover-your-ass mentality—and all that kind of stuff. This is why you need some senior leadership buy-in, I think, if you’re going to try to do this. And you can start small; I think that’s okay. But you’re probably going to find out about these dependencies if you don’t fully control the entire technology stack and all the access to the users.
If you can really run a team that has full access to what it needs and isn’t gated by any external blockers, that’s awesome. Go for it. I just don’t think that’s going to be the case in a lot of places, so you’re going to need to get stakeholders on board. Agile isn’t going to solve politics, communication issues, or the aversion to risk that a lot of organizations have. And honestly, if you’re new to doing this, part of innovation means you’re learning something. By definition, you’re not innovating if you’re simply repeating proven tactics and methods that worked yesterday and will probably work today. That’s not innovation; that’s just repeating yesterday.
And sometimes that’s fine, but if you’re going to be doing new work and solving new problems with new methodologies, you’re going to be trying new things that may or may not work out. That learning is very important to do, but there needs to be a culture in place to support it and to understand that some of this stuff isn’t going to go right. Maybe we’re going to blow some stuff up; maybe we’re going to break some things; maybe we’re not going to ship anything useful at the end because we have no idea how long it takes to do anything, because we’ve never done any estimates before and we’re terrible at estimating, which, guess what? Everybody is when they first get started. So, building in the fact that this stuff isn’t necessarily going to go smoothly is important. It’s important to set those expectations with the most senior leaders in the organization if you’re going to go down this transformation path.
I also think that gets into the funding thing, right? It’s different to fund projects versus funding a problem space that’s probably latent and ongoing, which could be something like, again, customer churn, or spending our marketing dollars efficiently. That never really goes away. You’re probably never going to get to the point where, “Oh, we’ve now done that. There’s no way for us to be more efficient with our marketing spend.”
It’s too dynamic a problem, and so funding the problem space at the top level is probably going to be really important if you’re taking this product mentality, because the product mentality is one where there really isn’t a defined endpoint. You’re working and living on a continuum. Your product, the results of your work, the interface, the applications, the reports, whatever it may be: they’re never done. They’re just on a continuum that’s hopefully constantly improving over time. And of course, someone does need to set those priorities, and it doesn’t mean you’re going to have endless amounts of money all the time. But there is a mind shift here: instead of putting this many dollars on this project, you’re putting money on the problem, with the assumption that any one project isn’t necessarily going to solve it.
That’s where I think teams have seen problems before because what usually happens is they end up delivering the features and the implementation detail that was asked of them by the requester, the project sponsor, and the assumption is if we just give them what they asked for then we’ll get a gold star on our report card and everybody will be happy. And unfortunately, what we see is that a lot of times the stuff doesn’t get used, it doesn’t create an impact, and it definitely doesn’t create business value if it’s not getting used, right? So, the senior leadership buy-in is going to be really important if you’re adopting both agile and this outcome-driven data product work that’s not project-oriented. Okay?
So, number eight, there’s a huge promise wrapped up in the word agile, so you’ve been warned. This kind of ties into what I was saying earlier about getting that leadership buy-in, but just know, from a marketing standpoint, if your senior leadership has heard about agile, it sounds great. It sounds like, finally, everything we always wanted to make, but now it’s fast. Now we get it faster, and it’s agile.
What’s the switching cost of doing that? What’s the risk of doing that? What’s involved with doing that stuff well? This whole idea of shipping, quote, “Half-baked work,” or, “Half a product?” Well, what is even a product when we’re talking about internal tools driven by data science and analytics? What are we even talking about?
A lot of organizations don’t know. They haven’t actually faced those questions, and they’re definitely not used to shipping stuff before it’s done. So, there’s a lot of cultural change that may need to happen. Just know that the sales promise of agile sounds really good, so I think it’s important, if you’re doing this agile product transformation thing within your organization, that you’re setting expectations: what are we actually going to get? What’s the learning going to look like? What’s our fumbling through it going to look like?
I would want to set those expectations appropriately so the business knows what to expect, because it’s a journey. And part of agile is having the retros and getting better at doing agile itself, getting better at estimating. The assumption is you’re not going to be very good at that stuff from the get-go, so it’s kind of nice that mature agile has some really great practices built in for learning how to get better at agile itself, not just at building the product. So, make sure you’re communicating that properly and setting those expectations.
Number nine, if you don’t have a plan for how you’ll do discovery work, defining clear problem sets and success metrics, and understanding customers’ feelings, pains, needs, wants, and all of that, then agile will not deliver much improvement for your data products, most likely. And I’ll say, you might get lucky; I’ve talked about this before. Sure, you might get lucky a few times. But if you’re a leader and you have a team and you want to routinely put out good, solid work, you don’t want to rely on getting lucky once in a while; you want to rely on a repeatable process, something you can teach to new people when they come in. You need a recurring methodology for building really great stuff. So, I think this is important if you’re going to introduce agile, particularly because it began as a software engineering methodology. If you listen to this show, you probably do care about user experience, and you understand that the last mile has a lot to do with the business value. Because if people aren’t willing to use it, or can’t use it, there won’t be any business value.
So, this is a really good time to also discuss with your team: how are we going to design the user experiences around here going forward? When will we do the customer research, and how? When will we do this mushy problem-finding work, the synthesis work, the trying to figure out what’s actually needed so that we’re not just responding to what people ask us to make for them? This is all the messy part of design work.
And it’s supposed to be messy and somewhat undefined and somewhat gray. This is totally normal, but I think it’s good to figure out: how is all of that going to fit in? How are we going to make all these different processes and streams of work dance together well? Yes, you can iterate—and agile talks a lot about iterations, right—but as soon as you go down a path and do the second iteration, your debt builds. And a lot of teams aren’t mature enough to toss things out that have already had, you know, n number of sprints or investments put into them, because it feels like the ability to change is just around the corner again. We can just completely change course again.
And of course, there’s some real costs associated with doing dramatic changes there. And so, part of the idea here was, well, you’re shipping smaller amounts of work sooner so you don’t have as big of a pivot to make here. But what I see in reality, though, is that a lot of teams are good at doing incremental data product design, but not so good at doing iterative data product design. And ‘incremental’ and ‘iterative’ are not the same thing, right? Incremental is like we’re building a car and we’re starting with the wheels, then we’ll build the radio, then we’ll build the air conditioning, then we’ll build the hood.
And so far, you still don’t have a method for transporting human beings from point A to point B in a safe manner. If our definition of the desired outcome is a method for more rapid transport of human beings from point A to point B, that incremental method is not a solution; you still have not produced anything viable for the user. The iterative approach would be: let’s get the most basic, bare-bones vehicle. There’s no roof yet, but we’ve got a seat, a drivetrain, and some wheels. If the passengers get rained on, they’re going to get wet, but they can at least get from point A to point B. It only goes five miles an hour, but we do have an iteration that is potentially useful to users.
I’m using this analogy because a lot of teams just take the waterfall approach: here’s this giant thing we’re going to make, here’s the model at 40% accuracy, and here’s the plan to get it to 90% accuracy. It’s really a feature-driven methodology. They’re just slowly adding on more: take a little bit, add more, then add more again. And when we get bad feedback, maybe we tweak it a little, but we rarely erase. We rarely remove entirely. We rarely go back to the drawing board when we find out we’re really not hitting the nail on the head and need to start over.
That takes a very mature organization with good leadership that’s ready to say, “You know what? It’s okay to throw stuff out that’s not working, let’s catch it soon.” I don’t think a lot of places work like that. They don’t want to hear that it’s three months behind and that it’s not really on track and everyone goes into cover their ass mode and all this kind of stuff. That’s incremental work, and that’s agile done in the way it really wasn’t ever intended to be done.
And a lot of teams forget that even though agile says you’re supposed to have user feedback built into all the loops, a lot of places don’t. You know, they assign a quote, “Designer,” or they assign someone to be in charge of representing the user, without actually going out and getting the stuff in front of actual users. And so, it becomes checking the box about following the process. And again, as you know from this episode, following the process is not the goal. Leaders are not here simply to follow processes, right?
So, watch out when it’s like, “Oh, the dashboard has two charts, and then we’re going to add a third chart, and then a fourth, and eventually we’ll have all 12 KPIs on it.” Well, maybe. Maybe that sometimes makes sense. But that’s the kind of thing that sets off a red flag for me, because it says the team just assumes that by adding more, we’re increasing value, utility, and usability, and that’s not always how it works. Okay?
So, getting to the end here, number ten. The final one is that getting comfortable with shipping half-right, half-quality, half-done work is hard. I think I mentioned this earlier when we talked about getting ready for change and setting expectations with senior leaders, but it’s worth calling out as its own item on the list. One thing I’ve gotten very comfortable with in my own business is shipping before things are perfect. Take even this episode: I wrote this email to my list last night, and now I’m doing the podcast episode about it because I wanted to go into a little more detail here.
Is it perfect? No. Is it fully researched? No. It’s my opinions, my experience, what I’ve seen happen in the wild. Does it have mistakes in it? Sure. Are there typos? Probably. It’s not fully baked; it’s not perfect. But if getting it out and trying to get feedback helps you learn something you can apply to today’s work, and maybe prevents you from making a mistake someone else has already made, then I feel like I’ve done a good job here.
And getting comfortable with shipping before it’s perfect is really hard, especially if you have perfectionist tendencies, and I’ll admit that I do. I’ve had to get really comfortable with that in my writing and my podcasting, and I think it’s okay. So, I love it when a design team shows me really rough sketches where the interface looks wonky, but I can tell there’s a clear strategy and they really understand the customer problems. They were willing to cut some corners on aesthetics, or maybe the interactions aren’t super-smooth. Or the data team cut some additional insights out of the dashboard and pared it down really simple, but they had a reason and a strategy for doing that. It doesn’t necessarily look great, but those are all signs that you’re doing it right, actually.
It’s really easy to just look at a super-polished thing at the end and talk about it there, but the messy truth is that, again, we’re on this continuum, and along the way we’re shipping incomplete stuff. We’re getting incomplete stuff out there so that we don’t spend more time building the wrong thing. The goal is really getting that real user feedback, because ultimately, and we’ve all seen it before, you can hear about stuff theoretically, and you can even do your user research and hear about it, but until people actually get their hands on the thing, the feedback may not be entirely informed. And remember, perception is reality: the reality of the quality of the data product is defined by the experience people have with it. And we won’t know what that is until we get people in front of it.
So, my point is, get them in front of it as soon as possible, and make sure you’re solving at the problem level and not at the feature-development level. If you’re organizing your sprints and your backlog and your work around this idea of problem sets with defined success criteria, and not around features, you’re probably on the right track to doing a really great job and having a great enterprise data product leadership organization or team on your side. So, that’s my take today on this whole topic of agile and data product leadership.
I’m also, as you probably heard on the show, starting a data product leadership community. You can check that out on my website: just go to designingforanalytics.com and you’ll see a link to the community up there in the nav bar. There’s an announcement list on there, so for now, you can just stick your email in there.
There’s nothing to buy right now or anything like that; I’m still putting this together. But I’m trying to create a place both for people in the software industry and for people in enterprise data science and analytics who are adopting this product methodology for building data products. I think these teams can learn a lot from each other. And whether you come out of the data science field or the engineering field or the design and user experience field, if you have that title and that responsibility, I think it’s great that we’ll have some different domains sharing knowledge with each other. Because ultimately, product is really about this dance between what I’ve called the Power Trio: design, technology (which usually meant engineering), and product management.
This was the Power Trio. And with data products, I think there’s usually a fourth leg of the stool, which is the data science or analytics role. So, join the announcement list, and I’ll keep you posted when the community goes live. Until then, you can also hop on the mailing list if you want to get articles like this in written form instead of waiting for them on the podcast; that’s over at designingforanalytics.com/list.
And until next time, talk to you later, and keep putting out stuff that gets used, that’s useful, that’s valuable, and that’s focused on that last mile. Ultimately, that’s what the game is about: what happens in the last mile when customers use these data products we’re creating. That’s where business value happens. So, focus on that last mile and keep plugging away. Thanks.