024 – How Empathy Can Reveal a 60%-Accurate Data Science Solution is a Solid Customer Win with David Stephenson, Ph.D.

Experiencing Data with Brian O'Neill (Designing for Analytics)

David Stephenson, Ph.D., is the author of Big Data Demystified, a guide for executives that explores the transformative nature of big data and data analytics. He’s also a data strategy consultant and professor at the University of Amsterdam. In a previous life, David worked in various data science roles at companies like Adidas, Coolblue, and eBay.

Join David and me as we discuss what makes data science projects succeed and explore:

  • The non-technical issues that lead to ineffective data science and analytics projects
  • The specific type of communication that is critical to the success of data science and analytics initiatives (and how working in isolation from your stakeholder or business sponsor creates risk)
  • The power of showing value early, starting small/lean, and one way David applies agile to data science projects
  • The problems that emerge when data scientists only want to do “interesting data science”
  • How design thinking can help data scientists and analytics practitioners make their work resonate with stakeholders who are not “data people”
  • How David now relies on design thinking heavily, and what it taught him about making “cool” prototypes nobody cared about
  • What it’s like to work on a project without understanding who’s sponsoring it

Resources and Links

DSI Analytics Website

Connect with David on LinkedIn

David's book: Big Data Demystified 

On Twitter: @Stephenson_Data

Quotes from Today's Episode

“You see a lot of solutions being developed very well, which were not designed to meet the actual challenge that the industry is facing.” — David

“You just have that whole wasted effort because there wasn't enough communication at inception.” — David

“I think that companies are really embracing agile, especially in the last few years. They're really recognizing the value of it from a software perspective. But it's really challenging from the analytics perspective—partly because data science and analytics don't fit into the scrum model very well, for a variety of reasons.” — David

“That for me was a real learning point—to understand the hardest thing is not necessarily the most important thing.” — David

“If you're working with marketing people, an 80% solution is fine. If you're working with finance, they really need exact numbers. You have to understand what your target audience needs in terms of precision.” — David

“I feel sometimes that when we talk about ‘the business,’ people don't understand that the business is a collection of people—just like a government is a collection of real humans doing jobs, and they have goals and needs and selfish interests. So there's really a collection of end customers and the person that's paying for the solution.” — Brian

“I think it's always important—whether you're a consultant or you're internal—to really understand who's going to be evaluating the value creation.” — Brian

“You’ve got to keep those lines of communication open and make sure they're seeing the work you're doing and evaluating it and giving feedback on it. ‘Throw it over the wall’ is a very high-risk model.” — Brian


Brian: David Stephenson is the author of Big Data Demystified, and he’s a consulting data strategist primarily working out of Europe. I’m excited to have my chat with David on the show here today because I think David has a really good way of connecting a highly technical background with the business, acting like a bridge between those two worlds.

Brian:    All right. David, are you on the line?

David:  Yes, I'm here.

Brian:    I'm super excited to talk to you. So we have David Stephenson, the consultant and author of Big Data Demystified. We met somehow, I think it was on LinkedIn originally. And you also chair the UK edition of the Predictive Analytics World conference, correct?

David:  Yeah, that's right.

Brian:    Awesome. I gave you an intro there, but it's always better from the horse's mouth. So tell us a little bit about what you're up to.

David:  Yeah, sure. So I used to run global business analytics for the eBay Classifieds Group and then started off on my own a couple of years ago. I've been doing consulting, mostly around Europe, in data strategy and data science program development. And then recently I published a book on the topic with Financial Times Press called Big Data Demystified. I've been doing a lot of work between speaking, teaching, training, and consulting with companies, really focused on developing how they use data and analytics and helping them reach the next steps in that. So it's been keeping me quite busy recently. I really enjoy it.

Brian:    Nice. And how did you get involved with the... I don't know, do you guys say PA or predictive analytics?

David:  Yeah, yeah. Predictive analytics world. Yep.

Brian:    How did you get involved in that?

David:  Yeah, I'd been speaking at quite a number of conferences, and the PA conferences were some that I was speaking at. I'd spoken with the organizing company, Rising Media. I'd spoken at three of their conferences, and at some point they contacted me and said, "You know, we've enjoyed having you speak with us all this time and we're looking for a new conference chair for the London Predictive Analytics World. Are you interested?" And after so many years of going to conferences and thinking, what would I do better? What would I do better? I thought, this is finally my chance to do everything that I've wanted to do to make conferences better. So I said yes, and I've been doing it now, going on my second year.

Brian:    Nice. Even though we're probably not going to go way deep into PA, what did you want to make better, or what have you made better? What was wrong that you wanted to change?

David:  I was frustrated with the combination of the choice of who would be on stage.

Brian:    Mmhmm.

David:  So there's a lot of emphasis on just, let's get a well-known company or someone who has the name recognition, without really focusing on quality content. For me as a practitioner, I really wanted to make sure: look, we're going to have two days, so let's get speakers up there who are really going to give valuable content. Because I was frustrated; I'd go to so many conferences where, out of 15 talks, there'd be one or two that I thought were valuable. And I thought, okay, I'm going to get a conference going where almost all the talks are really valuable.

Brian:    Mmhmm.

David:  So that was for me the big opportunity.

Brian:    Make it the conference you want to go to kind of thing.

David:  Exactly, yeah, exactly.

Brian:    Be the change you want to see in the world as they say.

David:  Yeah, no it was a good opportunity. It's been a ton of work, but um yeah I've enjoyed it.

Brian:    Excellent. Excellent. Cool. So jumping in, I have a ton of questions for you; I hope we can fit them all in. You came recommended to me from a previous guest and friend, and I was excited to talk to you. I also noticed that in your work you practice design thinking, so I want to jump into that in a little bit. The first thing I wanted to ask you, though: there's a lot of technical knowledge required to do data science and analytics effectively. Can you talk to us a little bit about the nontechnical knowledge that's required to make solutions that are obvious and usable, where useful value is created, that kind of thing? What's missing here? There's a high failure rate in this industry for projects. Talk to us about these nontechnical skills that are also required.

David:  Yeah, that's a great question, and I think that's where we're having a lot of trouble when we look at the industry, right? The stakeholder management is really difficult because, as a technical expert, you're really focused on how to do these very technical things, whether it's an analytic model or some type of software design or software implementation. But so much of what we do in a business environment, in an enterprise environment, is understanding the stakeholders, understanding what the real challenge is, and then also communicating with them throughout the process. Especially at the beginning, with this design element, where you're really understanding what's happening and you're starting to produce the right solution. And you see a lot of solutions being developed very well, which were not designed to meet the actual challenge that the industry is facing. So yeah, that's a huge thing.

Brian:    You don't have to state any names or anything, but can you give an example? You said something was being designed well, which I assume means technically it was being designed properly, but it wasn't communicating its value. Its value wasn't inherently obvious to the consumer. Have you had a situation where you've come in and it's like, whoa? And what would that before and after look like?

David:  Yeah, there's quite a bit of that. For example, I had one situation where the customer said, "Look, we need a certain solution to advise on how to change a product. We have limited resources for enhancing a product in different areas. Can you advise-"

Brian:    A digital product?

David:  No. No, actually um a physical product.

Brian:    Okay.

David:  Um and so they said, "Look, we've got limited resources for enhancing these." And then you know you'd start to work on this or you know third parties would start to work on this challenge and then they'd deliver it maybe two months later and say, "Here's our recommendation of what to deliver." And then you'd speak with the person who would actually implement that solution and they'd say, "Oh, I didn't mention to you that you know here's an additional limitation on which products we actually physically can improve." And you just have that whole wasted effort because there wasn't enough communication at inception.

Brian:    Throwing it over the wall kind of model.

David:  Yeah, yeah.

Brian:    Like let us know when it's done.

David:  Right, right.

Brian:    Yeah.

David:  And then they end up delivering something which wasn't practically usable.

Brian:    Got it. So how do you approach this? You hear this repeated problem, where a lot of places want to jump into using advanced techniques, or the business stakeholder might request machine learning or AI. And maybe there is an opportunity to actually use that tool for a good reason, but then it's like, "Oh wait, we don't have the data infrastructure in place to do any of that." So level one is creating the pipelines and all the technical stuff that needs to be built out. I know design, so this is preloaded with the fact that you know something about design thinking and that you probably know about moving in small increments. So how do you show value soon, maybe working in a small increment where progress can be seen, when there's a large technical requirement that needs to be in place before any of the quote-unquote useful, cool stuff can be delivered? How do you do that?

David:  Yeah, that's great. And it's super important to show value soon. I mention this in one of the chapters in my book, and there are a couple of reasons for it. One is that the more often you cycle back to your stakeholder, the more closely aligned you are with their needs. But also, if they haven't communicated clearly to you and you start to give them intermediate results, that gives them a chance to refine what they want, to clarify it, and to keep thinking with you, right? So you really want to get back to them as quickly as possible with small results. And there are different ways to do that. Typically, you take a small sample of the data or a small subsection of the challenge.

David:  For example, instead of covering the whole world for a company, all the different markets, you say, "Okay, I'll take this one small market, this one European country and this one product. Two weeks from now I'll give you an estimate of what your solution would look like in this one limited area." Then you do that, you get some iterations, and then you say, "Okay, we'll go from the proof of concept to a pilot stage," where you say, "I'll give you the full solution just for this market, but it's the full solution." And then you iterate into production for the full range. This iteration, going from proof of concept to pilot to full deployment to automation, is a way to step through that process.

Brian:    Yeah. This kind of reminds me of, I think, the 37signals Jason Fried quote: build half a product, not a half-assed product.

David:  Yeah.

Brian:    And also this concept of agile, right? I saw this really well illustrated one time: it was a picture of a car from an agile perspective, and stage one of the car was like the Flintstones car, right?

David:  Mmhmm.

Brian:    And then stage two was the automobile we picture in our head. The non-agile version starts as a car with no front wheels. Like it's

David:  Many axles, no front wheels.

Brian:    Wheels... but it's all polished. It looks like a modern automobile, but it's not a working car. There's no value there, right? You can't actually drive it and transport yourself somewhere. So anyhow, that kind of reminds me of what you're saying there, showing some value on a small scale.

David:  No and I think that companies are really embracing agile, especially in the last few years, right?

Brian:    Mmhmm.

David:  They're really recognizing the value of it from a software perspective.

Brian:    Mmhmm.

David:  But it's really challenging from the analytics perspective

Brian:    Mmhmm.

David:  For a couple of reasons. Partly because data science and analytics don't fit into the scrum model very well, for a variety of reasons.

Brian:    Mmhmm.

David:  And the other thing is, uh people who really love this field, they want to do the cool work, you know they want to do deep learning, they want to do advanced models.

Brian:    Right.

David:  And when you tell them like, "Hey, just give me you know a really super simple data driven model that you know meets 60% of the solution." Um a lot of these guys who are just focused on machine learning, that's not what they want to do. They want to jump straight to the cool stuff.

Brian:    Mmhmm.

David:  So I find in the projects that I do at the beginning, a lot of it is just kind of talking people down and saying, "Look, let's start simple. Let's not boil the ocean at once. Let's not start with the most advanced models." For some people that takes a bit of convincing.

Brian:    Is that cultural at all? Is it tied to the amount of academic background they have? Do you see a pattern that goes with that, or is it just a seniority thing? I hear it tends to be more junior-level people, closer to being out of school. Do you see a pattern or trend there?

David:  Yeah, no that's a good question too. There's two things. Um sadly, part of it is simply that uh a number of people I've talked to would say, "I only want to do interesting work."

Brian:    Mmhmm.

David:  You know their goal is not to first and foremost bring business value,

Brian:    Right.

David:  But they want to do work that's interesting for them. And for some people that's simply it, and as a manager, as a leader, you have to get people in place who are aligned with your business needs.

Brian:    Mmhmm.

David:  Because not everyone with the skills is. You know? And some of them will be very forthright about it. They'll say, "Look, I'm not interested in that work because it's not interesting." And the other thing is seniority. I think the longer people are around, the more value they recognize. Two things. One is from the basic models: there are so many projects that'll work fairly well with just a simple regression or Bayesian model. And the other thing is just taking the time to say, "Hey, I'm going to do the simple iteration and not spend three months developing something before I deliver it."
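David's point about starting with a simple regression can be sketched in a few lines. This is a minimal illustration with a synthetic dataset; the features, the target, and the use of scikit-learn are assumptions for the example, not anything from the episode:

```python
# Sketch: fit a plain linear regression as the "start simple" baseline
# before reaching for deep learning. Data here is synthetic/illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                       # three made-up features
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
baseline = LinearRegression().fit(X_train, y_train)

# Held-out R^2: if a model this simple already explains most of the
# variance, it sets the bar any "cooler" model has to beat.
print(f"baseline R^2: {baseline.score(X_test, y_test):.2f}")
```

The baseline also gives the stakeholder something to react to in days rather than months, which is the iteration David is arguing for.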

Brian:    Yeah, I can understand that. So I'm curious, when that kind of problem happens, do you feel the solution is to train and assist those technical people in learning how to broaden their skill sets, so they realize value is not entirely driven by the technology portion? Or do you really just need to bring in a different role entirely, or meet in the middle somewhere? How do you address that?

David:  I think at a leadership level you have to find the leaders who already understand that. And then the people who are more junior, those are the people who need to learn it. I'll just give you a basic example. In years past I worked in quantitative finance, right? So we would value these financial products like interest rate swaps and foreign exchange swaps and such.

David:  And I remember valuing some of these, and there would be two parts to it. One was the foreign exchange and one was the interest rate, for a cross-currency swap. One part was super hard to do and the other part was really easy to do. And I'd get stuck on the really hard part; I'd be focusing all my time on that. At some point my manager, who was an MBA and not super technical, came to me and said, "Look David, the simple part, that's 98% of the risk, and the complex part you're focusing on, that's only 2% of the risk. So it doesn't matter if there's a huge error with that part. Just get the simple part right and we'll essentially get most of the risk quantified."
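The back-of-the-envelope arithmetic behind that advice can be made explicit. The 50% misestimate below is an assumed figure for illustration; the 98/2 split comes from David's anecdote:

```python
# Sketch: how much does a big error on the hard-to-value leg actually
# move the total risk estimate? Numbers are illustrative.
simple_leg_share = 0.98   # fraction of total risk in the easy-to-value leg
hard_leg_share = 0.02     # fraction of total risk in the hard-to-value leg

error_on_hard_leg = 0.50  # suppose we misestimate the hard leg by 50%
total_error = hard_leg_share * error_on_hard_leg

# A 50% error on the 2% leg shifts the total estimate by only 1%.
print(f"error on the total risk estimate: {total_error:.1%}")
```

Which is the manager's point: nail the simple 98% first, because even a sloppy answer on the hard 2% barely changes the bottom line.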

Brian:    Right.

David:  So that for me was a real learning point.

Brian:    Mmhmm.

David:  To say, okay, understand that the hardest thing is not necessarily the most important thing.

Brian:    Right. Not to mention most places are happy with a 98% test score.

David:  Exactly. In this area, right, right.

Brian:    We'll take 75 actually.

David:  No, and that's the thing, and it's important to know your stakeholder too. Because if you're working with marketing people, an 80% solution is fine. If you're working with finance, these are the guys who really need exact numbers. So you really have to understand what your target audience needs in terms of precision.

Brian:    Yeah. Yeah. Tell us a little bit then about, uh so I brought this up earlier, the concept of design thinking and how it can apply to you know data science and analytics work. How are you applying that in your in your consulting work?

David:  Yeah, this is something I'm really focusing more and more on now, because you'll see this... Gartner has a nice visualization where they show design thinking iterating into the prototype phase and then iterating into the production phase. And the idea is that we all sort of know a proof of concept is important, the prototype, before we go into production. We've all recognized that. But what we're really missing a lot is that initial design stage, where you're saying, "Look, what is my real business challenge, and what's the solution that's really going to address that?" And taking time to get the right people in the room and the right processes in place, so that you're starting down the right path even before you start to build a prototype. Because just from experience, I've seen this, where I was all proud of myself a few years back because I made this awesome prototype

Brian:    Mmhmm.

David:  You know of something very quickly, very powerfully.

David:  And I thought, "Oh, this is great. I've done such a great job with this." And then by the time we went to deploy it, a few months later, they shut it down because it wasn't meeting the need of the company. And I thought to myself: the CTO had said, "Let's build this," and so I was like, sure, let's build it, it looks cool. If I'd instead stopped at that point, rather than just going with what the company asked me to do, and said, "Look, let's first go through the design process and figure out what we really should be building before jumping straight into the prototype," we would have saved a lot of time and effort.

Brian:    Yeah. I fully agree with this process. One thing: have you ever tried prototyping something out that doesn't even use data? For example, if you're building a predictive model for something (I'm particularly thinking about solutions where there's going to be some type of visible user interface), actually using mockups of the design to tease out whether the intended end user might use it. How are they going to react if they see a number they don't expect, or unbelievable information? Then you can figure out how you might need to present it in a way they'll actually believe if the real data ends up generating those kinds of results, and you can plan for that contingency ahead of time instead of ending up with the head scratch or the unbelievable reaction. Have you worked that approach before?

David:  So that I haven't done in terms of you know testing the end user's response.

Brian:    Uh huh.

David:  There's definitely this method, you know the smoke testing where you put a feature out that doesn't really work just to

Brian:    Right.

David:  To see what kind of leverage it generates and such. And we've done other work where instead of using a model, we used user responses to gather information

Brian:    Mmhmm.

David:  In various ways. There are several ways we've done that. But what you're describing now, I can't think of a case where I've done that particular application.

Brian:    So when you talk about needs and empathizing with the customer, can you talk about the end customer versus the business sponsor? I feel sometimes that when we talk about "the business," people don't understand that the business is a collection of people. Just like a government

David:  Mmhmm.

Brian:    Is a collection of real humans doing jobs and work, and they have goals and needs and selfish interests and all these kinds of things. So there's really a collection of end customers and the person that's paying for the project.

Brian:    Do you integrate that in your work, and how do you think about it when we think about empathy, which is early upstream in the design thinking process?

David:  Yeah, that's a good question. As a consultant, it's always a bit tricky. I once did a project where I misunderstood who my actual budget sponsor was.

Brian:    Mmhmm.

David:  And I found that out a little bit too late. So I spent all my time tailoring to the needs of one person, and then several weeks or months later another person informed me, "I'm actually your budget sponsor here."

Brian:    And did they have different needs?

David:  Well but also it's... Yes, slightly different needs, but also it's very important that you maintain close communication lines.

Brian:    Sure.

David:  Right? So I find, typically, when I'm on a project I'll have different stakeholders than necessarily the budget holder. And typically I'll work with the budget holder to clarify expectations.

Brian:    Mmhmm.

David:  You know, where should my priorities be. But the stakeholder for a specific project is typically the person whose requirements I'm aiming to make sure I satisfy. Because what we normally establish is: look, if this stakeholder is happy with the results, then the budget holder will also be happy.

David:  Uh the difficulty happens when there's a conflict of interest there.

Brian:    Right.

David:  You know and one person is clear about wanting one thing and the other person has a different view on it. That doesn't happen so often, but from time to time it does.

Brian:    I've been in that situation before, where engineering is paying your nut and you're working with product management or some other department, and the lines get blurry. But I think it's always important, whether you're a consultant or you're internal, to really understand who's going to be evaluating the value creation here, and understanding what they're looking for is definitely important.

David:  Absolutely.

Brian:    Or else your chance of failure is high.

David:  Yeah absolutely.

Brian:    And it shouldn't be silent; be suspicious of silence. When there's not a lot of communication, that's always a risk for me. I just literally checked in with a client today, because I'm not working with the main stakeholder, I'm working with her team, and everything was fine. But when you're not hearing questions and stuff, you've got to keep those lines of communication open and make sure they're seeing the work you're doing and evaluating it and giving feedback on it. This "throw it over the wall" thing is a very high-risk model.

David:  No, it's totally true. One of the things I do is train junior data scientists; I do that on a regular basis. And one of the things I tell all the classes is, "Make sure you put in a recurring appointment with the project owner,

Brian:    Mmhmm.

David:  The sponsor um and if it gets canceled, you know reschedule it,"

Brian:    Yeah.

David:  Because you can't afford to lose regular contact with a sponsor.

Brian:    Mmhmm, Mmhmm. We talked a little bit about this when we originally got on the phone: what's going on in the industry with selling products and platforms versus focusing on outcomes and results. The industry helps us build things, but things don't always turn into value. So do you have some advice on how to approach this? And I'm not saying that all products and platforms are bad. I mean, without some of this new technology, none of this would be possible. At the same time, in the conference halls, a lot of people I've talked to feel like the large platform providers are making out like bandits, and yet every year Gartner is like, "Oh, 86% of projects will fail this year." And it's like, "Hmm, someone's making out really well here. What is going on?" Like-

David:  Yeah. I think part of the issue is that it's this thing where you can delegate authority but not responsibility. It's really tempting just to say, "Okay, I'm going to pass off this challenge to a product. I don't really understand how to address this challenge, so I'm going to buy a product which says it's going to meet the need." And then later... And you don't even know how to evaluate it, because you didn't know how to solve the problem in the first place.

Brian:    Mmhmm.

David:  So I think that's definitely dangerous. And you see different tools. I don't want to name any, but there are definitely tools out there selling sort of the AI solutions.

David:  And when you back-test them, you're like, well, what is this really providing me? Perhaps it's a terrible model.

Brian:    Mmhmm.

David:  And maybe the vendor even knows it's not a good model, but it's their model, they're selling it, and we're buying it. That being said, if you understand the challenge and you know what you're buying, then it is a way to move very quickly.

Brian:    Right.

David:  You know, there are certain tools which will let you automate repetitive work, and that's a big payoff, right? If you know what you're getting. But there's definitely a lot of snake oil being sold in the market now, and I see that come up also.

Brian:    So let's jump over to skill gaps. You were talking a little bit about some of the training that you offer. If you're offering training, it generally implies that there's something missing. What's missing that needs to be filled in?

David:  Yeah. There are a couple of things. One is that there's a tremendous number of people, at least over in Europe, who haven't been really trained solidly in analytic skills,

Brian:    Mmhmm.

David:  But they sort of go through bootcamp and then crossover into this analytics slash data science space, right?

Brian:    Mmhmm.

David:  So these guys are missing a lot of fundamental skills in terms of mathematics, statistics and computer programming, right?

Brian:    Mmhmm.

David:  So that's the basics. I don't do a lot of training in that myself

Brian:    Uh huh.

David:  Just because there are so many people who can do that. What I focus on more is the business skills: once these guys are coming out of a highly technical program or bootcamp, or straight out of university, how can they place themselves within an organization and be effective?

Brian:    Mmhmm.

David:  And there are several aspects of that. One is understanding the larger enterprise around them: how different people think, how the nontechnical people operate, what's important to them. So that sort of empathy, and understanding your place in the company.

Brian:    Mmhmm.

David:  And then the other thing is communicating. These guys tend to have a lot of challenges communicating with the people around them, both orally and visually.

Brian:    Mmhmm.

David:  And part of that is because they're used to functioning within an analytic ecosphere. You know, communicating with other people like them

Brian:    Mmhmm.

David:  You know, in their programs and such. And part of my training is really to help them understand: look, when you're doing emails, presentations, PowerPoints, or charts and graphs for people in outside departments, what is a way to communicate very effectively?

Brian:    Are there any particular repeated things that come to mind in terms of guidance that you give? The five bullets or something that you see as a repeating theme around communication, particularly the visual but also the written?

David:  Yeah, there's just a lot of different stuff I cover, to be honest.

Brian:    Mmhmm.

David:  I mean a lot of it just boils down to understanding the perspective of the people around you.

Brian:    Mmhmm.

David:  But that encompasses a tremendous amount of different training materials that I have.

Brian:    Right, right. When you have team members coming from that background, and the solution you're working on has a software component, particularly some type of visual component, what is their role, and how do you get them up to speed or work with them if they're going to be involved in that solution that's going to go out? Is it just take a guess and try? If you're starting out from a pure data science or stats/math background, how do you get to the point where you can deliver a solution that may require a software interface? What's involved in that process?

David:  Yeah, so it depends on how it's being deployed. If you're looking at something that's going to be deployed as part of a production stack,

Brian:    Mmhmm.

David:  That's a long road to travel if you don't have the software background, right?

Brian:    Mmhmm.

David:  Because then you have to know all this stuff about you know the testing, the unit testing, um

Brian:    Right.

David:  Regression testing, um sort of everything that a developer knows in order to create robust code.

Brian:    Mmhmm.

David:  Even then, further monitoring and such. And if you come out of a math program or something and you don't have that software development experience, that's quite a bit of training. And that's actually a big stumbling block

Brian:    Mmhmm.

David:  For these guys because they know the machine learning techniques and they want to deploy something.

Brian:    Mmhmm.

David:  And they'll start hacking stuff together with whatever code, and then they'll go to the development team or IT and say, "Hey, I built this cool model. Can you deploy it?"

David:  And there's no way that's going to get deployed, because it doesn't meet the rigorous standards that are necessary.

Brian:    Right.

David:  So on the flip side, though, when you have the software developers who already know how to make things robust and such, and these guys say, "Hey, you know, I want to try building a regression, or a vision network or something," there's almost a better chance for them to be able to build something that's deployable

Brian:    Mmhmm.

David:  Because they at least had that foundation.
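[Editor's note: a minimal, hypothetical sketch of the kind of test rigor David is describing as the bar for production code. The `predict` function and its numbers are made up for illustration; the point is that a model only ships once its behavior is pinned down by explicit checks like these.]

```python
def predict(price: float, baseline_demand: float) -> float:
    """Toy forecasting model: units sold as baseline demand minus a price effect.

    A stand-in for whatever model a data scientist wants deployed; the
    coefficients here are illustrative, not from the episode.
    """
    if price <= 0:
        raise ValueError("price must be positive")
    # Never forecast negative sales, no matter how high the price.
    return max(0.0, baseline_demand - 0.5 * price)


# Unit tests: pin down behavior on known inputs.
assert predict(10, 100) == 95.0
assert predict(300, 100) == 0.0  # clamped, not negative

# Input validation: bad inputs fail loudly instead of silently mispredicting.
try:
    predict(0, 100)
    raise AssertionError("expected ValueError for non-positive price")
except ValueError:
    pass

print("all checks passed")
```

Regression testing extends the same idea over time: the saved assertions guard every later change to the model, which is the foundation David says developers bring that pure math graduates often lack.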

Brian:    Mmhmm. How does this tie into... so another topic that's been in the ether around me right now is model trust. If you're coming at it from that technical standpoint, and if you think or believe that model trust is an issue (this is: do my stakeholders believe what I'm showing them?), how does that play into your process? How do you get to the point where you don't wait until you have a great model

David:  Mmhmm.

Brian:    But then you find out someone won't use it?

David:  No, that's a great question. In the past, I worked on a forecasting project at a client, and I had a third party working on something. They developed some cool model, went to the stakeholders, and said, "Look, here's our forecasting model." And the stakeholder looked at it and said, "Yeah, okay, that's kind of interesting. I'm never going to use it." Right? And then later on, when I was supervising a forecasting project myself, I told my team, "Look, the first delivery we give to the stakeholders, make sure it's something super basic and super obvious, so that they know exactly what we did, why we did it, and they look at it and say, yeah, that makes complete sense. That's what I would have done."

David:  You know, give them just a basic regression or a basic ARMA(1,1) model or something, no bells or whistles. And from there, the next iteration, make it a little more complex

Brian:    Mmhmm.

David:  And then a little more complex, because you have to get them on board and they have to be nodding their heads, either asking maybe one question at each iteration, like, "That doesn't make sense," or nodding their heads and saying, "Yeah, that makes complete sense." But if you jump all the way to an advanced model, like you said, and try to throw that at them, and it doesn't make sense and doesn't agree with what they already saw, then you've completely lost them.
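[Editor's note: to make David's "start super basic and obvious" iteration concrete, here is a hypothetical sketch in plain Python of what a first-iteration forecast deliverable might look like. The sales numbers are invented; a naive last-value forecast and a fitted straight line are the sort of no-bells-or-whistles baselines a stakeholder can verify by eye before later iterations add ARMA terms or more complex models.]

```python
def naive_forecast(series, horizon):
    """Iteration 0: repeat the last observed value. Obvious to any stakeholder."""
    return [series[-1]] * horizon


def trend_forecast(series, horizon):
    """Iteration 1: ordinary least-squares line through the history, extrapolated."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(series)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    intercept = y_mean - slope * x_mean
    # Forecast the periods n, n+1, ... just past the observed history.
    return [intercept + slope * (n + h) for h in range(horizon)]


# Toy monthly sales history (made up, deliberately close to linear).
sales = [100, 104, 108, 112, 116, 120]

print(naive_forecast(sales, 3))  # [120, 120, 120]
print(trend_forecast(sales, 3))  # [124.0, 128.0, 132.0]
```

A stakeholder can check both outputs mentally ("sales grow by about 4 a month, so 124 next month makes sense"), which earns the head-nod David describes before the next, more complex iteration.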

Brian:    Yeah, yeah, yeah. Get in touch early.

David:  Yeah.

Brian:    Stay in touch. You know, that's kind of the kicker there. Do you think this overall trend of success, or lack thereof, in the industry (formerly it was big data projects, now it's AI) is trending in the right direction? Like the success rate? I mean, it's been consistently low for a long time. Where's it going?

David:  Yeah, I mean, there's two things to that. One is that there's always a natural failure rate, even with well-intentioned and well-designed projects.

Brian:    Uh huh.

David:  But then the second thing is we're having this difficulty because there used to be a bottom-up push for data science projects, where, you know, the developers would say, "Hey, I'd love to do this, I'd love to do this." And management said, "Stop bothering us and just do your job." Recently, what's happened is it's been top-down, right?

Brian:    Mmhmm.

David:  So from board level they'll say, "Hey, we need to start doing something with AI, you know, with machine learning, with big data." And they'll start throwing budgets at it. But what happens then is that you've got, you know, a couple hundred thousand or a couple of million from board level coming down to start hiring a team and buying technology.

David:  So, these budgets are made at an annual level. After one or two years of, you know, building a team of 20 or 30 people, the budget is up for renewal, and people are going to start asking questions: "What value did we get?" Well, even at the inception,

Brian:    Mmhmm.

David:  A lot of these teams, there wasn't a clear mandate for them. There was a top-down hope that we could leverage the buzzword, right?

Brian:    Right.

David:  But two, three years down the line, we've spent the millions and we've bought the systems and everything, and we weren't really sure from the start why. So of course there's no tangible value after two or three years, and there's still no tangible direction. And then you start to see these things burn out. I've seen some programs already where, after a year or two, it's sort of like, where are we going with this? What's happening? But for those that really understand what they're doing and why they're doing it, there's a very real chance of seeing value, and you're seeing a lot of the large companies really capitalizing on these efforts when they initiated them with a clear vision and a clear purpose.

Brian:    Following onto that, then: do you think there's a place for what I sometimes call laboratory mode? Which is, maybe the business stakeholder understands that, you know, "we're also a data company" (everybody says that now), "I know we need to be doing something with this, but I don't understand the technology. I want to know that we're flexing this muscle, we're rehearsing, we're playing scrimmage games."

David:  Mmhmm.

Brian:    Even if we don't create value. Is there something there to be to having a laboratory kind of model, which is maybe you do let some of your top talent go try to build a you know a deep learning network or something like that, back out a project from something, not exclusively in their work, but this is like the only 20% time kind of concept. Is there some value in that or do you think it's unnecessary to be totally playground uh with no no dirt?

David:  No, no, absolutely right, absolutely.

Brian:    Mmhmm.

David:  But this goes back to your design thinking, right? So you want to have that initial impetus.

Brian:    Mmhmm.

David:  Like why are we doing this? You know where might it go?

Brian:    Okay. Cool.

David:  Yeah.

Brian:    Awesome. This has been a super good talk. Do you have any closing advice for listeners? You know, we've got analytics and data science folks, technical product managers, some designers. I know that's a wide group of listeners, but if you had some closing advice for them?

David:  Yeah, I would say, for people who are in the field and looking to bring value in their companies: as much as possible, talk to people around you, you know, talk to people outside of your team.

Brian:    Mmhmm.

David:  Outside of your department. Understand what they're doing, why they're doing it, and how you can help, or how the tools that you have could potentially help. Because there are a lot of opportunities within companies; there's just not that communication between departments. So yeah, I would really encourage people to keep that communication open.

Brian:    Nice, nice. And by the way, how's the mandolin going?

David:  The mandolin, I'm not getting a whole lot of time with it, but I was thinking about it today. So, more focused on the guitar these days.

Brian:    Oh okay.

David:  But yeah, yeah, it's really a real source of potential that I'm not tapping into.

Brian:    Excellent. Well, keep practicing.

David:  Yeah.

Brian:    Tell us: obviously I'm going to put a link to your book in the show notes, but where do you publish? Social media, LinkedIn, any of that stuff? Where can people find you?

David:  Yeah. I used to be a little bit better at publishing blogs on LinkedIn and on my website.

Brian:    Okay.

David:  To be honest, I'm really behind on that now. I've been so busy with other things, uh but hopefully at some point I'll start uh blogging some more.

Brian:    Nice. Excellent. And yours, if I recall, is DSI Analytics, right?

David:  That's it. Yeah. DSI Analytics.

Brian:    Awesome. So I'll, yeah, I'll put a link to that and your LinkedIn profile as well. And thank you so much for coming on the show. This has been a really good conversation.

David:  Yeah, no, thanks for inviting me. I'm glad to be here, and I look forward to seeing you in London soon.

Brian:    Awesome. Yeah, I'm looking forward to speaking there. So take care.

