Tom Davenport has literally written the book on analytics. Actually, several of them, to be precise. Over the course of his career, Tom has established himself as the authority on analytics and how their role in the modern organization has evolved in recent years. Tom is a distinguished professor at Babson College, a research fellow at the MIT Initiative on the Digital Economy, and a senior advisor at Deloitte Analytics. The discussion was timely as Tom had just written an article about a financial services company that had trained its employees on human-centered design so that they could ensure any use of AI would be customer-driven and valuable. We discussed their journey and:
- Why, on a scale of 1-10, the field of analytics has only gone from a one to about a two in ten years' time
- Why so few analytics projects actually make it into production
- Examples of companies who are using design to turn data into useful applications of AI, decision support and product improvements for customers
- Why shadow IT shouldn’t be a bad word
- AI moonshot projects vs. MVPs and how they relate
- Why journey mapping is incredibly useful and important in analytics and data science work
- How human-centered design and ethnography are the tough work that's required to turn data into decision support
- Tom’s new book and his thoughts on the future of data science and analytics
Resources and Links:
- Website: Tomdavenport.com
- LinkedIn: Tom Davenport
- Twitter: @tdav
Quotes from Today’s Episode
“If you survey organizations and ask them, ‘Does your company have a data-driven culture?’ they almost always say no. Surveys even show a kind of negative movement over recent years in that regard. And it's because nobody really addresses that issue. They only address the technology side.” — Tom
“Eventually, I think some fraction of [AI and analytics solutions] get used and are moderately effective, but there is not nearly enough focus on this. A lot of analytics people think their job is to create models, and whether anybody uses it or not is not their responsibility... We don't have enough people who make it their jobs to do that sort of thing.” — Tom
“I think we need this new specialist, like a data ethnographer, who could sort of understand much more how people interact with data and applications, and how many ways they get screwed up.” — Tom
“I don't know how you inculcate it or teach it in schools, but I think we all need curiosity about how technology can make us work more effectively. It clearly takes some investment, and time, and effort to do it.” — Tom
“TD Wealth’s goal was to get [its employees] to experientially understand what data, analytics, technology, and AI are all about, and then to think a lot about how it related to their customers. So they had a lot of time spent with customers, understanding what their needs were to make that match with AI. [...] Most organizations only address the technology and the data sides, so I thought this was very refreshing.” — Tom
“So we all want to do stuff with data. But as you know, there are a lot of poor solutions that get provided from technical people back to business stakeholders. Sometimes they fall on deaf ears. They don't get used.” — Brian
“I actually had a consultant I was talking to recently who said, you know, the average VP/director or CDO/CAO has about two years now to show results, and this gravy train may be slowing down a little bit.” — Brian
“One of the things that I see in the kind of the data science and analytics community is almost this expectation that ‘I will be handed a well-crafted and well-defined problem that is a data problem, and then I will go off and solve it using my technical skills, and then provide you with an answer.’” — Brian
Brian: If you've been working with data and analytics, and now AI for some time, you most certainly know my next guest, Tom Davenport. Tom has written countless books on the field of analytics and has a lot to say about this space. He's talked with leaders from all sorts of different industries, and he's had some really interesting insights to share around my questions, which I think were a little bit unique for Tom in terms of talking about design, and empathy, and how we connect our software applications, and solutions, and products that are based on data with the customers and people that are going to use them. He shared some really interesting opinions about how we're scoring as an industry in terms of the quality of the work that we're doing to actually deliver engaging services for our customers, and I think you're going to really enjoy this conversation with Tom Davenport.
Brian: Welcome back to Experiencing Data. I'm super happy to have... I don't know if I can call you Mr. Analytics, but we have Tom Davenport on the phone today. How is it going, Tom?
Tom: That's Dr. Analytics to you.
Tom: No, I'm just kidding. I'm just kidding.
Brian: No, it probably should be after all the books and the writing you've done. So I don't think I need to introduce you too much to this audience. You're the President's Distinguished Professor of Information Technology and Management at Babson College, and you do some advising to Deloitte Analytics, and like I mentioned, you've written a lot of books. So it's great to have you on Experiencing Data to share some maybe a little bit different ideas on some topics that maybe you haven't covered before in this world of data today. So I'm really glad to have you here.
Tom: Nice to be here. Thanks for having me.
Brian: Yeah. Yeah, for sure. So the first thing I wanted to ask you was, what are the leading companies doing today to be more human and business-first in their thinking instead of tooling- and data-driven? So we all want to do stuff with data, but as you know, I'm sure, there are a lot of poor solutions that get provided from technical people back to business stakeholders. Sometimes they fall on deaf ears. They don't get used. What are some companies doing to get around this and start actually making this stuff useful, usable, engaging, valuable?
Tom: What a novel idea. I mean, I know you focus on this a lot, but most organizations don't, and I... It's funny. I just had a... finished a webcast an hour ago sponsored by a vendor about kind of productionizing machine learning, and it was all about all of the different you know back-end tools, Kubernetes, et cetera to make all of that happen, but nary a word about how people sort of use this stuff, and decisions, and strategies, and actions, so I... Whenever I run across it, I think it's interesting enough I should write about it, and I, earlier this week, published a piece in Harvard Business Review about an organization that did that.
Tom: It's TD Bank's wealth management division. It's called TD Wealth, and there was a woman there called Atanaska Novakova who's kind of head of operations for TD Wealth. She felt that they hadn't really had enough demand for this sort of stuff within her business. You know a lot of supply, of course, which is what all the technology is about, but not enough people who really sort of embraced the idea and understood what it meant for their customers and their business models, and so she's sponsored a big program to address the issue.
Tom: I first became aware of it. They came down to Cambridge, and they were meeting with some MIT people, and I think I was expected to provide the sort of you know how's AI really being used in business component. But then, they kept running several versions of this, and they went to places like Israel, and Silicon Valley, of course, and the UK. But I think more importantly, the goal was to get people to sort of experientially understand what data, and analytics, and technology, and AI are all about, and then to think a lot about how it related to their customers, so they had a lot of time spent with customers, and understanding what their needs were and so on, and you know try to make that match with AI.
Tom: And it was rare enough, so I thought I'd write about it. If you survey organizations and ask them, "Is your company... you know does it have a data-driven culture?" they almost always say, "No." Even surveys show a kind of negative movement over recent years in that regard, and it's because nobody really addresses that issue. They only address the technology and the data sides, so I thought it was very refreshing.
Brian: On that topic then, if we're on a scale of one to 10 on whether or not we're producing desirable outcomes and not just building plumbing, and infrastructure, and the theoretical ingredients to bake a cake that people actually want to eat, if we were at one 10 years ago, where do you think we are now if 10 would be like the optimal sense of data is a natural part of decision making, tools, and solutions, and models naturally fit into the course of people's jobs and the work they're doing, customer experience? If we were one 10 years ago, where are we now?
Tom: Well, you know maybe two, two and a half.
Tom: Maybe three. I just... the sad thing is we don't really know in that nobody really measures that very often. I mean, I once suggested and wrote a little piece for the International Institute for Analytics that's in the public domain I think about... What did I call it? Production Score. Something like that. Your percentage of analytics projects that actually make it into production deployment, and I asked a few companies what their percentage was. And some said zero, which was really depressing.
Tom: One woman who really focused on it a lot, she works for Kemper Insurance now, said it was a hundred, and she really prided herself on that, but most organizations don't even measure it. Even if it goes into production deployment, that doesn't mean that people are actually using it effectively, so...
Tom: I always ask, you know "How is this being used?" and decisions and actions are being embedded in products and services, or whatever. And most of the time, people don't even know.
Brian: So why are we spending millions of dollars on it? Is it because, "Well, we need to have all this infrastructure in place," and that's what the technical people said we need to do before we can do any of the "cool stuff," and so we're all still just at this building blocks phase? We're making ingredients, but we're not creating any dishes yet? Is that kind of where we're at still?
Tom: Good for the economy, Brian. That's why we're doing it.
Brian: I know.
Brian: I know. We got to keep Amazon Web Services in business, right? So.
Tom: Exactly. Jeff Bezos doesn't have enough money. He needs more, so. No. I mean, I don't know. I think... Why are we doing it this way? We sort of think it's the important part to do, and we sort of assume that when we have the water and we lead the horse to it that it will drink, but it's always been problematic. You know I think slowly, people do... you know it's, that's why I said it was one 10 years ago. It's two or three times better now, but it's still a long way to go.
Tom: Eventually, I think some fraction of this stuff gets used and is moderately effective, but there's not nearly enough focus on that set of issues. And we also don't have enough people who make it their jobs to do that sort of thing. I mean, a lot of analytics people think their job is to create models, and whether anybody uses it or not is not their responsibility, so it is not the data scientist's responsibility. And by the way, you know how many data scientists are really trained in any sort of human psychology, sociology, anthropology, et cetera? There just aren't very many people whose job it is to do that.
Brian: Mm-hmm (affirmative). Mm-hmm (affirmative). Do you feel like there's a natural progression to fix that through the process of upskilling the data scientist on these non-technical skills, or does it fall to other people to do that, and it's like new roles that need to be hired, or the business people need to fill those shoes? Like where do you see that going?
Tom: Well, I mean, the good news is a lot of the things that data scientists do are increasingly being automated, so I've been a big fan of automated machine learning because it empowers, you know, somewhat quantitatively oriented business people to, you know, create models often quite successfully without the need for a huge amount of technical training. And I was saying this this morning on my webcast. I was working with a local, but I guess also global, automated machine learning company, DataRobot, and they said, "Would you talk to some of our customers?"
Tom: And some of them were you know more oriented or, "What does it do for data scientists?" But I talked to a guy at the Royal Bank of Canada, which has always been one of the more sophisticated users of technology and data in Canadian banking, and I think they're the largest and most successful company in Canadian banking.
Tom: And this guy was a pretty senior manager. He said, "Look, you know I have a choice. I can work with a data scientist who probably doesn't understand my business, probably doesn't care that much about my business, knows a lot about you know developing models, but doesn't know anything about my customers, or my business model, or anything like that, and probably won't you know care whether it gets successfully implemented. They just want a kind of a unique modeling challenge, or I can work with a business analyst who understands my business, and knows my customers, and knows the people who are going to implement these tools. And frankly," he said, "I'll take the latter whenever I have a choice about it." And so it's good, I think, that we're going to have some options and this tyranny of the you know highly, technically, analytically focused data scientist will diminish, if not end.
Brian: You had written another article about Pfizer, and they have an analytics and AI lab that's centralized within the business. And this got me thinking about how, in the field of product design and user experience design, there are different models where sometimes design is a centralized organization, and sometimes you have designers that belong to different business units. So I wondered, in the field of analytics and AI, are there factors that go into deciding whether to park all those people in a central area or to start pushing them out into the business units and helping them learn the domains they're working in? Do you have any insights to offer about how to choose which of those models works?
Tom: Well, yeah. I've really seen it evolve over time. So when I first started writing about analytics in the early 2000's, most organizations didn't really recognize analytics as a strategic resource, and so they had it highly decentralized. You know a little bit in marketing, a little bit in risk management if it's a bank, a little bit in maybe quality or whatever operations. And then, as they realized that it was strategically important, they said, "Wow, you know we really need to kind of get a critical mass here and attract people to it, and so let's centralize it."
Tom: And a lot of them did, and they started to at least have a center of excellence, if not have everybody report to the same organization. At the time, I thought that was generally good. It's recognized as strategic. One of my co-authors on Competing on Analytics, Jeanne Harris, did some work at Accenture suggesting that centralized organizations were more likely to lead to higher job satisfaction on the part of the analysts and also greater retention levels.
Tom: But then, over the last few years, I've seen it move back in the other direction in that when everybody starts to realize that analytics is important, you have business leaders say, "Great. You know it's been wonderful having you work with us as a part of a centralized analytics group, but I'd really like you to be part of my team, and so you can really understand my business and so on."
Tom: And so in a lot of quite sophisticated organizations, the flow has started to go back out again, and you know I think both centralized and decentralized have their advantages and disadvantages. But if you really want to get people who understand the business and who care about the business, that probably leads you in a more decentralized direction. You can still have some ties to a central group, you know, some central coordination, a little bit, but it's maybe the second most important structural unit rather than the first. You know it's kind of a matrix where one is a solid line, and the other, centralized one is a dotted line.
Brian: In the case where you're pushing your data specialist out into the business units, do you see... Are you familiar with whether or not the additional non-technical skills that are required to produce successful models, software applications, new products and services, does all of that go into those business units? Do you think like as a team? Like for example, you need software engineers if you're building an application, right? You might have a predictive model that you've built, but that needs to get implemented into an application, which then may have a user interface piece that needs to be designed, and you know and so on, and so on, and so on. Are you seeing any movement to have those teams be created within these business units as well or not so much?
Tom: Well, less so, I would say, because those are some pretty sophisticated technology skills, and you tend to find those in IT organizations. But there are now, have always been, probably always will be, these sort of renegade IT groups that are found in various parts of the business. Shadow IT, people refer to it. There's a lot of names for it, and people can do some pretty high-quality work there. You particularly find that in marketing these days, where Gartner has argued that there's more money spent on technology in marketing than there is in IT.
Tom: I never saw any real data collection about that, as is sometimes the case with Gartner, but more and more all the time, official technology groups are in marketing. So I don't know. In a way, that's just another form of this centralized approach to decentralization, in that if you move it into marketing and you have a well-established group that does technology and marketing, it's kind of centralizing it in a fashion, but not quite as centralized as having one group serve the entire organization, you know, all functions, all business units.
Brian: We talked about this movement from one to two, and we could say, "Oh, it's a 100% increase." Right? But to me, two is pretty lousy. So at some point, I would think someone's job is at risk if you continue to work at a quality level of one or two, and I'm curious. Is the business landscape changing in terms of how many at-bats you get before there's a strikeout, and it's like, bring in the new team?
Brian: I actually had a consultant I was talking to recently who said you know the average VP/director or CDO/CAO has about two years now to show results, and this gravy train may be slowing down a little bit where it's, "Here's millions of dollars. Go ahead and do some of that data stuff, and let us know when it's done." Now, it's, "What did I get for that?" Are you seeing more questions like, "What am I getting for this?" And if it's not happening, there's job turnover, like leadership change-out? Like at what point does it start to matter?
Tom: Well, I think at those senior levels, it really does matter a lot. I've just done some recent work on what the job of the CDO, the chief data officer, really is, and I think there's less and less patience with this sort of, "Oh okay. You're going to fix up all of our data. Here's several million dollars for a master data management project, and tell us when it's done." And first of all, they're never done very successfully. I think this becomes kind of an abstract effort in data modeling that rarely produces anything of value, and so those CDOs do get fired.
Tom: And unfortunately, you know we haven't had a clear definition of what that job is about, so they go on and get another job elsewhere. One of my answers is that there are really a lot of different jobs involved in being a senior data executive, chief data officer, whatever, and the ones that really provide value, or the ones where the value is measurable, are what I call the offense-oriented jobs: building applications that really get used, or building products and services that have data and analytics embedded in them. Being a chief data and analytics officer is, I think, inherently more valuable than just working on the back-office data management stuff.
Tom: Now, some of those objectives are a little too difficult to achieve. I think data monetization is something that a lot of organizations talk about, but it's really hard to do, and so that's likely to lead to some you know short job tenures as well if your focus is purely on monetization. But if you know if it's creating applications, and products and services that have data and analytics as a key part of them, then you know that's doable, and I think organizations appreciate that.
Tom: At lower levels, it's not really fair to blame the people for that when their bosses aren't telling them what they should be doing and how their success should be measured, but there have been some examples of that at Procter and Gamble, which was, you know, one of the first organizations I worked with. They created a really centralized group of analysts and had them deployed to different parts of the business, embedded within them. Their primary evaluation criterion was to have the leader of that business say whether the embedded analyst had helped to produce at least one breakthrough business result over the past year. And I thought that was a great idea, and it worked quite well for them. I think it ensured a dialogue between the business leader and the analyst, and I think finally they thought it was working so well, they didn't even need to do it anymore.
Brian: Is there an ingredient from the tech world and the digital natives that you think is perhaps omnipresent, but it's just not being adopted by the non-digital natives, a skillset, a way of working, whatever it may be that they could be adopting today and perhaps seeing better outcomes with their work that's just not natural?
Tom: Well, I do. You know, a few years ago, I wrote, or co-authored, a book called "Only Humans Need Apply" about how AI affected jobs and skills, and it was really about this idea of augmentation, that a lot of jobs would be augmented by smart machines. I set out to find a number of people who were already working in them, and I found a fair number, interviewed them, and put them in the book. And I would sort of ask them, you know, "What made you do this? How did you end up being someone who was comfortable working closely with machines?"
Tom: And they all said... I mean, they had various explanations, but they all said, "Oh, I was always really interested in technology, and how it worked, and how it could make me more effective, and so on." So I think we're all now in a position where we need to do that. I don't know how you inculcate that exactly, how you teach it in schools, but I think we all need curiosity about how technology can make us work more effectively. It clearly takes some investment, and time, and effort to do it. Not everything is going to work. You, you know, sort of try it out and see if it's a possibility.
Tom: But you know I remember talking to a guy recently on a podcast, and he said, "I was a radio broadcaster, and as some of those radio jobs started to go away, I said, 'Well, maybe podcasts are the answer.'" But most radio people didn't do that and didn't jump into podcasting, so it's not a widely distributed trait, I wouldn't say, and we haven't figured out as a society how to broadly inculcate it.
Brian: Mm-hmm (affirmative).
Brian: In terms of solving problems, like one of the things I talk to my subscribers, and my clients, and my audience is about the mindset of the designer, really good designers is to go out and be problem-finders just as much as they are problem-solvers. And one of the things that I see in the kind of the data science and analytics community is almost this expectation that I will be handed a well-crafted and well-defined problem that is a data problem, and then I will go off and solve it using my technical skills, and then provide you with an answer.
Brian: And I think part of the challenge here, when you're creating applications, and services, and products, is that the problem is rarely well-defined upfront, and you may have a person asking you for something that sounds right on the surface, and maybe it's partially properly informed, but there's often a good chance that it's not well-informed. So the classic example, like, "Can I have some machine learning please on this next product?" That's not a... you know, that's a tactic. That's a potential implementation model for something, but it is not a problem.
Brian: So if you had, you know, a consulting client, and someone from the data department at that company says, "My business stakeholders don't know how to provide my data science and analytics team with good problems to work on," how would you suggest that they get better at figuring that out? Is it their job to go and work with the business people to figure out what is actually needed, like what is the opportunity? Or is that a training on the business side that needs to happen, and you kind of see the data science and analytics groups as, like, they're there once it's well-defined, then come back to us? How do you see that?
Tom: I think I agree with you that it should be a responsibility of the data science and analytics people, and I agree that it's often not done very well, and, you know, it's hard. It's hard to systematize it or teach it. I wrote another book a number of years ago called Keeping Up with the Quants with a Korean statistics professor, and I ended up having to write the sections that were less statistical, about the beginning and the end of the analytical problem-solving process. And at the beginning is framing the problem, which is what you're describing.
Tom: And man, it was a really hard chapter to write because, you know, it's very unstructured. It involves asking a lot of questions. It involves sort of thinking about how other people have solved similar problems, and eventually translating it into a technology and a set of data that might address that problem, but it's really, really hard to systemize. And then, the last part was communicating and acting on results.
Tom: That was a little bit easier to describe, but again, one that is often not done very well by analytical people. So, you know, I think you can turn over the middle stages, the solving-the-problem-related stages, pretty well to an analytics person, but framing the problem, and communicating and acting on the results, really need to be partnerships that involve close relationships.
Tom: There was a guy I used to work with. He was at Procter and Gamble. Now, he's running an analytics program at the University of Cincinnati, Glenn Wegryn, who said, "We're not selling analytics or models. We're selling trust, and that only is going to be established by longer-term relationships between the analyst and his or her client, the business leaders." Establishing that kind of trust and relationship is not something a lot of analytics people are necessarily that great at doing.
Brian: Sure. From the earlier conversation though, it sounds like if these teams, these data teams and analytics teams, want to be successful in the long term, they're going to have to up their game, I would think, because at some point, either people are just going to stop using these tools completely, I mean, not that there's maybe heavy adoption already, or it's going to be changed over, and someone is going to come in, and they're going to have a process for doing it that's better.
Brian: They're going to have well-armed people that know how to have a good why conversation with somebody, "Why are we doing this?" They're having, you know, "How might we solve this?" conversations before there's any technology created, past prototyping, getting small solutions out quickly to get feedback, and building that kind of trust. I mean, to me, it seems like it's such a ripe opportunity for groups that don't look at themselves as only being technical in their work, but feel they have a responsibility to deliver outcomes, not just outputs. I don't know. Am I crazy? Like...
Tom: No, but I you know I think, as I was saying earlier, I think it's more likely that you'll have machines taking over the solving the problem stages, and it will be the relatively less technical business people who will do the you know the framing of the problem and the acting on it when it's done, and that you know computers are not going to be good at that for a long time.
Tom: But they are pretty good at solving the problem, and doing feature engineering and figuring out how to create the best fitting model, that part will be automated. And what will be left for the humans to do is the stuff that we're talking about.
Brian: Mm-hmm (affirmative). This actually dovetails well into my next question. One of my favorite design tools, whether you call it a design tool or not, is called journey mapping, and this came up in an MIT Sloan article you wrote recently, where you mentioned this is one way to enable better digital transformation. Can you talk to me a little bit about how that came to... What was your exposure to that, and how did that end up in this article?
Tom: Well, I've talked about it for a while with companies and vendors that did it, and basically, I think, you know, it's very enlightening to see the path that customers follow as they're trying to get their job done with a company. Say you're trying to accomplish a particular financial transaction or you're trying to buy something online, and often, people end up having a pretty circuitous route through the various channels that companies provide. You know, they call, they look on the website, which is always what the company wants them to do, but they can't get everything they need, and so they call some sort of customer service line.
Tom: Or probably not. They don't end up writing letters about it anymore, but, you know, banks do this journey mapping all the time,
Tom: And they see that just trying to do something supposedly simple, like replacing a lost ATM card, can cause you a huge amount of problems. I mean, I'm having this now with my nephew. I invested in a business, and he sold it, and he's trying to send us some money, and he, I suppose, had the wrong address, so when he tried to wire the money... The money is lost somewhere between Bank of America and Wells Fargo. I don't know. They're both blaming the other. It's horrible.
Tom: And if, you know if the amount of money weren't fairly substantial, I'd say, "The hell with it. I'm just going to abandon both the money and the relationship with the organization,"
Tom: But organizations, you know, inevitably find out that it's a pretty tough life that we subject customers to, and then they start to try to simplify it. And, you know, I think we've still got a long way to go in most organizations toward making customer journeys simple.
Tom: But in that article, I think it was actually my co-author, Andrew Spanyi, who put it in, but we were just arguing that most of these digital transformations don't really focus enough on making things better for customers. In many cases, they're involved in saving cost or internal kinds of transactional things that maybe benefit the company, but don't help the customer at all. And we were just kind of arguing that if you're going to do a digital transformation, your primary purpose should be making life better for customers.
Brian: Yeah, yeah, and I think a journey map is a great way to get cross-functional teams to see it, because you can actually visualize how someone moves through all the touch points of your company, whether it's interactions, applications, or physical products, and just see what it's like. And then, when you map out, "Well, here's the customer experience," and underneath it, "Here's what's happening internally," well, you wonder why it doesn't work. This person has actually touched nine different business units in order to do this money transaction.
Brian: They don't care about any of that, right? You just want to get money from A to B. I don't care if it went to a holding bank for 10 minutes. And, "Oh, you want to do a name change," and that's not in a centralized database, so you have to update it in four places, and then the name has to match, and blah, blah, blah. It's a nice way to actually start to visualize the friction and help groups see it.
Brian: It's like, well, we could do our part right, but if we don't get you know the rest of the journey fixed, then all this great stuff we did perhaps with data or AI is going to fall flat at step four because they're not in the room. They're not participating with us, and you know you're not looking at it from a customer perspective. The internal business divisions are starting to define it, so I think it's a really powerful tool for really helping teams see and visualize that journey.
Tom: I do too, and I think that brought a lot of transparency to what customers experience. And now, I've been doing some work lately in this relatively new area of process mining, which lets organizations understand much more about what's really happening in their internal processes. You can see, "Oh, how is the process supposed to work?" Some people call that the happy path, and then you see all the different variations that the people performing that process, say an order-to-cash kind of process, end up going through.
Tom: And in many cases, they're not taking the happy path at all. They're taking all sorts of side roads,
Tom: And it's, I think, really quite interesting that in both customer-pacing processes and in internal processes, we're starting to see through analytics all of the crazy variations that we have. As somebody said, I was at an event yesterday with this company, Celonis, which is the leader in that industry of process mining. Before we started to use these tools, we had, we touched POs so often that we needed hand sanitizer. I mean, they're just so many operational screw-ups within organizations that are becoming more and more transparent now.
Brian: Mm-hmm (affirmative). Mm-hmm (affirmative). I would say, though, that to me, one of the opportunities out there, especially if you want to talk about going from good to great, is how do you handle the user experience, and this could be for an internal customer or an external customer, when they don't go down the happy path? I think we've all seen it before where you're filling out an application and maybe you answer something weird, but then the application is like, "Oh, hey, we noticed that you don't... Oh, you don't have any kids. Would you like us to skip past these parts, because we know they don't apply to you?" You're like, "Yes," and then it asks you some other question and you're... There's a little moment of delight and surprise that they anticipated your need before you had it, or at least they accommodated it.
Brian: And while we can't always design a great experience for every single branch of this decision tree, I think a lot of places get it wrong where something basic just fails. "Please call 1-800-9... you know, ask for New Accounts." I mean, I was on FedEx trying to get a FedEx nine-digit account number just so I could buy a shipping label, and it was unbelievable how bad the process was, and I'm just trying... I'm trying to give you my money here.
Brian: Then, you call them up, and they can't even figure out how to... "I'm sorry, but New Accounts is closed," and it's like, "All right." I literally can't give you my money right now, and this is really basic stuff. So I think if someone had journey mapped that and actually gone through and looked at that process... First of all, why are there three different account types on fedex.com? There's an online account, an account with an account number, and then there was some other type of account. It's like, "I don't know. I just want to ship something. I just want a shipping label." So I think it's a good way to see this friction, and to me there's a lot of low-hanging fruit out there to be solved.
Tom: Yeah. For years, I argued... I don't know if you've talked about this on your podcast, but for years, I argued that corporate anthropology was the answer to a lot of these things, where we do ethnographies on how people do their work and so on. I think we need this new specialization of data ethnographers, so you could understand much more about how people interact with data and applications, and how many ways they get screwed up.
Tom: But I'm not sure there's anybody being trained in that specialization.
Brian: Well, I mean, that's a core competency of human-centered design, right? It's developing empathy, and the first step involves going out and talking to people. Actually, it's mostly about listening and learning to ask good questions so that you can tease out some of these situations, because a lot of them are not going to be handed to you on a silver platter. People may not elegantly explain what was wrong. They just know the symptom: "I don't know why. I just hate that app. I have to open Tableau, and it's like a million clicks," and that's the end of it.
Brian: They're not going to say, "The reason I don't use this chart is because the default period is only nine days, and I really need 12 days, and I wish it was a bar chart instead of a line chart." They're not going to voice the problem in that way. So this is something I'm trying to work on in my training seminar with data scientists and analytics people: this process of going out and, as you call it, doing ethnography. It's doing the research, it's shadowing people to learn about their jobs so that you can figure out, "How do I slot in my solution in a way that, hopefully, they'll even want to use it?" They will start to see you as an advocate. Your solution is helping them be more productive and more successful in their work, as opposed to a tax, right? "Oh, here's another thing I've got to use."
Brian: Another imposed solution that I had no say in how it was going to be implemented, but we're forced to use it in our job. This is why stuff doesn't get used, and we need to have these conversations, or else we're just going to keep pumping out code, keep pumping out models, and then they're going to hit the floor, and then we move on to the next one. You know?
Brian: Jeff Bezos makes another few thousand, so... We're getting closer, and this has been a great conversation, but you've written all these books, and I wanted to ask you something. You recently put out The AI Advantage: How to Put the AI Revolution to Work, and in your summary, you mentioned a key focal point was, "Don't go for the moonshot. Look for the low-hanging fruit."
Brian: Can you help?.. Maybe help a team, especially one that's a lot of them see that we need to have this giant infrastructure in place, and you know the data engineering piece, and the data pipeline needs to be in place, and it's kind of like all these ingredients, right? We need to buy all of this stuff at the store before we're ready to cook. How do you work small? How do you bite off a low-hanging fruit when maybe there's a perception that I need all of these ingredients before I can cook anything that someone wants?
Tom: Well, you know first, I argue yes, you should pick the low-hanging fruit instead of aiming for a big moonshot, but I think the low-hanging fruit can be in service of a much larger objective. You know?
Brian: Mm-hmm (affirmative).
Tom: I sometimes say, "Think big. Start small."
Tom: And thinking big might... You know pick an area of your organization that really needs transformational change maybe like your customer relationships and say, "Well, how can I, how can I do that? You know I could build a chatbot, but you know that's likely to only be one little piece of this. You know maybe I should do some journey mapping with machine learning. Maybe I should have even some simple rule-based system." So each application in AI is typically quite narrow, but it can be in the service of a larger objective that could really be transformational if you did enough of it. And I think that transformational objective is what might justify the big investment in infrastructure. Each individual application you know may only use a piece of it, but together, you know it would be worth all of that effort and spending.
Brian: And looking out to the future, you've done so much already in the field of analytics and business. You've written a ton and offered a lot of insight. Where are you looking when you think ahead? You know what's next for Tom D.?
Tom: Well, I'm starting a new book... It's somewhat like, not the last book I wrote, but the one before that, Only Humans Need Apply. But it's kind of a... I don't know if you remember, there was a guy named Studs Terkel who wrote a book called Working, about how people work. He interviewed a lot of people and so on. So what I'm trying to do is find examples of how the future of work is already here, the future of work now. It's people who are already working in close relationships with machines, and the machines are doing some of the intellectual heavy lifting, but the people are still doing something quite useful as well.
Tom: And I don't know. I just published in Forbes my first example of this about a digital life underwriter at an insurance company who now only looks at the hard parts of an application because the easy parts are all done by machine. You know I hope to find, I don't know, 40 or 50 of these things and put them together in a book, so I've got a long way to go I guess.
Brian: That's great. Any closing advice for our listeners that you'd like to convey? This is, you've shared so much with us already, but I wanted to give you a chance to kind of have some last words on that.
Tom: I got an email from a guy I've worked with, Tom Redman, who's an expert in data quality. He's got, I don't know, a PhD in some technical field, and he worked at Bell Labs. He had looked at this piece about TD Wealth and how they were changing their culture to be more data- and technology-oriented, and to do human-centered design in their business, and he said, "That's really the hard stuff. It's great that they're taking that on." So I would just like more companies to adopt that approach and do something other than just throwing technology at the problem.
Brian: Fully agree.
Tom: I'm sure you do.
Brian: Yes. Tell people where to find you. I mean, you're pretty easy to find. Is tomdavenport.com the best place, or how do you like people to reach you?
Tom: Yeah, that's probably the best place, or connect with me on LinkedIn. I tend to rarely have a thought I don't publish, and most of it ends up on LinkedIn in some way or other.
Brian: Awesome. Well, thanks again for coming on Experiencing Data today. It's been great to have you.
Tom: My pleasure. Thanks for orchestrating it all.
Brian: Yeah. Cheers.
Tom: See ya.
Brian: As a follow-up to my conversation with Tom Davenport, we talked about some of these skill gaps that are missing in the data science and analytics space around being a problem-finder, about talking to customers to figure out what actually needs to be built to help people have a better experience or to do their job better. There actually is a place to get this training. In early 2020, I'm really excited to be piloting a new seminar called Designing Human-Centered Data Science Solutions.
Brian: If you're tired of building solutions that don't get used or that are undervalued by your business sponsors or your end customers, I hope you'll join a small cohort of students. We're going to work together as a class, delivered online, to learn how to use human-centered design techniques to make the data work that you do more compelling and to make it stick. I'm sure nobody likes to work on solutions for months and months at a time, only to have them fall on the floor and not get used.
Brian: My goal is to help you learn some of the upstream skills that mix product management, product design, and consultation skills that will help you figure out what customers really want and how to prototype quickly and get feedback quickly before you invest too much time, energy, and money into building something that doesn't end up getting adopted by your customers.
Brian: So if you're interested in this, please go to designingforanalytics.com/seminar. There's an early pre-registration button there you can click, and the first cohort of students is going to get 50% off registration. You can also contact me, Brian, B-R-I-A-N, at designingforanalytics.com, if you have questions or you're interested in sending in a team; we have some special team-based pricing as well. It's great for analytics translators, analysts, data scientists, and analytics practitioners, as well as technical product managers who may be trying to upskill their design chops. So I hope you'll join me. It's going to be really fun. Take care.