041 – Data Thinking: An Approach to Using Design Thinking to Maximize the Effectiveness of Data Science and Analytics with Martin Szugat of Datentreiber

Experiencing Data with Brian O'Neill (Designing for Analytics)

The job of many internally-facing data scientists in business settings is to discover, explore, interpret, and share data, turning it into actionable insight that can benefit the company and improve outcomes. Yet, data science teams often struggle with the very basic question of how the company’s data assets can best serve the organization. Problem statements are often vague, leading to data outputs that don’t turn into value or actionable decision support in the last mile.

This is where Martin Szugat and his team at Datentreiber step in, helping clients to develop and implement successful data strategy through hands-on workshops and training. Martin is based in Germany and specializes in helping teams learn to identify specific challenges data can solve, and think through the problem solving process with a human focus. This in turn helps teams to select the right technology and be objective about whether they need advanced tools such as ML/AI, or something more simple to produce value.

In our chat, we covered:

  • How Datentreiber helps clients understand and derive value from their data — identifying assets, and determining relevant use cases.
  • An example of how one client changed not only its core business model, but also its culture by working with Datentreiber, transitioning from a data-driven perspective to a user-driven perspective.
  • Martin’s strategy of starting with small analytics projects, and slowly gaining buy-in from end users, with a special example around social media analytics that led to greater acceptance and understanding among team members.
  • The canvas tools Martin likes to use to visualize abstract concepts related to data strategy, data products, and data analysis.
  • Why it helps to mix team members from different departments like marketing, sales, and IT, and how Martin goes about doing that.
  • How cultural differences can impact design thinking, collaboration, and visualization processes.


Resources and Links:


Quotes from Today’s Episode

“Often, [clients] already have this feeling that they're on the wrong path, but they can't articulate it. They can't name the reason why they think they are on the wrong path. They learn that they built this shiny dashboard or whatever, but the people—their users, their colleagues—don't use this dashboard, and then they learn something is wrong.” - Martin

“I usually like to call this technically right and effectively wrong solutions. So, you did all the pipelining and engineering and all that stuff is just fine, but it didn't produce a meaningful outcome for the person that it was supposed to satisfy with some kind of decision support.” - Brian

“A simple solution is becoming a trainee in other departments. So, ask, for example, the marketing department to let you spend a day, or a week, and help them do their work. And just look over their shoulder at what they are doing, and really try to understand what they are doing, and why they are doing it, and how they are doing it. And then, come up with solution proposals.” - Martin

“I tend to think of design as a team sport, and it's a lot about facilitating these different cross-departmental groups in arriving at a solution for a particular audience; a specific audience that needs a specific problem solved.” - Brian

“[One client said] we are very good at implementing the right solutions for the wrong problems. And I think this is what often happens in data science, or business intelligence, or whatever, also in IT departments: that they are too quick to start thinking about the solution before they understand the problem.” - Martin

“If people don't understand what you're doing or what your analytic solution is doing, they won't use it and there will be no acceptance.” - Martin

“One thing we practice a lot, [...] is visualizing those abstract things like data strategy, data product, and analytics. So, we work a lot with canvas tools because we learned that if you show people—and it doesn't matter if it's just on a sticky note on a canvas—then people start realizing it, they start thinking about it, and they start asking the right questions and discussing the right things.” - Martin



Brian: Welcome back to Experiencing Data. This is Brian O'Neill, and I'm excited to have my friend in Germany today, Martin Szugat, from Datentreiber. How's it going, Martin?

Martin: Thanks for inviting me, Brian, to your podcast.

Brian: Yeah, yeah, definitely. We're going to talk about data thinking, design thinking. I was interested in talking to you because I see you as a like-minded individual who feels the need to bring design and design thinking into the world of data product creation. I think we've both seen what happens when it's not there, and I wanted to bring you on the show to talk about some of your experiences in consulting, in terms of what happens. What's the before and after for your clients that go through training, and some of the consulting work you've done? Some of these before and after stories, and why do people come to you? And then, what happens on the back end? What kind of change happens, especially in the product development process? So, can you give people a little bit of background about what you do, and then we'll jump into some of that?

Martin: As you already said, so I'm from Datentreiber. I started Datentreiber five years ago. And Datentreiber is a data strategy consulting company. So, we help clients to create value from the data. And so, my job as a managing director is especially at the client site to help them—within workshops, or consulting projects—to identify the data assets and the right data use cases, or analytics use cases that are relevant for their company.

Brian: Got it. And is most of the work you guys do more on the training side or more on the consulting side?

Martin: I would say 50/50. So, when we started five years ago, it was just consulting, and then we started doing a lot of workshops. And during the workshops, I used a lot of flip charts and scribbled a business model canvas or something like that. And then, I started to scribble my own canvas tools. And then, I decided to make a first draft in PowerPoint. And then, I learned, okay, I'm not the artist guy, so I asked our web designer to design the canvas. And in the end, we came up with a lot of different canvas tools that helped us in our data strategy workshops.

And then, clients asked us, “Okay, I liked your method. And I like those canvas tools. Can you teach us how to use those canvas tools, and use your method for our projects, or even for our clients?” For example, if we were working with other consulting companies or agencies. And so we also started a training business, and now I would say it's 50/50. So, 50 percent of the time, and also 50 percent of the revenue, comes from training, and the other 50 percent comes from the consulting business.

Brian: So, what happens when someone decides they need to pick up the phone and say, “We need to call Martin. We need to get some of his mojo in here to fix the problems and challenges that we're having”? What are the signals that someone feels like something's wrong, or wants to get better at the thing that you do? What makes them pick up the phone and call you, and what do some of those problems look like?

Martin: Yeah. So the typical situations are, for example, someone is thinking about investing in a DMP—a Data Management Platform—or in a data lake, or they think about starting their own data lab, or data office, or whatever it’s called, or they just think, “Okay, we have so much data—how can we utilize or monetize it?” And then they contact us and ask, “Can you consult us on which is the best DMP, or CDP, or whatever?” And my first question is always, “What will you do with the CDP, DMP, or whatever? What will your data lake, your data lab, your data office—what will it do? How will it help your business?” And often the people say, “Okay, we don't know yet.” And so, “Okay, this is how I can help you. I can help you to identify the really critical use cases, and help you to prove whether those use cases make up, in the end, a business case, so that you will make money from investing time and money in data and analytics.” And I think this is a typical situation for all consulting projects.

Brian: When they call you, is it the feeling of, “Hey, we're headed down the wrong path here. We're really focused on the technology build-out, but we haven't really figured out what it's for yet, and we need help doing that”? Or are they surprised when you ask that question, and it changes the conversation? What's that like?

Martin: Yeah, often, they already have this feeling that they're on the wrong path, but they can't articulate it. They can't name the reason why they think they are on the wrong path. For example, one client called us and told us a story. So, they built up this data lake and integrated thousands of data sources into the data lake. And they spent months on the implementation and the integration. They hired about 20 data scientists or so. And one day, the CEO stepped in because they had a new investor, a hedge fund. And then, the CEO asked these data lake guys, “How do you make money?” And they said, “We don’t know, yet.” And he said, “Okay, you have three months to find out.” And this was the point when they called us and said, “Okay, can you help us to think about what we can do with all the data, how to create value?” And often, a lot of clients also have the experience of failing with data and analytics projects. For example, they learn that they built this shiny dashboard or whatever, but the people—their users, their colleagues—don't use this dashboard, and then they learn, “Okay, there's something wrong.” So, technically, it's working fine, the data is correct and sophisticated, but people stopped using it after, for example, a few weeks or months. So, you see, there is no user retention. And this is, of course, because the user has no value in using this dashboard tool, because it doesn't answer the questions that the users need answered for making better decisions and taking better actions.

Brian: Yes. I usually like to call this technically right and effectively wrong solutions. So, you did all the pipelining and engineering and all that stuff is just fine, but it didn't produce a meaningful outcome for the person that it was supposed to satisfy with some kind of decision support. So, talk to me a little bit about—you don't have to use the name of a company or a client, but I'm curious if there's a recent story that you could share—paint us the picture of a student, or one of your seminar workshops, or a client: what it was like beforehand, and then what it was like afterwards. And I'm really curious to hear about any type of transformation that people had, too, in their mentality. I mean, the different questions they ask, or the approach they take—a light goes on through the process of working with you. What I see a lot with design projects is that part of it is solving the immediate problem, but the secondary thing that sometimes happens is they start seeing your way of doing things, and a light has gone on and they can carry that forward to the next project or whatever. So, could you share one of these before/after stories?

Martin: Yeah, I can share two stories because one is more the common project and the other is more of the uncommon project. I start with the uncommon project because a few years ago, GfK—the market research company—asked us to help them to identify business cases for data-driven business because, of course, they knew that all this market research is in a transformation process, and they thought about how they can transform their business model. And I don't think a lot of companies are really thinking about how to transform their business model. Most companies—and that's why I said it was an uncommon project—are just thinking about how they can use data analytics to improve their existing business model, but not to change the complete business model. So, this was an uncommon project, but it was, interestingly, one of our first client projects, I think, four or five years ago.

And when we started, we built up a team of, I think, six or seven people, and then we started to think about what might be relevant use cases for their clients. And we identified nearly 100 different use cases, or ideas for use cases. And then, we used a structured approach, a design thinking approach, to narrow this list of 100 use cases to 50 use cases in the first step, then 25 use cases, then to 10 use cases, 5 use cases. And in the end, there were three use cases—or three collections of use cases that built three different applications, or client solutions—where all the team members said, and also the executives said, “Okay, we believe that this has economic potential, so it's viable for us and also for the clients. And we also know that the solution is desired. So, the users—our clients—want the solution.” So, we tested it in interviews, for example, and we also checked whether it was technically feasible.

And the interesting thing here was not only that, in the end, GfK launched different data-driven products which, now, are the new core of their business, but also what you could see was a change in the culture. So, they attracted new employees that had a different mindset than the people before. And the employees also started to use the canvas tools. They liked the canvas tools so much that they asked me, “Could we have our own branded canvas tools, so we can adapt those tools to our specific needs and use them with our clients?” And, yeah, they changed from this technology-driven, I think, data-driven perspective to a more user-driven perspective, by first asking what our user—what our client—wants, and not what is technically possible.

Brian: Yeah, I hear that a lot. And when I hear, “Oh, we want to be data-driven,” I'm always thinking, “I know where that comes from.” It means we want to use data to make decisions, but there are still, many times, humans in the loop that are actually the ones making the decisions, right? The data doesn't make the decisions for us. We still have to tap into what people need from this data to make the human decision because that's what drives things forward. Unless we're talking about fully automated solutions, which is a whole ‘nother conversation, but yeah, that's interesting to hear about.

Martin: Yeah, for automated solutions—so there are always, in the process—at the beginning, and at the end, there are always humans involved. So, no process is completely automated. There is a beginning, and there is an end, and at the beginning, for example, there is a client request or anything else, an event which is associated with a client, and at the end also. And also, for completely automated—let's call them artificial intelligence—solutions, you have to think first about the user and then about how to implement it.

Brian: Mm-hm. Yeah. Can you share a little bit—if we walk back to this GfK thing, we talked about how people were suffering from low engagement, which is a common thing I hear as well in my consulting: “We built this dashboard, we built this software product or this data product, and it's too complicated or they just don't care. Something's wrong, but we don't know why.” What did you do to validate that it was wrong, or to validate that the new solution was right? Literally, walk us through: how do you figure out that this design is better? How do you help them know that this new version is what they actually wanted or needed? What does that process look like?

Martin: For GfK, I cannot tell because I was not involved. So, I trained them. So, I—

Brian: Oh, okay. Got it.

Martin: —I showed them how to do it, and they did it on their own. But as I said, there is, for example, another project. This is more the common kind of project, which was for ProSieben [German] Digital. So they are running the websites for the TV stations of ProSieben Alliance, which is one of the bigger TV networks in Germany. And so, they asked us to help them improve their social media analytics. And what we did was, we first collected different use cases and we came up with one specific use case, which was: there is a social media manager who screens [unintelligible] different pages, and then looks at the Facebook posts of the last seven days, and then calculates the click rates, and then identifies those Facebook posts where he might put the ad budget. So, to boost the Facebook post.

Brian: What’s resonating, yeah.

Martin: And he was downloading a lot of Excel sheets, and doing a lot of manual stuff. And I said, “Okay, but how much time do you spend with the solution—or with this approach?” [unintelligible], “One hour per day.” And I said, “Okay, let's automate this, because you have to make a decision where to spend your marketing budget, and you have to make this decision every day. So, we can help you.” And the solution was—what we first did was just build a simple Excel prototype, which just collected the data from the Facebook pages, and then calculated the click rate, and sorted the Facebook posts by click rate, and then just showed him. And he said, “Wow, that's great. That saves me one hour per day.” I was like, “Yeah, great. We spent two days to implement a solution, and your effort will be reduced to five minutes per day.”
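As an aside, the core of that prototype is just a click-rate ranking. Here is a minimal sketch in Python; the field names and sample numbers are hypothetical, since the actual prototype was an Excel workbook fed from Facebook page data:

```python
# A sketch of the prototype Martin describes: compute each post's click rate
# and rank posts to pick candidates for ad budget ("boosting").
# All field names and sample data below are illustrative.

def rank_posts_by_click_rate(posts):
    """Return posts sorted by click rate (clicks / impressions), highest first."""
    ranked = []
    for post in posts:
        impressions = post["impressions"]
        # Guard against division by zero for posts with no impressions yet.
        rate = post["clicks"] / impressions if impressions else 0.0
        ranked.append({**post, "click_rate": rate})
    return sorted(ranked, key=lambda p: p["click_rate"], reverse=True)

posts = [
    {"id": "a", "clicks": 120, "impressions": 4000},
    {"id": "b", "clicks": 90, "impressions": 1500},
    {"id": "c", "clicks": 30, "impressions": 3000},
]

for p in rank_posts_by_click_rate(posts):
    print(p["id"], round(p["click_rate"], 3))
```

In practice the clicks and impressions would come from an API or export rather than hard-coded dictionaries; the point is that a ranking like this, refreshed daily, replaced an hour of manual spreadsheet work.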

And so, what this very simple solution created was acceptance for analytics, because all his colleagues saw it and said, “Okay, so those guys are not people that want to change how we work, and that want to control us, or make us controllable, but they really want to help us. They want to help us, as social media managers, improve.” And the next thing that we did, we were talking to one team, which was responsible for one TV format, and so we used a similar approach. We scribbled a social media dashboard, and then we prototyped it in Excel, and then we showed it to them, and we asked them for feedback.

What we did, for example, was spend just one day with the social media team and try to find out: what is the daily job? So, what are the decisions? What are the actions they have to take or make? And what are their objectives? And not the big objective of the company, maybe—often that's clear—but especially, what are the objectives of the individual people, and trying to help them, to give them a tool to improve their daily work and help them make better decisions. And in the end, the prototype was so successful, or the people were so convinced that this prototype helped them, that they went to the other teams and showed it to them, and said, “Look here at what we got.” And the other teams said, “Okay, we want this, too.” “Yeah, you can have this, but then you have to also enter the data. You have to tag all your URLs, the link posts.” They said, “Okay, well, because now you showed us what the result will be if we do it, we will invest the time in data management. So, we will invest the time in tagging all our Facebook posts.” And I think this is a critical thing: to understand the user and help the user, not only the business.

Brian: Did you invite them into the sketching sessions with you? Did they participate in creating the solution?

Martin: I think so. It was, I think, three or four years ago; I can't remember. So, we co-developed the dashboard with them. And what we often do is, for example, use a flip chart to wireframe the dashboards or the analytics solutions. Or now, because of Corona, we are using virtual whiteboards, or we are using, for example, tools like Balsamiq—which is a wireframing tool—and then we do a virtual session, and we develop or design, for example, the dashboard user interface together with the client using Balsamiq.

Brian: Yeah. My point there is that this is an organic process that happens with them. It doesn't sound like you did this totally in isolation, and then you just throw it over the wall to them and say, “Here it is.” Instead, you're servicing them literally. You're trying to work the tool into the way they do their work now, not introduce a thing you want them to use. It's empathy-driven.

It's driven by their needs and their work, and you have to connect with them to understand: what does it mean to be a social media manager? What does my day look like? Where is there friction in the work that I do? And it sounded like you identified a place where, “What if I could give you back five hours a week of your time to do something else?” And that is what's going to get someone's attention—not, “We have this new thing,” where they then have to figure out why it's valuable for them. It's, “Can I help you save five hours a week? Here's a way we could make your social media ad targeting process a little bit easier.” So, I like that your work was lo-fi. You presented options early to them. It sounds like they weren't perfect the first time. It sounded like you talked about how they started to realize, “Oh, if I tag the data, then I'll get even better results.” But you didn't require them to have that in place, it sounds like. You took an initial step and they saw the value, and then it encouraged this new behavior of tagging to make the next experience better. Is that correct?

Martin: Yeah. And I think one important thing is also: if people don't understand what you're doing or what your analytic solution is doing, they won't use it and there will be no acceptance. And another way to increase this user acceptance is to involve them in the development. For example, we have now a project where, in the first step, we designed, of course, wireframes for the dashboards, but also so-called value driver trees, where the people identified all the relevant metrics and the drivers for those metrics. And we also used an interactive approach, and co-developed those driver trees. Because in the end, they have to work with all the metrics in these value driver trees, and they have to understand them, and they also have to accept them. If they don't accept that those are the metrics that are used to measure their work, they won't use the whole data warehouse and KPI dashboard solution at all. And I think this is also the reason why so many analytics solutions fail: because they are not accepted by the users—the users were not involved in the development, they don't understand the solution, and they also had no influence on the development of the solution.
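For readers unfamiliar with value driver trees: the idea is to decompose a top-level metric into the lower-level drivers a team can actually influence. A toy illustration follows; the decomposition revenue = sessions × conversion rate × average order value is a common textbook example, not taken from Martin's client project:

```python
# Toy value driver tree: the root metric (revenue) decomposed into three
# drivers. Changing any one driver propagates up to the root, which is what
# makes the tree useful for discussing which metric each team should own.
# The metrics and numbers are illustrative only.

def revenue(sessions, conversion_rate, avg_order_value):
    """Root of the tree: revenue = sessions * conversion rate * average order value."""
    return sessions * conversion_rate * avg_order_value

baseline = revenue(sessions=100_000, conversion_rate=0.02, avg_order_value=45.0)
# Improving a single driver (conversion rate from 2% to 2.5%) lifts the root metric.
improved = revenue(sessions=100_000, conversion_rate=0.025, avg_order_value=45.0)

print(f"baseline revenue: {baseline:,.0f}")
print(f"lift from better conversion: {improved - baseline:,.0f}")
```

A real driver tree is usually several levels deep (sessions split by channel, and so on), but the mechanic is the same: each team sees exactly which driver their work moves.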

Brian: Do you have some advice for people listening right now? What you just said sounds good, and I believe it, right? I understand it because I've seen the change myself. But practically, what would you advise someone to go out and do? What should I do if I'm not really doing it that way? My team works in the basement, or whatever, and we're a data science team. We're here to do data science, and if they don't want to use it, well, that's not really our problem. They told us what they needed, and we gave it to them, and we have other projects to work on. What would be some steps they could take now to do it the new way, where the customer is part of the creation process? What would you tell them to go do today? What's the first step?

Martin: A simple solution is becoming a trainee in other departments. So, ask, for example, the marketing department to let you spend a day, or a week, and help them do their work. And just look over their shoulder at what they are doing, and really try to understand what they are doing, and why they are doing it, and how they are doing it. And then, come up with, maybe, solution proposals. But one client told me one day, “Martin, we are very good at implementing the right solutions for the wrong problems.” And I think this is what often happens in data science, or business intelligence, or whatever, also in IT departments: that they are too quick to start thinking about the solution before they understand the problem.

Brian: I one hundred percent agree with that. It's a common thing. We don't often spend enough time to really understand the problem, and doing so saves a lot of time down the road. But it does require, as you said—it sounds like you go out and do what I would call ride-along interviews or contextual interviews with your customers, or it sounds like the workshop’s a little bit more of an interactive experience. But the point is you're taking your technical hat off, and you're going into their world, and you're going to observe their world, and ask questions about: what's it like to be you, to do your job, to be successful in your work, to provide value the way you provide value?

And then, you figure out how do I complement that? If we're an internal analytics or data science group, how do we service them? If you can make them successful, they're going to totally dig on what you're doing, and they're going to want more, and it's going to bring the value of what you do up because you're supporting them, you're not imposing tools. So, I love that you're going out and doing that type of on-the-ground work to go figure out, what's your pain? What are your challenges? What's it like to be a social media manager? Tell me about your day? How much time do you spend with these tools? What do you hate about them? That's great stuff.

Announcer: Here's what people have been saying about the designing human-centered data products seminar.

Participant: A pretty quick and accessible snapshot of how to think about user design for data solutions products. I mean, you can go super deep on UX design; you can talk about data solutions, but how do you combine the two in a very digestible way? I think it's a very, very solid framework to get started.

Brian: Interested in joining me and a group of fellow students in the next cohort? Visit designingforanalytics.com/theseminar for details and to learn how to get notified when pre-registration begins. I hope to see you there, and now back to the episode.

Brian: Any other tips you might give to people listening in terms of, how do I scale that? Or, if I'm a leader in this space, what else should I—how do I get those people to even participate? Is it hard to get the social media team to agree to go spend time with my data science team or whatever? Or maybe I feel like I don't know what to say to them. Can you share any other tips you might suggest?

Martin: So, one thing we practice a lot, and that's also why we started designing those canvas tools, is visualizing those abstract things like data strategy, data products, analytics, whatsoever. So, we work a lot with those canvas tools because we learned that if you show people something—and it doesn't matter if it's just on a sticky note on a canvas—then people start realizing it, they start thinking about it, and they start asking the right questions and discussing the right things. And so, our approach is always to start with what we call a Data Thinking Workshop. So, it's design thinking applied to data science.

And then, we have this approach where we say, “Okay, let's start with the business understanding,” where we use, for example, the Business Model Canvas. And then, we go into the user understanding, where we use a canvas that's called the Analytics Use Case Canvas. And then, in the last phase, we go into the data understanding, where we explore the data landscape; for that, there's also a so-called Data Landscape Canvas. And that really helps, especially because we always start with a mixed team.

So, we have data scientists in the team, or in the workshop, we have the marketing manager, the head of marketing, the sales manager. And often they use different terms, a different language, and in the past, they just used PowerPoints and Excel sheets, but they didn't work together. And this visualization, these visual, collaborative tools, helped them to find a common path, and especially a common objective of where they want to go. And this is, I think, critical for the first step of designing a data strategy, but also then to design specific data products.

Brian: Yeah. Where do people—in your experience—get it wrong? Or where are they going to hit roadblocks? Is it personnel, personality types, different departments that tend to not want to participate in this? I'm wondering if there are any suggestions you can give, like, “Be ready for this. This is probably going to happen, and here's how you get around it.” Where do you sometimes get resistance to this? Maybe the marketing team doesn't want to participate, and the data science team was your client, and they're the ones saying, “We really want to be more useful to the business, but the business doesn't want to come to the table,” or vice versa. Can you talk about some of the challenges someone that's trying this out might encounter the first time, and how they might work around that?

Martin: Yeah, the critical thing is that you need to mix teams. So, you need people, for example, from marketing or sales who have this business perspective, you need people from data science who have this experience in data analytics, and then you also need people, for example, from IT, who have this perspective on the data sources, and the technology, and so on. Sometimes the hard part is to get all those people in one room, but from my experience, it's solvable. So, often it's easier for us to do it because we are not the data science department. We are not the marketing department. We are something like Switzerland. So, we are neutral.

Brian: Yeah. [laughs].

Martin: Yeah. And I think this often helps also, because what we learn, of course, is that sometimes there are walls between those departments, and you need someone from outside who, especially, doesn't know about those walls. So, our not knowing is the advantage here, because we just look at it in a very neutral way. And we don't care about the business politics because we don't know about it. And I think this helps.

Brian: Yeah, I agree. That neutral perspective can be really, really valuable to clients. Because sometimes they don't even know that stuff exists. One party may not know that these other two parties aren't communicating well, or they're not working together, or using the same language, as you talked about. And so, when you can come in and help facilitate that, I think that's great.

And I think—I don't know if you agree, Martin—I tend to think of design as a team sport, and it's a lot about facilitating these different cross-departmental groups in arriving at a solution for a particular audience; a specific audience that needs a specific problem solved. So, a lot of this is getting in the habit of learning how to do the facilitation part, because so much of this is human stuff. The data may be there, the technology may be possible, it's all there, but we have to connect it back to the human beings on the ground, and that skillset is not Python. It's not doing our coding. It's a different thing.

So, busting through those walls is really important. You mentioned Switzerland, which is kind of funny. I was going to ask you—because you work in Europe, and I'm guessing you probably have clients from outside of Europe—but I was curious: do you see any cultural differences in your clients, in how receptive, or not receptive, they are to some of these different ideas or different personality types? Just because you're probably exposed to a lot more different cultures than we are here. I'm here in the States, and so my clients tend to be mostly Americans. I'm just curious, do you adjust?

Martin: [unintelligible] funny that you're asking, because—maybe some Germans will hate me after saying this, but there are cultural differences. So, we have a lot of clients from Germany, but we also have some clients from Switzerland, the Netherlands, and even from Russia and the United States. And what I've said a lot of times in the past is that especially the Germans are afraid of putting a sticky note on the wall. So, they are very afraid of using those canvas tools, and they need us to moderate it. And for example, we have this one client in Russia, Nikita, who is the Chief Data Science Officer from Siberian Airlines.

Brian: Oh yeah, I know him. Yeah.

Martin: And he visited our training for two days, and he went back to Siberian Airlines, and the day after, they started using those canvas tools. Part was in English and part was in German, but they just put sticky notes in the Russian language on it and they had no problem, and at least 30 or 40 people learned to use this tool without our training, without asking us—they just used it. Also, there was one guy from [Peru], who wrote me a message that said, “Martin, I saw that you translated 5 of those 15 canvases. When can I use the other 10?” And I said, “Oh, are you using them?” “Yeah, for three years now.” And I think, yeah, that is a cultural difference. And of course, I have this German perspective.

And unfortunately, the adoption rate I see in Germany, especially with design thinking, or this collaborative approach, or also this visual approach—I think, for some people, they have to become more familiar with it, so they don't feel, let's call it, [00:34:54 unintelligible]. I cannot describe this, because I don't have a problem with it. I learned this design thinking approach, I think, 15 years ago, and have used it a lot of times, so it is just part of my DNA, I would say. But yeah, I don't know why this is so special in Germany.

Brian: Yeah, I can't speak to the German perspective, but one thing I talk about in my own training—in the first module, actually—is: “Welcome to the space of gray. It's no longer black and white. There's so much subjective stuff here. Ironically, it's not all data-driven and analytical. A lot of this is going to be subjective; you're going to get other people's perspectives when you start getting into the head of a customer, and it may not be what you think they should be doing, or you think they're doing it wrong, or whatever.” And you have to learn to look at it from someone else's perspective. And it's not all going to be crystal clear and black and white all the time. You have to experiment, you have to try stuff, get it in front of them, get feedback on it. It's a very organic process.

So, I'm generalizing, but I feel like people who have very technical backgrounds sometimes struggle with operating in this kind of gray space, where everything isn't going to be scientific and proven out—and that's not the point. The point is, like in your case: can we reduce the amount of time that social media manager spends on ad spend? And the answer was yes. And how did you do it? Oh, we went through this really mushy process. And the reality is, there's probably not one solution.

Like the solution that Martin's team came up with—or that that team came up with—that may not be the only right solution, and that doesn't matter either. All that matters is: did we help reduce the amount of time this person spends on this work and increase the value of that work? And so, when you have that kind of mentality—that it's not about being perfect or being right, it's about the change—then maybe something can be let go. I don't know if that's what the Germans are reacting to, or what, but, you know—

Martin: Yeah. I think, yeah, Germans are well known for their engineering skills. And, of course, what you learn in school and university is doing things right. And now you have this veracity. I mean, I studied bioinformatics, so I learned that our whole organism's DNA is not deterministic, it's stochastic. You're just talking about statistical effects and correlations. When people talk about the DNA code, it's not a code. It's not like if/then/else; it's more like, if this gene mutates, the probability of this effect will be increased by 5 percent or so. And I think this change in mindset—to live in this [unintelligible] world with [unintelligible] phrased—I mean, yeah, it's hard for a lot of people.

Brian: Yeah. Well, man, Martin, this has been super fun to chat with you about this. Is there any closing thought, or closing advice you'd give to leaders in the data science, analytics, technical product management space? What would you like to share with them, just in closing?

Martin: Of course, use our design thinking tools, and please give us feedback. That would be great, because we are also co-developing our design thinking tools, and we give them away for free as open source. So, as I said, it's also a co-development project. And of course, the next thing is, you have to start thinking about, or start exploring, this whole idea or approach of design thinking. Read some books about it, [visit] trainings, but maybe also just become a trainee in a company or a department that is already applying design thinking. For example, a lot of larger companies have those innovation labs. Talk with the innovation lab guys to learn how to adapt this approach to data science and data engineering.

Brian: Yes, I agree. And if there are designers listening to the podcast, I'm often a fan of: go out and seek out the data groups that are out there. There's probably a mandate for AI in your company, and they may need your help. So—whether you have a design title or you have a design background—go out and seek out people working in this data space. We need better data products out there, and most companies have the technical experience. We just need to make sure that people can use these tools and products at the end of the day. So, I'm totally with you there. Where can people get these tools, though? Is it at the Datentreiber website? Is that the best place to find you?

Martin: Yeah. So, on the Datentreiber website, there is a short introduction, and there are also links to a platform called Creatlr. There you can find all the canvas tools in digital versions—you can use them on Creatlr itself, or you can download a PDF and print it yourself. Or you can buy them on Stattys, pre-printed, so you don't have to go to a copy shop. And there are also some tutorials in English.

Brian: Got it. And Datentreiber, is that also available in English?

Martin: No, the website is in German. But from there you will see the links to Creatlr. Or just go to creatlr.com and search for Datentreiber.

Brian: Well, again, Martin, thanks for coming on here. It's been great to talk to you. I'll definitely link up your LinkedIn and the website for Datentreiber, and thanks for talking to us about design for data products.

Martin: Thank you so much, and have a nice day.

Brian: You too. Cheers.

