140 – Why Data Visualization Alone Doesn’t Fix UI/UX Design Problems in Analytical Data Products with T from Data Rocks NZ

Experiencing Data with Brian O'Neill (Designing for Analytics)

This week on Experiencing Data, I chat with a new kindred spirit! Recently, I connected with Thabata Romanowski—better known as "T from Data Rocks NZ"—to discuss her experience applying UX design principles to modern analytical data products and dashboards. T walks us through her time as a data analyst in the mining sector and shares how those experiences laid the foundation for her transition to data visualization. Now, she specializes in transforming complex, industry-specific data sets into intuitive, user-friendly visual representations, and addresses the challenges faced by the analytics teams she supports through her design business. T and I tackle common misconceptions about design in the analytics field, discuss how to communicate and educate non-designers on applying UX design principles to their dashboard and application design work, and address the problem with "pretty charts." We also explore some of the core ideas in T's Design Manifesto, including principles like being purposeful, context-sensitive, collaborative, and humanistic—all aimed at increasing user adoption and business value through improved UX.

Highlights / Skip to:

  • I welcome T from Data Rocks NZ onto the show. (00:00)
  • T's transition from mining to leading an information design and data visualization consultancy. (01:43)
  • T discusses the critical role of clear communication in data design solutions. (03:39)
  • We address the misconceptions around the role of design in data analytics. (06:54)
  • T explains the importance of journey mapping in understanding users' needs. (15:25)
  • We discuss the challenges of accurately capturing end-user needs. (19:00)
  • T and I discuss the importance of talking directly to end-users when developing data products. (25:56)
  • T shares her 'I like, I wish, I wonder' method for eliciting genuine user feedback. (33:03)
  • T discusses her Data Design Manifesto for creating purposeful, context-aware, collaborative, and human-centered design principles in data. (36:37)
  • We wrap up the conversation and share ways to connect with T. (40:49)

Quotes from Today’s Episode

  • "It's not so much that people…don't know what design is, it's more that they understand it differently from what it can actually do..." - T from Data Rocks NZ (06:59)
  • "I think [misconception about design in technology] is rooted mainly in the fact that data has been very tied to IT teams, to technology teams, and they’re not always up to what design actually does.” - T from Data Rocks NZ (07:42)
  • “If you strip design of function, it becomes art. So, it’s not art… it’s about being functional and being useful in helping people.” - T from Data Rocks NZ (09:06)
  • "It’s not that people don’t know, really, that the word design exists, or that design applies to analytics and whatnot; it’s more that they have this misunderstanding that it’s about making things look a certain way, when in fact... It’s about function. It’s about helping people do stuff better." - T from Data Rocks NZ (09:19)
  • “Journey Mapping means that you have to talk to people… Data is an inherently human thing. It is something that we create ourselves. So, it’s biased from the start. You can’t fully remove the human from the data.” - T from Data Rocks NZ (15:36)
  • “The biggest part of your data product success…happens outside of your technology and outside of your actual analysis. It’s defining who your audience is, what the context of this audience is, and to which purpose do they need that product.” - T from Data Rocks NZ (19:08)
  • “[In UX research], a tight, empowered product team needs regular exposure to end customers; there’s nothing that can replace that." - Brian O'Neill (25:58)
  • “You have two sides [end-users and data team] that are frustrated with the same thing. The side who asked wasn’t really sure what to ask. And then the data team gets frustrated because the users don’t know what they want… Nobody really understood what the problem is. There’s a lot of assumptions happening there. And this is one of the hardest things to let go.” - T from Data Rocks NZ (29:38)
  • “No piece of data product exists in isolation, so understanding what people do with it… is really important.” - T from Data Rocks NZ (38:51)



Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today I have T. from Data Rocks New Zealand, NZ, N-Zed. I guess it depends on where you learned your English, right? We say NZ. But welcome T. What’s going on?

Thabata: Yeah, yeah, here we say N-Zed. But you can also call New Zealand, Aotearoa, which is its Maori name.

Brian: Excellent. Excellent. And I’m curious, is it like, “Data rocks, man. It totally rocks,” or is it like, data stones? Like what’s the name [laugh]?

Thabata: It’s a little play on both things. So, there’s a little bit of a story there. The first one is that when I moved to New Zealand, the thing I saw the most were mountains. I now live at the foot of a mountain. It’s Mount Taranaki. It’s a volcano. So, there’s mountains everywhere. You see that my little logo is a mountain. So, I kind of got from that the ‘Rocks’ part of it. I used to work in the mining industry before as well, so there’s a little bit of coming from that story. But I think that rock as in, “Yes, this rocks, man.” So yeah, it’s a little bit of a wordplay there [laugh].

Brian: Excellent, excellent. I really like that. So, is it fair to label you as a data designer or a data visualization consultant? How do you self-identify? You’re in that world, but I want to know how you call yourself?

Thabata: Yes. I identify these days as an information designer first and foremost, but I am also kind of a data product designer of sorts. I started developing dashboards, as everybody does. I started with business intelligence and being a data analyst, and then I kind of dived more and more and more and more into the design world. These days, I’m more of a data visualization consultant, if you can call it that. I personally call myself a one woman band because I am a consultancy of one [laugh]. So yeah, that’s who I am.

Brian: Near and dear to my heart. We’re kindred spirits from thousands of miles apart [laugh].

Thabata: [laugh] Yeah, exactly.

Brian: Yes. So, I want to kind of jump into—we haven’t done… it’s been a little while since I’ve done an episode with a designer, and really getting into, you know, datavis and UI design and all this, and, frankly, I mean, it’s like, it’s… where I spend a lot of my time when I’m doing consulting work, even—a lot of times the problems are bigger than the surface ink, as I’m sure we’re going to probably go into this when we think about inking the screens and all of that. And that very much is an important part. It’s just, it’s the visual part that especially non-designers can see. It’s the place where we start to feel the pain and the problem surface themselves because they’re very visual.

And I’m kind of curious, since you’ve been doing this kind of work, are we still solving the same problems that we were five years ago? Has anything changed? What’s it like, the course of your work right now with your clients? Are you still solving the same kinds of challenges, and they’re like evergreen problems that don’t go away, or is it evolving? Like, what’s the landscape of design knowledge like, particularly for, I’m assuming, analytics teams, or are most of your clients, maybe, data science teams? Talk to me a little bit about that.

Thabata: You’ll find all sorts of levels. So, working with data, and especially data design, is sort of a journey. So, there’s different stages of maturity in which you can find a client or a company. So, I find that it’s not exactly an evolution or a linear thing, but it’s more that different clients are at different stages of their journey. I do find that a lot of the problems that I work with are still the very same problems, maybe they get different names, maybe they get a different face, but at the core, it’s always still the same types of things that we’re trying to solve.

What I try to solve as a core problem most of the time is how do we communicate data in a way that people can do more with it, so that people can actually do something with it. At a very primary stage, this is what everybody wants to do. Either they are at the very beginning of their journey, and they have heaps of data, and they have no idea how to use it or what to do with it, and then visualization can come as a point of contact for people who are not familiar with the inner workings of technology and all of that, to help them use that data. Or if they’re more advanced into the weeds of how data works, and they already have a process in place, and they use it for decision-making, how can they make it better? How can they make it so it’s more accessible, that it’s easier to use, that they can actually embed it as part of their day-to-day?

So, the struggle is basically that, and it’s basically the same everywhere you go. It changes a little bit in sense of which technologies people use, or how much emphasis they give on technology as opposed to process, or how much they are willing to spend, or to explore, or time. Those types of things change, but the core problem is still the same, and that’s how to communicate, see, and do more with data, really. Yeah.

Brian: So, if we think about the, let’s call them the more seasoned clients that are perhaps ones that are a little bit further along, which I’m guessing most of the people listening to this show are familiar with the concept of design, even if they haven’t worked with designers—like, capital D designers—before, they at least know what that discipline is, hopefully, by now. If you haven’t, there’s 140 episodes before this that you can listen to [laugh]—

Thabata: [laugh].

Brian: Before that, but—

Thabata: I’m sure they’ll catch up very quickly. Yeah.

Brian: But you know, joking aside, do you feel like when you’re working with a more advanced team that they are design knowledgeable in the sense of, we actually know we have a design. We either have a UI or UX or a both problem. We know that that domain of expertise exists, and we want to get help with it. So, they come in as a knowl—like, you call it a knowledgeable buyer, I guess, you’d say from a sales perspective, right? An informed buyer versus someone who’s like, “I have all these problems. I didn’t even know there was a thing called design that exists as a discipline, let alone that it applies to analytics, and let alone that you particularly exist to help me.”

What’s it like, if you’re dealing with a more advanced client? Are they fairly knowledgeable about the work that you’re going to do and the gaps that they have, and they just don’t know how to cross the gap? Or is it more like they don’t even—they’re still kind of early on in even understanding what their problem is?

Thabata: I think there’s a little bit of a misconception about design. It’s not so much that people—even the more seasoned clients—don’t know what design is, it’s more that they understand it differently from what it can actually do. So, for example, companies that have been using data for a long time, and they are more seasoned, and they know they have a problem, they usually come to me saying, “Okay, I have an analytics problem to solve: I have these requirements, I need to answer these questions, I need to have these types of visuals, or I need to have this type of technology.” And it’s always very tied to the technology side of things, so it’s a tool, it’s a tech stack, or it’s something like that, that they’re looking for. And when I mention the word design, I feel a little bit of a pushback.

So, I think this is rooted mainly in the fact that data has been very tied to IT teams, to technology teams, and they’re not always up on what design actually does. And then you often associate design with making things pretty [laugh]. “So, you make pretty charts?” This is the first thing I hear whenever I tell somebody that I do information design, or that I do data visualization design. And the answer is no, design is about solving problems first.

So, it’s that idea that form follows function, or that function and form have to coexist in an equilibrium, in a balance. But they don’t see it that way. Most of the time, whenever a client comes to me, I initially try to understand what their problem is, and very often it is either a user interface or user experience design problem—very often user experience design problems—but I don’t always refer to it that way, “Oh, yeah, you have a design problem,” because the first thing they have in mind is, “Oh, no, you’re just going to make everything pretty, but what about all of these other things that I have to do? What about the problems that I actually have to solve? What about the decision-making process? What about what do I do with this data, if I have to use it in a meeting,” and blah, blah, blah.

And they get stuck on the visual part, but design is much more than that. Design is also about function. If you strip design of function, it becomes art [laugh]. So, it’s not art, it’s not about being pretty, it’s about being functional and being useful in helping people along that journey. But it’s not that people don’t know, really, that the word design exists, or that design applies to analytics and whatnot; it’s more that they have this misunderstanding that it’s about making things look a certain way, when in fact, whenever I come—and often clients get surprised that how many questions I ask, and how many documents I write, and how many things I actually tell them about. It’s about function. It’s about helping people do stuff better.

Brian: Yeah.

Thabata: Yeah.

Brian: I can relate to all these things. And I think it’s normal. It’s not that design is special because I think it’s the same thing when, like, you know, you have an executive, that’s like, “We need an AI strategy.” And it’s like, they don’t even know what they want to do, and they’re talking to the data science team. And inside, the data scientist is kind of rolling their eyes, but they’re also [realizing 00:10:04], all right, this is a bid for a conversation, and we’re starting out at this relative knowledge.

And I think this has chan—I mean, I think a lot of teams are much more informed, you know, executive teams are more informed now about what that capability is. But like, do you need an analyst? Do you need a data scientist? It’s kind of like talking to you and I about design, and so we have to walk people along the journey and keep focusing on the problem space, and maybe we change the language we use. And I kind of felt like—I sort of admitted this on one of the DPLC community conversations we were having, we were talking about communication, and I think we were talking about how these data product teams deal with things like data governance, and privacy, and there’s all these facets of stuff that falls under data product management, but a lot of that language shouldn’t usually be used with stakeholders. It’s just too much detail about domain stuff that’s not relevant to whether or not they’re going to get value out of something.

So, I’m kind of with you, which is, I change my language depending on who I’m talking to, to focus on the problem, like you said. Like, focus on the change that needs to happen. And design needs to be functional. It’s like, let’s just stop—we’ll just stop calling it design. We have a problem here, which is, you know, Susan in finance—as we always use on the show—Susan in finance needs to make a decision about how much money to allocate to each department for bonuses, and she doesn’t know how to divvy up the pie. So, let’s just focus on how do we help Susan and stop using the D word [laugh], you know?

Thabata: Yeah, exactly. Yeah.

Brian: And you take the team through the process, and the next thing you know, they’re doing design—they don’t know it—and then hopefully the conversion starts to happen. I don’t know. Is that how you, kind of, see it? And then slowly over time, they start to see what you’re doing, and oh, like, this makes a lot of sense. Like, is that kind of how it is for you?

Thabata: Yeah.

Brian: Yeah.

Thabata: Yeah, no, absolutely like that. Yeah, and I like how you refer to it as the D word.

Brian: Yeah [laugh].

Thabata: [crosstalk 00:11:53] D word. No but, it’s exactly like that. Sometimes you have to bring them along a little bit, so you have to nudge people into the right direction with whatever they have as the point of reference, right? It’s like, when you’re teaching someone a new concept, and they have no idea what you’re talking about, you try to borrow from analogies that they understand. So, if someone from finance only understands finance and accounting and that’s her world, that’s where she lives, that’s what she does, I’m not there to tell her that, you know, she’s been using the term design wrong, or I’m not going to get there preaching. You know, and like, “Oh, you don’t know what design is.” It’s kind of annoying. I’m not going to do that [laugh].

So, it’s more getting, okay, you have this process, you have this problem, you have to do this report. How many times a week do you do it? Why do you do it? Do you even need to do it? Does it have to be in this format? Would a different format help you better? If so, how can we achieve that? Is there anything missing?

Is there anything that you love about the current report that you’re afraid of missing? And this is something that a lot of people don’t ask. You’re so concerned with the change and with changing things and making things happen that you don’t ask what’s currently working that people might be afraid of losing if everything changes. So, this is a good starting point, like, “Okay, tell me a little bit about what you actually like about what you do and how data is currently helping you.” And Susan from finance is probably going to have lots of opinions about that.

And then she’s going to start saying, “Well, from the things I don’t like, there’s this, there’s this, and that,” or, “This is too hard,” or, “This is taking too long,” or, “This has too many steps,” or, “Every week, I spend two or three days wrangling a report just because it comes in a dashboard form, but I actually need to use it as a PowerPoint presentation for my boss.” And those are all pain points you can map on a person’s journey, and then go back to the drawing board and then start to design things. So okay, now I understand what your problem is, I understand what your context is, I understand what your purpose is, I can start putting design into it. I can actually start going to the drawing board and saying, “These are possible solutions to that problem,” and then start iterating and working together with them. But never really going back to them and saying, “Oh, yeah. This is what design is now.” [laugh].

Brian: Right, right, right?

Thabata: Yeah, yeah.

Brian: You mentioned the word journey, and you sort of outlined some of the steps that we might do in this step called Journey Mapping, which I think a lot of people listening to this show have probably had some exposure to before. One of the things that goes into a Journey Map is data [laugh]. And where did the data come from? And then you talked about Susan in finance. So, talk to me about how we collect the data that goes into a Journey Map, and is it hard to get access to that data? Which is, you got to talk to Susan.

Do you have issues with getting access to—or do your cust—your clients, do they have challenges getting access to the Susans of the world such that they can do this work? Or is it more that they have the access, and there’s resistance to doing it because either they don’t like doing this kind of work, they don’t know what to do with the information, Susan says she doesn’t have time, another gatekeeper says we don’t have time for this, just do the requirements. You’re shaking your head [laugh]. Is it all of the above?

Thabata: Yeah, a little bit of column A, a little bit of column B. No, yeah, a little bit of everything. Again, it depends where on the data journey people are. If you’re talking about a big company, usually you have lots of siloed teams, and when I’m brought along, this is one of the first struggles in Journey Mapping. Journey Mapping means that you have to talk to people.

Again, there’s this idea—I think because most data work comes from the technology side, and when you’re coming from a traditional IT team, say ten years ago, it was made of tickets and forms that you fill out, and you put in your requirement forms, and you never really see a person, and never really talk to anybody. And that got sort of translated into the data world. But data is an inherently human thing, and I always say that. It is a human artifact, it is something that we create ourselves. We collect information about things that we deem important. So, it’s biased from the go, from the get-go.

There’s always a person involved in that process. You can’t fully remove the human from the data; it’s always there. So, talking to people is essential: understanding what people do with this data, what they need to solve, what kinds of problems they have. You have to get out there and talk to people. Whenever I say that to an analytics team, they go, “Ah, no, but I am a developer. I am a scientist, or I am an engineer, I don’t talk to people.” [laugh].

So, there’s that element of resistance, but there’s also lots of gatekeeping. If it’s a large company, and they have lots of little data silos and there’s not a centralized function for data, there’s usually going to be those little power dynamics where, oh, no, this is my data. This is my team’s data. We are the ones that hold the power of making reports. Nobody else does it. So, if we go out there, and we open up too much, and we talk to people too much, they’re going to want too many things. And we can’t deliver everything, so we don’t. So, there’s a little bit of that paralysis.

And there’s always the type of company where you have a project manager or a product manager, and they feel like they are the only ones who should talk to the Susans, and then they bring that information to you. Which is fine to a certain extent because it makes people feel like it’s more efficient, but it isn’t, not really. The people on the technology side who are actually building the stuff need to have contact with how people use things. Because there’s lots of nuance. There’s lots of little things that are not always captured by one person’s view. And so, what I would say is that you have to go and talk to Susan, you have to find a way to get to Susan. It can be, sort of, a difficult battle. It can be something that’s quite challenging to do, but you have to go and find a way to talk to Susan, really [laugh].

Brian: Yeah. People on this show have heard—I mean, you sound like me on many episodes of this show. Can you give listeners a story or an anecdote about, well, what is it that we might learn from Susan that’s not going to come across if a gatekeeper or a business analyst—like, a requirements gatherer—just passes it along to the team secondhand? What might we learn from Susan that we just wouldn’t know we didn’t even know until we went and actually directly talked to her? Do you have any, like, anecdotes you can share, or a story?

Thabata: Yeah, sure, sure. So, whenever I’m asked this type of question, there’s three things that are very important to define before you start doing anything with data. I always say that the biggest part of your data product success—be it a dashboard, a report, whatever it is—happens outside of your technology and outside of your actual analysis. It’s defining who your audience is, what the context of this audience is, and to which purpose do they need that product. So context, audience, purpose. Those are the three pillars of everything that I do.

If you use a third party, it’s just somebody else. So, somebody else goes and talks to Susan and grabs all the requirements for you, and then you just see what they saw. It’s that Chinese whispers type of thing. So, lots of things get lost in translation. They may be missing a lot of the nuance on the context of where and how and when Susan will need to use her data, or in what context does she need to make a decision, or in what context—what triggers her to go into data in the first place?

And a lot of the purpose may be lost as well because the purpose is not just a big statement. It’s not just saying, “I need this data because my boss asked me to.” [laugh].

Brian: Right.

Thabata: It’s, “I need this data for a variety of reasons.” It can be to make a decision, to understand something, to explore something, it can be all of the above. But we tend to narrow it down to just the main stuff, especially when telling this to somebody else. So, I think the main game is seeing those tiny little things and those tiny little triggers that make all the difference in making something that will seamlessly integrate into Susan’s workflow instead of kind of forcing Susan to adapt to a tool that wasn’t really meant for her.

So, as an anedote—anecdote—I can never say that word right [laugh].

Brian: [laugh] [crosstalk 00:20:52].

Thabata: Little tale. For example, if I am developing something, I am a developer, I work with a tool—say Tableau, Power BI, whatever—I make dashboards. I am in my little world and I receive a requirement. And the requirement says, “I need to update the finance report weekly so that the”—this is a classic user story, right—“Susan from finance needs to update a report weekly, so that she can update her team on status and if they’re tracking towards their financial goals.” Mmm. User story, right?

So, I see that, and I’m going to think well, what kinds of charts does she need? What kinds of tables does she need? What type of data does she need? I know which KPIs she may need to do that. I know that she might need a target, but so what? So, what’s missing there is the ‘so what?’ So, I go and talk to Susan.

“Okay, I have this user story here that says you need to present this to your boss every week, or to your team every week to track targets. Why are you tracking these targets?” “Oh, we’re tracking these targets because we need to understand where we are at in the year, so then we have to pull some triggers or pull some levers and make some decisions to either increase sales or decrease costs somewhere, or we need to understand what’s behind that metric.” Which means that whenever she gets that metric that that one requirement told me she needed, she will probably explore more. So, she will look at the main metric, look at the target, but now she’s talking about pulling levers and talking about how she needs to change something for cost or something for pricing or something for sales, just so decisions are made so that they can hit that target.

So, it’s not just seeing that KPI against a target, which is what I would have gotten from that user story, right? She’s going to look at that, and she’s going to say, “Okay, we’re not on target. Why not?” And then she’s going to want to try to figure out why. After the why, she’s going to go, “Okay, but how can I move this? How can I make this change? How can I move this metric?” And then she will need to understand what the metric is made of, so she’ll probably want to do the dreaded thing of everybody who makes a dashboard: download it to Excel so that she can see what’s in the background.

Brian: Nooo, don’t click the down arrow [laugh].

Thabata: Yes. And then she’s going to look at the downloaded file in Excel, and the dashboard developer’s going to be super disappointed and heartbroken because they spent a week developing this for her.

Brian: That’s right.

Thabata: And she’s not using it. But you know, it’s because they missed all of that background of what her journey was with that piece of data. So, if you’re Journey Mapping, you sit beside Susan, and you tell her, “Okay, this is Monday, your week is just starting. What do you do?” And then she’s going to say, “Okay, I go into this dashboard, I look at this metric, and this target. And then I get confused because I don’t know what’s in the background of it, so I extract it to Excel.” [laugh]. “And then with the Excel spreadsheets, I explore what the metric is made of. I can see that there’s a sales component in it. I need to understand more about sales, so I pick up the phone. I call the sales manager, and the sales manager tells me what his numbers are about.”

And there’s this whole process that goes beyond just the first screen she got in touch with, that could be either other screens, or it could be more reports, or it could be just this one report, but curated for her use case with all this information already there, just so she wouldn’t spend a whole day hunting for it.

Brian: Yeah.

Thabata: And this is the type of thing that unless you’re a very skilled business analyst or very skilled project manager, you’re not going to do. Most people are just going to go to Susan and say, “What do you need?” And she’s going to say, “I need a financial report against targets.” Because she also probably—she’s so used to having all those steps in her day-to-day, she’s probably not even going to name them as something that could change. It’s just, “Yeah, they’re going to give me a report. It’s not going to be enough. I’m going to go extract it to Excel, look what’s behind it, call the people who are responsible for that part, and then understand it more.”

And she just doesn’t feel like that’s something that’s worth talking about because her first contact is that one metric with that one target. So, I think there’s a lot of value in sitting with people and seeing how they work. And this is what user journey mapping answers, really.

Brian: Yeah.

Thabata: Yeah, you asked about collecting data, and this is exactly what it is. So, what it looks like is, I often call people into a workshop, and I tell them, “Well, show me what you do with the report.” And then they will tell me what they do. So, sometimes it goes beyond that one report, and they’re going to end up opening other [team 00:25:34] reports. And then suddenly, I get back to the team with all this information, going like, “Okay, Susan looks into ten different spreadsheets before she can make this one decision.” And they will be surprised. Like, the data analyst team is like, “I don’t even know where the other nine spreadsheets come from.” [laugh]. And yeah, so that’s where the real journey begins.

Brian: Yeah. I think there’s always value in keeping—a tight, empowered product team should have regular exposure to end customers; I think there’s nothing that replaces that. I do think you can have a highly-trained user experience researcher—and I’m talking about someone that’s actually trained in this. It’s like, oh, anybody can ask questions. Like, yep, anybody can ask questions. That’s right, but knowing what to look for, how to manage a conversation with a user—you’re also thinking about emotions and feelings, and the attitudes and stuff that you never see written down in a requirements document.

Like, well, here’s what Susan needs, but did you know Susan is pissed off? Susan hates doing this stuff, but no one wrote that down in the requirements because, like, that’s nice. That’s her opinion. But that doesn’t tell the developers what to build, so we’ll leave that out. This is the difference, I think, between experience design research and requirements gathering; they’re very different to me.

And our job is not to give people what they ask for, but we also don’t want to tell them, “We’re not going to give you what you asked for.” I’m going to take all your requests in because there may be nuggets of really useful stuff in there, but they’re just ideas. They’re not commandments; they’re not, “We’re definitely going to go give you that Excel download button.” That signal tells me that the design isn’t doing a good job for you, otherwise you probably wouldn’t be asking for that, so we have to interpret that stuff usefully.

And there are people that can do this—and I’m kind of talking to our listeners—there are people out there that specialize in this type of work. They are called user experience researchers, and they can be invaluable in your team. I like everybody doing research, but if you don’t know how to do it, and you don’t know what you don’t know about it, that’s a good type of skill set to bring into your team if you don’t already have it. Off my soapbox, [laugh] for a second.

Are there a handful of regularly occurring problems that your clients come in the door with? So, maybe they’ve made that decision to work with you, they’re ready to take a step into the design world, or maybe they’re experienced, and they’re back for more. Are there things that an analytics leader—I’m assuming it’s a manager or somebody, and they have maybe a team of analysts or data scientists that work with you—correct me if that’s wrong—but are there things that these people need to unlearn in order to be able to soak up the knowledge that you want them to take? And I’m asking this because I want to help listeners understand, like, okay, my gut feeling may be X, but I need to learn to let go of Y in order to really take that journey here. And maybe if they hear that this is what other people have done, it’s easier for them to take that step. So, any thoughts there? Are there repeating problems that you see in their approaches, or things they need to let go of, or changes they need to make?

Thabata: I’m not so sure if there are repeating problems, I think, in the way you’re asking, but there are certainly patterns of behavior that I see. So, one of them is taking what people say at face value. So, that’s one thing. Teams that want to start doing this user mapping better, or understanding better what people need, would have to, first and foremost, let go of the idea that whatever people tell you first is what they actually need, because it is not [laugh]. As you said, there’s a lot of feelings involved. One of the good things about bringing in somebody like me, who’s an external person, is that I don’t bring with me all the company politics.

So, a lot of the time, whenever I go into a new team and I start talking to the end-users, I hear a lot of, not exactly complaints, but a lot of frustration, in the sense that people ask for something, and then they get delivered literally exactly what they asked for. And then you have two sides that are frustrated with the same thing. The side who asked wasn’t really sure what to ask. They thought that they couldn’t ask for more. They thought that if they asked for a lot of things, it would take a long time to develop. They didn’t trust the data. They didn’t trust the team. They didn’t trust anything in the process, so they kind of asked for the first thing that they thought would actually be delivered if they asked.

And then the data team gets frustrated because the users don’t know what they want, they are always changing their minds. They ask for Thing A, and then they ask for A plus B, and then they ask for A plus B plus C plus D, and then it never ends. And there’s always one more thing. So—

Brian: “Can I customize it?”

Thabata: Yeah, exactly. And there’s always something else, and there’s always something else, and why can’t I have all of these things? And the user on the other side is saying, “No, but I didn’t ask for all I need. I am asking little by little because I am not too sure if they can do it, and if they can, how long it’s going to take.”

Brian: Yeah.

Thabata: So, both sides are kind of to blame in that, because with the in-between communication missing, basically nobody really understood what the problem is [laugh]. You’re just talking about requirements: what I think I can ask, what I think I should ask. And the other side is wondering, well, this is what I think they want. And there’s a lot of assumptions happening there. And this is one of the hardest things to let go.

Stop the assumption-making, and if you don’t fully understand something, or if you’re not really sure what problem you’re trying to solve with a requirement, keep asking until you do. And I know it sounds annoying, but just ask more and more and more. Of course, as you say, an experienced researcher will know how to nudge people into that, into seeing what the actual problem is beyond what they think is feasible, but the analytics team can go there and keep asking questions, and develop alongside people.

So, this is the second thing that I think is a bad habit: communication only happens in strict, distinct moments. So, there’s this moment where, as an analyst, I am going to ask for requirements, and then I go and ask for requirements. And then there’s this one moment where I have a chance to ask for everything I want, and then the person goes and asks for everything they want. But there are never in-between check-ins. So, each one goes their own way.

So, the analytics team goes and develops something, and then once that’s ready, or mostly ready, they show, or they present to the end-user. And then the end user goes, like, “Yeah, but you know, in this meantime, actually, all these other things popped up, and my context changed, and the purpose is no longer that one, and something else happened. And you took three months to give me this. Now, it’s gone. I don’t need it anymore.” [laugh].

And this is terrible, so keep in constant communication. “Okay, we’re going in this direction. Is this still the direction you want to go? If we make this solution, is it going to actually solve the problem you have?” And a lot of the time, in these smaller check-ins in between, the users are going to refine those requirements. They’re going to go, “Ah, so this is what you got out of what I said. Well, actually, it’s not that. It’s something else.” Or, “Yeah, you’re in the right direction, but could you add this, this, and that?” And it’s way less frustrating if you do it in small doses along the way than if you wait three months to get it all at once.

So, there’s an exercise that I do whenever I run workshops about this. If a client comes to me and says, “Okay, talk to my users and gather requirements from them,” what I usually do is book time with people. I sit with them once, twice, three times, however many times it’s necessary. And one of the first exercises I do is ask them to tell me, in terms of using data in their job, what is something that they like, what is something that they wish was better, and what is something that they wonder could be done or not? Because those are the three things. This is a very common exercise in user experience; it’s called ‘I like, I wish, I wonder.’ Tell me what you like about what you do, about the things you use, the tools you have, the reports you use every day, how they help you, and then you can understand what’s working. Then, what you wish could be better, which is a way of framing what you don’t like but in a positive light, and also a way of gathering requirements a little bit: what you wish you had. So, from the things you like, you like this, but what’s missing? And then these are kind of my wish list.

And I always tell them, what’s your moonshot? Don’t hold back, don’t worry about whether it can be delivered. Maybe it can, maybe it can’t, maybe it’s the second phase, third phase, whatever, but I want you to tell me: if there were no limits, what do you want your data to do? How do you want to use it? So, that enters into the ‘I wish.’

And then, in the end, there’s the ‘I wonder.’ So, ‘I wonder’ goes more into the feeling part, and more into the things that people don’t know that they don’t know. So, I wonder if I could mash up this data set with this other data set, because that would give me a different overview of something else, of a particular process that I don’t know. I wonder if I could sit in a different meeting to see what they are doing with their data and deciding there. I wonder if I could see what you guys are up to in relation to this development or that development, if I could get more updates about that. I wonder many things. I wonder what you mean when you say AI [laugh]? Is it coming for my job or something?

And all those things pop up in the ‘I wonder’ part. It is an easy enough exercise that almost anybody can do, and it opens up the channel for people to say things they like, things that are not working, and things that they’re not sure about: is this even something that you would do? And then you go back to the analytics team, and you sort what was mentioned there. You take heaps and heaps of notes during these conversations. And then you sort it out, going, like, “Okay, lots of people that I talked to and did this exercise with mentioned that they really liked this one report. What is this one report doing right? Why is this one such a hit?”

“And lots of people wish this other thing was better, and these are the points that they wish were better here. Okay, what can we do to improve these, then?” “And lots of people have questions about these other things that are marginally related, but could we incorporate them somehow into our data product so that they don’t have so many of those questions? Or could we help them through those problems?” And it’s a magical exercise, really. It’s really a good starting point because you get to know your audience in a deeper way than just asking them what they need. Yeah, so.

Brian: I love the examples that you gave there. The way we ask those questions, in UX language, there’s an intention behind the framing for those. They’re open-ended questions intended to get people to open up and give you data. Effectively, they’re giving you great qualitative data about where the actual needs are, so I think that’s great. It would be impossible not to mention your manifesto. You have a manifesto.

Thabata: I do [laugh].

Brian: There are 12 commandments. Are they commandments?

Thabata: Uh—

Brian: What are they [laugh]?

Thabata: Values. I call them principles.

Brian: Principles. I’m not going to make you state all 12 because we’ll put a link. Do you have a quick vanity URL, or is it pretty long? We’ll put it in the [notes 00:37:05] anyway, but—

Thabata: If you just go to my Behance, that’s @DataRocksNZ, and it’s on the front page of my website as well, if you just want to check it out there.

Brian: Got it.

Thabata: Yeah, yeah.

Brian: Excellent. Tell me, yeah, I wanted to know: are there two or three principles from your data design manifesto that you can share here that you think might be most relevant to this audience? People that are trying to build data products the product-y way, trying to do this work—you know, machine learning, and AI, and analytics work—using product and user-centered design principles. Are there a couple that stand out for you as the most important ones?

Thabata: Yeah. Well, they are all a big Venn diagram, right? They all kind of come together, but I think it’s those three things, so I would name the ones that are about audience, purpose, and context. So, the first one, I think, is ‘good data design is purposeful,’ which is saying that every decision you make when you’re creating a visualization or a report needs to have a good purpose, needs to have a reason behind it. Just understand why you’re doing this. And “because my boss asked” is not really a purpose. It’s not just the reasoning behind it; it’s more about what you’re trying to solve in terms of a problem. I think that’s why it’s the first one, at the top.

Another one that I think is maybe a good one is the context one. I think it’s principle number five, if I’m not mistaken. So, ‘good data design is context-sensitive,’ which means that data exists within a context. It was collected because of an event or reason, within a given set of guidelines or a given set of assumptions, and you should never lose sight of those. And it’s going to be used in a context as well. So, where are people going to use this, and why? No data product exists in isolation, so understanding what people do with it, and where and when and how, is really important.

I think, what else? I would say maybe the one about collaboration. It’s ‘good data design is collaborative.’ So, you always have to be collaborating, not just with your peers in your data analytics team, but with your users, and with everybody who’s making decisions out of it down the line. Your data will flow, and that collaboration is what unlocks the real value of the data. Somebody who knows how to analyze it finds insights, and they tell them to somebody else. That somebody else is going to make a decision, that decision affects somebody else, so there’s people all over it.

Which reminds me of another one: ‘good data design is humanistic.’ Because data is about people. It’s not about the tools. Also, none of my principles mention any tools. Good data design is not about the tool you’re using, because tools change, tools come and go. The tech stack can be one today, another one tomorrow. In my years of career, I’ve used at least, like, five different tools to achieve what I do. And the level of quality, what defines good or not, is the people involved in it and not the tool you’re using. So yeah, I would mention those, probably. Yeah.

Brian: I think those are great. The collaboration one, I think, is one of the most important ones. Another framing that I’ve used on the show that’s not mine, and I wish I could credit the woman who I think said this, but it’s this idea of designing with users and not designing for them. So, thinking of them as another colleague, another employee, another domain, almost like they’re another department. Of course engineering needs to know about this project; of course the users are on the team with us making it. We’re making it with them, not for them. So, I agree that’s definitely a notable one worth mentioning. So, this is great stuff. If people wanted to check out your work or get a copy of the design manifesto, or just kind of learn more from your perspective, where do they go? Like, where are you on the interwebs?

Thabata: The first place you can find everything is my website. So, that’s datarocks.co.nz. There, you can find the shop, you can subscribe to the newsletter. I run a monthly newsletter called Design Matters that’s more focused on the user experience side of things, but I often do things about user interface and data visualization pieces as well. It’s a bit long form, so if you like to read a lot [laugh]—

Brian: Okay.

Thabata: —that’s where you go. And I am on social media as well. I post often on LinkedIn, mainly. I am also on BlueSky, Mastodon. I quit Twitter, so I am not there. My handle is there, but I rarely check it, so just check all the other ones [laugh].

Brian: Sounds good. I think a lot of this crowd is on LinkedIn. And we’re connected there, too, [so yes, please look up 00:41:47]. T., it’s been great to talk with you and I look forward to staying in touch. And I’m always happy to know about great designers that are working in the data product space, and trying to help this community move things forward with building stuff that matters.

Thabata: Yeah, thanks for having me. I hope that I was able to share something that was interesting, helpful to somebody out there. Yeah, just ping me on social media or through my website. I’m always happy to get in touch with anybody for any reason whatsoever. Yeah.

Brian: Cool. Sounds good. Well, take care.
