063 – Beyond Compliance: Designing Data Products With Data Privacy As a UX Benefit with The Data Diva (Debbie Reynolds)

Experiencing Data with Brian T. O'Neill

Debbie Reynolds is known as “The Data Diva” — and for good reason.

In addition to being the founder, CEO, and chief data privacy officer of her own successful consulting firm, Debbie was named to the Global Top 20 CyberRisk Communicators by The European Risk Policy Institute in 2020. She has also written books, including The GDPR Challenge: Privacy, Technology, and Compliance in an Age of Accelerating Change, as well as articles for other publications.

If you are building data products, especially customer-facing software, you’ll want to tune in to this episode. Debbie and I had an awesome discussion about data privacy through the lens of user experience instead of the typical angle we are all used to: legal compliance. While collecting user data can enable better user experiences, we can also break a customer’s trust if we don’t request access properly.

In our chat, we covered:

  • 'Humans are using your product': What it means to be a 'data steward' when building software. (0:27)
  • 'Privacy by design': The importance for software creators to think about privacy throughout the entire product creation process. (4:32)
  • The different laws (and lack thereof) regarding data privacy — and the importance of thinking about a product's potential harm during the design process. (6:58)
  • The importance of having 'diversity at all levels' when building data products. (16:41)
  • The role of transparency in data collection. (19:41)
  • Fostering a positive and collaborative relationship between a product or service’s designers, product owners, and legal compliance experts. (24:55)
  • The future of data monetization and how it relates to privacy. (29:18)

Quotes from Today’s Episode

When it comes to your product, humans are using it. Regardless of whether the users are internal or external — what I tell people is to put themselves in the shoes of someone who’s using this and think about what you would want to have done with your information or with your rights. Putting it in that context, I think, helps people think and get out of their head about it. Obviously there’s a lot of skill and a lot of experience that it takes to build these products and think about them in technical ways. But I also try to tell people that when you’re dealing with data and you’re building products, you’re a data steward. The data belongs to someone else, and you’re holding it for them, or you’re allowing them to either have access to it or leverage it in some way. So, think about yourself and what you would think you would want done with your information. - Debbie (3:28)

Privacy by design is looking at the fundamental levels of how people are creating things, and having them think about privacy as they’re doing that creation. When that happens, then privacy is not a difficult thing at the end. Privacy really isn’t something you could tack on at the end of something; it’s something that becomes harder if it’s not baked in. So, being able to think about those things throughout the process makes it easier. We’re seeing situations now where consumers are starting to vote with their feet — if they feel like a tool or a process isn’t respecting their privacy rights, they want to be able to choose other things. So, I think that’s just the way of the world. … It may be a situation where you’re going to lose customers or market share if you’re not thinking about the rights of individuals. - Debbie (5:20)

I think diversity at all levels is important when it comes to data privacy, such as diversity in skill sets, points of view, and regional differences. … I think people in the EU — because privacy is a fundamental human right — feel about it differently than we do here in the US where our privacy rights don’t really kick in unless it’s a transaction. … The parallel I say is that people in Europe feel about privacy like we feel about freedom of speech here — it’s just very deeply ingrained in the way that they do things. And a lot of the time, when we’re building products, we don’t want to be collecting data or doing something in ways that would harm the way people feel about your product. So, you definitely have to be respectful of those different kinds of regimes and the way they handle data. … I’ll give you an example of bias that someone showed me, which was really interesting. There was a soap dispenser that was created where you put your hand underneath and then the soap comes out. It’s supposed to be a motion detection thing. And this particular one would not work on people of color. I guess whatever sensor they created, it didn’t have that color in the spectrum of what they thought would be used for detection or whatever. And so those are problems that happen a lot if you don’t have diverse people looking at these products. Because you — as a person that is creating products — you really want the most people possible to be able to use your products. I think there is an imperative on the economic side to make sure these products can work for everyone. - Debbie (17:31)

Transparency is the wave of the future, I think, because so many privacy laws have it. Almost any privacy law you think of has transparency in it, some way, shape, or form. So, if you’re not trying to be transparent with the people that you’re dealing with, or potential customers, you’re going to end up in trouble. - Debbie (24:35)

In my experience, while I worked with lawyers in the digital product design space — and it was heaviest when I worked at a financial institution — I watched how the legal and risk department basically crippled stuff constantly. And I say “cripple” because the feeling that I got was there’s a line between adhering to the law and then also—some of this is a gray area, like disclosure. Or, if we show this chart that has this information, is that construed as advice? I understand there’s a lot of legal regulation there. My feeling was, there’s got to be a better way for compliance departments and lawyers that genuinely want to do the right thing in their work to understand how to work with product design, digital design teams, especially ones using data in interesting ways. How do you work with compliance and legal when we’re designing digital products that use data so that it’s a team effort, and it’s not just like, “I’m going to cover every last edge because that’s what I’m here to do is to stop anything that could potentially get us sued.” There is a cost to that. There’s an innovation cost to that. It’s easier, though, to look at the lawyer and say, “Well, I guess they know the law better, so they’re always going to win that argument.” I think there’s a potential risk there. - Brian (25:01)

Trust is so important. A lot of times in our space, we think about it with machine learning, and AI, and trusting the model predictions and all this kind of stuff, but trust is a brand attribute as well and it’s part of the reason I think design is important because the designers tend to be the most empathetic and user-centered of the bunch. That’s what we’re often there to do is to keep that part in check because we can do almost anything these days with the tech and the data, and some of it’s like, “Should we do this?” And if we do do it, how do we do it so we’re on brand, and the trust is built, and all these other factors go into that user experience. - Brian (34:21)

Transcript

Brian: Welcome back everyone to Experiencing Data. This is Brian T. O’Neill, and I’ve got the Data Diva on the line with me today. It’s Debbie Reynolds. What’s up? And—

Debbie: [laugh].

Brian: —where did you get that name? And tell me about the Data Diva.

Debbie: Well, first of all, thank you for having me on your show. This is going to be so much fun. We had a great time, actually, in our pre-call, so I’m happy to talk today. But Data Diva actually was a nickname that someone had given me at a networking event I went to. So, we were doing a workshop on branding, and we had to break off in groups and give elevator speeches.

So, one of the women that was facilitating is a reporter at the Wall Street Journal, and when I told her my elevator speech, she said, “Oh, you’re the Data Diva.” And we just laughed so hard about it. And I was like, “That’s so clever.” And one of the other people in my group said, “Oh, my God, you should totally use that.” I was bashful about using it before because I thought, “Well, maybe people wouldn’t take me seriously.” But I thought it sums up things really interestingly. It’s a lot less boring than my elevator speech. So, I think it sparks conversation; it’s fun to use.

Brian: Yeah, yeah. I have you positioned in my head, associated with one word, and that’s privacy.

Debbie: That’s right.

Brian: So, is that the right word for you? And I think it is, but you tell me. Is that the right—

Debbie: That's the right word, and that makes me so happy because that’s what I want people to think. So, the fact that you think that, that means that I’m doing something right.

Brian: Yeah, yeah. No, you’re doing something right, for sure. So, just the context for this episode is data privacy with—it gets thrown around a lot. The majority of the times that I hear it thrown around, it’s thrown around in a legal context and it’s about compliance. I don’t want to talk about that too much today.

I actually want to talk about how do we design better products where—and you mentioned this in the pre-call—privacy by design. So, is there a way that privacy can become an asset of value to customers, something that we can leverage for greater value for our customers and for the work that we do? So, the first question I wanted to ask you was, when teams—and when I say teams, I’m thinking about digital product, data science, or analytics teams—when we approach data privacy in our work when we’re building out software, should we be thinking about it differently if our users are internal employees and, you know, vendors and it’s like B2B context, versus customer-facing stuff, or that distinction doesn’t matter? Because some people work on internal tooling—if you think about BI and reporting, and model development for pricing algorithms, or fraud detection, or whatever—versus something that directly is seen and interfaced with the customer, is it different, or that distinction doesn’t matter?

Debbie: I think that distinction should not matter because, when you think about your products, humans are using them. So, regardless of whether they’re internal or external—you know, what I tell people is put yourself in the shoes of someone who’s using this and think about what you would want to have done with your information or with your rights. So, putting it in that context, I think, helps people think about it, you know, get out of their head about it. Because a lot of us, we have… obviously there’s a lot of skill and a lot of experience that it takes to, kind of, build these products and think about them in technical ways. But then we’re also—I try to tell people, when you’re dealing with data and you’re building products, you’re basically, kind of, a data steward.

So, the data belongs to someone else, and you’re holding it for them, or you’re allowing them to either have access to it or leverage it in some way. So, think about yourself and what you would think you would want done with your information.

Brian: Yeah. Yeah. So, tell me about this privacy by design. What does this mean? And am I right in summarizing that this is about making privacy almost an asset and not a tax on the work that we do?

Debbie: [laugh]. Right. The reason why people think of it as a tax is because they think about privacy after the fact, in a reactionary way. It’s almost like I’m weaving a quilt, so… it becomes harder to do after the fact. So, the point about privacy by design, and the reason why I really like that concept, is because it’s very different from the way laws are passed in general, where laws are typically a reaction to some harm that happened already.

Privacy by design is like, look, let’s go down to, kind of, the fundamental levels of how people are creating things, and have them think about privacy as they’re doing that creation. So, then it’s not a difficult thing at the end. Because privacy really isn’t something you could tack on at the end of something; it’s something that becomes harder if it’s not baked in. So, being able to think about those things throughout the process makes it easier. And then we’re seeing situations now where consumers are starting to vote with their feet, where they feel like a tool or a process isn’t respecting their privacy rights, they want to be able to choose other things.

So, I think that’s just the way of the world, so it’s not just… oh this is a pain, you know, it may be a situation where you’re going to lose customers, you’re going to lose market share, if you’re not thinking about the rights of individuals. So, I like to work proactively with companies on this. I work a lot with companies on emerging technologies and who are developing tools or software, and I’m advising them so that they’re, again, baking privacy in so that when they have a finished product, or they’re going to market, or something, they don’t have those barriers to adoption, and they don’t have the stopping points or the pain points, that other technology companies or other data science folks reach when they’re trying to reach either internal or external customers.

Brian: Got it. Got it. And I like the way you think of that. And by the way, I just wanted to mention that when I read your bio—you know, you sent me the bio, and we always do a transcript of these recordings—in our pre-call, I was very impressed. All I could imagine was the amount of reading that you must have to do when we think about the internet has no geographic boundaries.

And it’s like, okay, where’s the product made? What server is it hosted on? Where does the data travel? And it’s like, now you’re in Europe, and they’ve got GDPR, and now you’re in the States—oh, but you’re in California. I was like, it must be mind-boggling for companies to navigate this space and try to—I mean, is it?

Is it hard to innovate here because especially in a global company, if you’re working with software in particular, it just sounds, like, daunting to me if you want to be careful, but also you want to—where the lines are sometimes gray about this, especially in the marketing space, these lines fe—maybe they’re not great. They feel to me like some of them are like, “Okay, that’s a little sk—why do I see an ad for this thing?” You know, these third-party cookies and all this kind of stuff. Tell me, is this stopping innovation? Or is there just a careful way to do it, and it’s possible to actually create better value here if you just understand the law? And—talk to me about this.

Debbie: Yeah. Wow. Well, this big question isn’t it?

Brian: Yeah. Yeah, a terribly phrased one, too, as a host, but—[laugh].

Debbie: Well, first of all, if you don’t like reading, privacy is not the career for you. There’s tons of reading, tons of research, I now just research constantly, so it’s—I can’t stop. I’m always doing research for something because I know I’m going to need it, so I just flag it somewhere where I read it. But I think there are two different things that are happening. So, I’ll give you an example.

I’m working on a privacy framework with a company called XRSI. They’re a not-for-profit company that creates frameworks for virtual reality, augmented reality, mixed reality products. So, that’s in any type of space. So currently, we’re working on medical or education. But obviously, when people think about virtual reality, they think of gaming and stuff like that.

But I’m working on the compliance part, and what I’m finding is I have to work in parallel tracks. So, one is, here are the laws that apply to certain things that we do. And then the other track is, here’s all the stuff that we have concerns about where there are no laws. [laugh]. So, we’re doing things where there are no regulations, so how do we navigate that path?

So, unfortunately, like I said, the harm comes before the law, a lot of times. So, you have a law just because something bad has happened already. But we’re trying to also prevent things, bad things, from happening. So, being able to walk those two parallel paths is really important. And so, understanding and talking through, you know, what is the potential harm?

We know what the benefit is because you’re able to sell your product, you’re able to get funding and stuff like that. But where do we think the harm can be for people? And not just look at it from a legal perspective; look at it from a human perspective. Would you want your child to wear virtual reality glasses and forget that they have to stop at a corner before they walk into the street? So, think about the human things, I think. When you think about it that way, it becomes easier when you’re dealing with products.

I also have a client working on things like identity passports and identity cards that have health information and stuff like that. So, you know, again, from a human point of view, do you want your kid to have their face in a facial recognition database because they have to get their temperature checked? Probably no, right? [laugh]. So, I’m working a lot with designers of these types of things.

And we’re talking through these things. And this is making it easier for them to actually sell their products because then I’m asking them a question that their buyer or their customer would ask. And we sort of go back and retool. So, it makes the process much easier at the end.

Brian: Even what you just said about, you know, in the virtual space, I don’t even know if the law is clear. Like, if my avatar goes to a virtual store and buys some shoes, can you cookie me and send me—like, was that me or was that my avatar? And is that personal data? I mean, that must be a fascinating legal space to sit in, especially because the law probably hasn’t even thought about defining some of that space, right? The terms and, like, I don’t know—

Debbie: Yeah, it’s—

Brian: —it’s just, we don’t have to go way into that, but it sounds interesting.

Debbie: It’s totally wild. So, I’ll just give a quick example: ambient noise. So, if you’re wearing glasses, they’re obviously taking measurements of your face and your eyes and what you’re doing. From a medical perspective, the way that your hand moves may tell them something about your medical condition or something, but there really aren’t any laws about what happens to the ambient sounds that are picked up by these devices. So, that’s just a wild-west type of thing. So, now you have to think: are you recording it? Are you storing it? If you’re storing it, for how long, and why? All types of stuff like that.

Brian: Sure, sure. Yeah. And then if you have music playing in the background, now you have a copyright infringement. [laugh]. I actually know something about this from my other career as a musician. There’s really interesting stuff that kicks in when there’s music or copyrighted material in the background, and all this kind of stuff. But we don’t have to get way into that.

Debbie: Yes.

Brian: But another thing I wanted to ask you here was, you mentioned designers, and so I’m going to define a term. We have this thing in the design world, we call them design patterns. A non-designer might think of this as a style guide, but a design pattern is usually more of an interaction. So, an example of one would be, like, if a user tries to log in and it fails, then after three tries, it moves them to the ‘forgot password’ screen and pre-fills their email automatically. That’s a design pattern.
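
A minimal sketch of that login pattern, with hypothetical names and no particular framework assumed:

```python
# After three failed attempts, route the user to a pre-filled
# "forgot password" screen instead of another error message.
MAX_ATTEMPTS = 3

def check_credentials(email: str, password: str) -> bool:
    """Stand-in for a real credential check."""
    return password == "correct-horse-battery-staple"

def handle_login(email: str, password: str, failed_attempts: int) -> dict:
    if check_credentials(email, password):
        return {"screen": "dashboard"}
    failed_attempts += 1
    if failed_attempts >= MAX_ATTEMPTS:
        # The pattern: move the user to recovery and carry the email
        # forward so they don't have to retype it.
        return {"screen": "forgot_password", "prefill_email": email}
    return {"screen": "login", "attempts_left": MAX_ATTEMPTS - failed_attempts}
```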

I’m curious if you know of any, or have to work with designers on, patterns for collecting data. Because sometimes when we do machine learning and AI, we need to create the data sets. “We need you to drive your car around so we can learn something about your neighborhood and then suggest a routing to you that makes sense based on the way you normally go,” for example. Are there patterns around how we ask for data and how much of it? And especially if the way it’s going to be used is fairly complicated and someone’s not going to read a 25-page privacy policy, are there patterns, or just ideas or ways we should think about how we collect data, especially if it’s going to be used in a model, and exposing that to a customer?

Debbie: Yeah. I think that’s a good question. I actually had this question with a company I worked with in Europe. And they do things like identity passports and different identity systems. And one issue we were grappling with was being able to capture someone’s biometric information and data retention.

So, for them, because they deal with customers all over the world, they have to build their product so it can possibly be used all over the world. So, one thing that we have put into a pattern for this particular product is that the minimum data retention of this information will be 24 hours because of a law in India, and the maximum will be three years because of a law in Illinois. [laugh]. So, the pattern would be basically we would sell the product to a customer, and then they really should be the ones to say, “Well, you can keep your data as long as you want within this time frame.” But when the three-year mark comes up, we’re going to keep reminding you, and then we’re going to delete the data at three years if you haven’t done it already. So, that’s an example.
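
Under those constraints, the retention rule reduces to a clamped window. Here is a sketch, assuming illustrative bounds and hypothetical names (nothing here is legal advice):

```python
from datetime import datetime, timedelta, timezone

# Illustrative bounds from the example above: a 24-hour floor (one
# jurisdiction's minimum) and a three-year ceiling (another's maximum).
MIN_RETENTION = timedelta(hours=24)
MAX_RETENTION = timedelta(days=3 * 365)

def clamp_retention(requested: timedelta) -> timedelta:
    """Clamp a customer's chosen retention period to the lawful window."""
    return max(MIN_RETENTION, min(requested, MAX_RETENTION))

def should_delete(collected_at: datetime, requested: timedelta) -> bool:
    """True once a record has outlived the clamped retention window."""
    age = datetime.now(timezone.utc) - collected_at
    return age >= clamp_retention(requested)
```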

Brian: Mm-hm. But that’s after they’ve gotten it, though, so what about if the software that we’re creating requires us to collect at the front end? We can’t give you the service we want; it won’t have the machine enablement, the automation, the prediction, whatever we need, because you haven’t granted that. I mean, if I was a designer, I’d be thinking about this progressively: what’s the minimum stuff I can ask for that’s the least nuisance? I’d want to make sure the value is communicated clearly before I present that.

I just wondered if you had other patterns you had seen about the ways to do this that aren’t just, like, tiny seven-point text, whole paragraph, and a tiny checkbox, or, like, it defaults to on and people just bypass it. And it’s like, “Okay, great. Now, we can film your home, and your audio, and we know how many breaths per second you’re taking, and”—you know what I mean? It’s like, what are the ways to do this, that maybe enhance the product, and it’s not just a, “Thank God we got you past that screen so we got what we wanted,” the compliance part?

Debbie: Right. Even though there’s no law against giving people a wall of text and 80 pages of privacy stuff, I highly recommend against it. I work with companies very closely on making sure that, in their customer journey, they get those snippets of information so it’s not so much stuff at once. But as they’re going through, it’s like a quest, right? Like Lord of the Rings: you go to one thing, and they tell you something, and then you go [laugh] to the next thing. And so you try to pepper it in there some way, or you design the systems in ways where you have places where people can park that information and get the consents that they need, or give the information in a way that isn’t so overwhelming.
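
That “snippets along the journey” idea amounts to just-in-time consent: ask for each permission at the step where it is first needed. A rough sketch, with hypothetical scope names and copy:

```python
# Just-in-time consent: prompt only when a feature is first used,
# instead of one wall of text up front. (Scopes and copy are made up.)
CONSENT_COPY = {
    "location": "We use your location to suggest nearby stores. OK?",
    "camera": "We need the camera to scan your card. OK?",
}

granted: set[str] = set()

def require_consent(scope: str) -> bool:
    if scope in granted:
        return True
    # In a real product this would be a short in-context dialog.
    answer = input(CONSENT_COPY[scope] + " [y/n] ")
    if answer.strip().lower() == "y":
        granted.add(scope)
        return True
    return False

# Usage: the map feature asks for location only when it is opened.
# if require_consent("location"):
#     show_nearby_stores()
```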

Brian: You mentioned earlier about asking ourselves how we would feel about this. I kind of wonder if this model doesn’t work because we don’t have enough diversity on the teams to ask the right questions. Because we can’t even think about what we don’t know about if we don’t have diverse enough people asking these questions. Is this relevant at all, do you think? Do you see this in your own thing where it’s like, you go into a client, and it’s just like, are you guys all like, [laugh], “Hello? Like, McFly.”

Like, no one’s asking this obvious question, which maybe is obvious to you, but they’re not asking it because they’re all living in the same world. We’re all clones, we all walk the same talk, we have to use the same technology, we all have an Echo in our home or whatever it is. Do you see that happening? And can we really rely on what our teams think about how our data should be used? I just—

Debbie: I think you have a—that’s an excellent question, and I think diversity at all levels is important, right? You know, skill set, point of view, regional, you know, there are a lot of regional differences about how people think about this. You know—

Brian: Tell me some.

Debbie: Well—

Brian: I’m fascinated by that.

Debbie: —like, I think people in the EU—because privacy is a fundamental human right—feel about it differently than we do here in the US, where our privacy rights don’t really kick in unless it’s a transaction, because it’s all about trade and stuff like that, where people in Europe, they feel very passionate about it. The parallel I say is that people in Europe feel about privacy like we feel about freedom of speech here. It’s just very deeply ingrained in the way that they do things. And a lot of times, when we’re building products, you don’t want to be, kind of, flippant, [laugh] or, we don’t want to be collecting data or doing something in ways that would harm the way people feel about your product. So, you definitely have to be respectful of those different kinds of regimes and the way they handle data.

I’ll give you an example of bias that someone showed me, which was really interesting. So, there was a soap dispenser that was created where you put your hand underneath and then the soap comes out. It’s supposed to be a motion-detection thing. And this particular one, it would not work on people of color. So, I guess whatever sensor they created, it didn’t have, like, that color in the spectrum of what they thought would be used for detection or whatever.

And so those are problems that happen a lot if you don’t have diverse people looking at these products. Because you, as a person that is creating products, you really want the most people possible to be able to use your products. So, I think there is an imperative on the economic side to make sure these products can work for everyone.

Brian: That’s a great example. And my next question here, which I’m going to tee up, I’m curious if you see this as black and white, or gray. So, let me give you an example from my own work and my own business. One of my favorite thought leaders, just in general, is Seth Godin, and in the framing of marketing, he talks about privacy as—your customers and prospects, they don’t want to be surprised.

Debbie: Right.

Brian: And I think that framing is fascinating because it says there’s a line between being surprised—like, “I didn’t expect that,” to not surprised. Is that the right framing to you? And this is a qualitative thing, but do you think we can get more specific to that? And here’s a very simple example: if I had a shoe store, and you came into my shoe store, and you’re checking out a pair of heels, and you come by three times in the same week and you go right to those orange heels on the second shelf, I would probably as a store clerk go up to you and say, “Hey, Debbie. Would you like me to see if I have a size available? Do you have any questions about these heels? I’m happy to help you out however I can.”

I would come up to you because I know you have some intent, you’re obviously interested in that. In the digital space, if someone visits my—and this is literally true, and I implemented this recently to try this out—if you visit my seminar page a few times within a certain timeframe, it’ll send you a touch, an email, it’ll just say, “Hey, do you have questions about this seminar? If you’d like to hop on a short call, and I’m happy to answer any questions about how it works.” The designer in me felt so slimy doing that. The business person in me, the store clerk in my shoe store was like, “This person’s obviously interested. They have questions, something here”—maybe they’re not interested in buying, and that’s fine. I’m not trying to sell somebody something, but is that service, or is that stalking? And is this line black and white to you as a privacy expert, or is this a gray line that we all have to interpret about whether that’s right or not?
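
Concretely, the trigger Brian describes might look something like this; the threshold, window, and mailer are all hypothetical, and the single-touch guard anticipates the repeated-contact concern discussed next:

```python
from datetime import datetime, timedelta, timezone

VISIT_THRESHOLD = 3            # three visits...
WINDOW = timedelta(days=7)     # ...within the same week

visits: dict[str, list[datetime]] = {}   # visitor id -> recent visit times
already_contacted: set[str] = set()

def send_email(visitor_id: str, body: str) -> None:
    print(f"email -> {visitor_id}: {body}")   # stand-in for a real mailer

def record_visit(visitor_id: str) -> None:
    now = datetime.now(timezone.utc)
    recent = [t for t in visits.get(visitor_id, []) if now - t <= WINDOW]
    recent.append(now)
    visits[visitor_id] = recent
    if len(recent) >= VISIT_THRESHOLD and visitor_id not in already_contacted:
        # One touch only: this guard is what separates a helpful check-in
        # from the repeated-contact pattern discussed below.
        already_contacted.add(visitor_id)
        send_email(visitor_id, "Questions about the seminar? Happy to help.")
```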

Debbie: Yeah. I think… there, I don’t know. I guess the parallel I would draw would be harassment, right? So, I think harassment is, kind of, a repeated thing. Not just, you know… like you said, the one to—you know, someone is interested and you do the one-touch, I don’t think anyone—well, most people, probably—wouldn’t be upset with that one touch.

But let’s say you didn’t hear from that person for a while, and then you email them, like, five or six more times, or whatever; that may be a problem. So, in some ways, it’s a gray area. And I guess the thing that makes it hard is that the way that computers are made, or the way systems are made, you automate so much of this. And some of that—you know, I try not to do too much automation on stuff like that because, like you said, there’s a slimy factor, where maybe as a human, I may not have emailed someone three times [laugh] about something, where if you’re in a pattern where it’s sort of automated—if this person does this, then you do that—it just may feel a little bit different.

Brian: Yeah. Yeah, there’s—

Debbie: Especially as a—

Brian: Yeah, please.

Debbie: Yeah. I think you just have to balance that. For me, like, on LinkedIn, I don’t do any sales things, or patterns, or whatever because the majority of people that I connect with on LinkedIn, I either know them, or they know me in some way, and I think they’d be upset if I sent them three emails that they’re—you know, something crazy like that. Maybe it’s different depending on a product that you sell, or to a person that you don’t know, or something like that. So, that’s your thing that you have to really think about.

Brian: I totally agree. I mean, that’s now deliberate behavioral manipulation. The cart abandonment stuff, and it’s like, “It’s still there. Are you sure you don’t want it?” All this kind of stuff.

I get that it probably converts better, and that’s why they do it. I guess for me, the question was: doing it the first time implies that the behavior is being tracked, that something is not entirely private. Is that a gray area to you that companies should be thinking about? Or is this a black-and-white area where there’s right and wrong? It sounds to me like you’re saying it is, kind of, a choice. It’s not a legal issue, it’s not a privacy issue. I don’t know. What’s your take on that? Is being—

Debbie: Yeah.

Brian: —surprised, is that the right way to frame it, the way Seth Godin talks about it?

Debbie: I guess. I guess, what he’s trying to say, maybe, is that people want to be informed. People want to make an informed decision. So, being transparent with them will make them feel better because they feel like, “Okay, I know this is what I’m dealing with,” or whatever. And then I’m not going to be upset because I’m not surprised when I’m going to get the abandoned cart email or something like that.

So, but yeah, I think that’s probably a good way to put it. Transparency is the wave of the future, I think because so many laws have that. Almost any privacy law you think of has transparency in it, some way, shape, or form. So, if you’re not being—or trying to be transparent with the people that you’re dealing with, or potential customers, you’re going to end up in trouble. [laugh].

Brian: Let’s talk about lawyers. Everyone’s favorite—

Debbie: Mm-hm. [laugh].

Brian: Subject. [laugh]. In my experience, working with lawyers in the product design space, in the digital product design space, it was heaviest when I worked at a financial institution, a really big one. And I watched how the legal and risk department was basically—they crippled stuff constantly. And I say “cripple it” because the feeling that I got was there’s a line between adhering to the law and then also—some of this is a gray area, like disclosure, and is this technically advice?

If we show this chart that has this information, is that construed as advice? I understand there’s a lot of legal regulation there. My feeling was, there’s got to be a better way for compliance departments and lawyers that genuinely want to do the right thing in their work to understand how to work with product design, digital design teams, especially ones using data in interesting ways. Do you have a prescription for how to work with compliance and legal when we’re designing digital products that use data so that it’s a team effort, and it’s not just like, “I’m going to cover every last edge because that’s what I’m here to do is to stop anything that could potentially get us sued.” There is a cost to that.

There’s an innovation cost to that. It’s easier, though, to look at the lawyer and say, “Well, I guess they know the law better, so they’re always going to win that argument.” I think there’s a potential risk there. Do you have a take on this, like, how we should work with our compliance partners?

Debbie: So, I actually have worked with lawyers for over 30 years, very closely. So, I understand exactly what you mean, and especially, you know, as I work at the intersection of law and technology, this is an issue that comes up quite a lot. Part of it is really respecting the data folks as professionals, just as the lawyers are respected as professionals. So, I think understanding that that collaboration has to happen, and that the relationship has to be built, is really important. What I see occasionally, which troubles me, is… we have someone, they understand the law, but not the technology.

You don’t want them to drop a grenade and leave the room. And then it’s like, “Okay. You guys figure it out from here.” I’ll give you an example. So, I was working with someone in Europe about customer records. And this particular lawyer was of the opinion that—well, you know databases: with customer records, every customer has a number, right?

That’s the key in the database for how you pull up that information. So, this lawyer was saying, “Well, I think that this customer number is personally identifiable information, so you have to find some way to mask it in the database that you use to pull up the customer record.” So, as a database person, we know this is not feasible. This is not—

Brian: Primary key is—[laugh]—

Debbie: —possible. And it’s sort of an overreach for someone who doesn’t understand databases or how databases work to actually say that. But on the flip side, if you feel that that is personal data, there are ways, technologically, to satisfy that without demanding that someone mask the actual record key within the database that people are using to pull up customer records. Because if you did that, you would be out of business; you wouldn’t be able to pull up your customers, right? So, being able to find a way, whether that be maybe a different field, or a different number that gets masked some way on the customer side, is different.
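
One way to satisfy that position without breaking lookups is to leave the internal key intact and expose only a derived pseudonym externally. A sketch, assuming HMAC-based tokenization (the secret handling here is illustrative only):

```python
import hashlib
import hmac

# Assumption: in practice this key would come from a secrets manager.
SECRET_KEY = b"load-from-a-real-secrets-manager"

def public_token(customer_id: int) -> str:
    """Deterministic pseudonym for external use; the raw id never leaves."""
    digest = hmac.new(SECRET_KEY, str(customer_id).encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# Internally, lookups still use the real primary key unchanged:
#   SELECT ... FROM customers WHERE customer_id = 42
# Externally (logs, URLs, exports), show public_token(42) instead, so the
# identifier flagged as personal data never appears in the clear.
```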

So, I think being able to talk through those issues and make sure—there’s a line, I think, that sometimes people go overboard. And I think part of that is… I’ve had people tell me, “I have an iPhone. I download songs on my thing. I have a computer, so I know computer stuff.” And I don’t think that’s enough to be able to usurp the experience and skill of people who know what they’re doing with technology and data.

Brian: Yeah, yeah. That’s a great example. I wanted to ask you about this project that I’m aware of around individuals monetizing their data. We might have touched on this in the pre-call, but I’d just love you to, kind of, as [Shane Paris 00:29:26] would say, double-click on this for me and open your mind to this. So, I advise for MIT’s Sandbox Innovation Fund, and once a month or so, I get to talk to students that have questions about the area that I specialize in.

And one of them had a fascinating product that they were working on, and this was essentially a bank that, instead of storing your money, stores your data. And when you log into the bank, you get offers. You know, “Google would like to use your location for $4 a week. Do you consent to that?” For example, from Google or whoever it is, right?

And so they’re directly trying to allow users to make choices about “where is my information going, and I want to get paid for that.” This seems to me like a really fascinating way to empower users to monetize themselves. I also wonder if it’s, A) ripe for potential fraud and hackers, like, creating all kinds of fake profiles, that’s one thing; but secondarily, there are people that need money more than they need privacy, especially in the short term, and they start accepting everything because they just need money and they don’t understand the ramifications of giving every single software company basically full access: “I accept every offer no matter what it is because I really—I don’t have a lot of money.” Can you talk to me about—like, what does Debbie think when she hears an idea like this?

Debbie: Well, first of all, I think data monetization is the next question. So, regardless of how people feel about it, this is going to come up in some way, shape, or form because people are getting more savvy about their information. And I think that they’re going to share more data with fewer companies. So, if you’re not one of those companies on the inside that people share with, you’re going to have to compete to try to get that person’s data. And a lot of that will probably be offers to see if you could get data from some individuals.

But I think you’re right about—you know, it’s kind of a caste system, I think, that’s being created, where people who can afford to not share their information, or don’t need the money, will have more privacy than others. And in a way, this is already happening. This is happening already in things like cell phone service. Let’s say you go to one of the discount carriers, with a certain package or whatever, where it’s really inexpensive; a lot of those phones have all types of tracking on them. So, the higher up your phone goes, the less tracking you have because you’re paying more for that phone and for that service.

So, this is already happening, I think. It just isn’t transparent to people that it’s happening that way. So, I think I like the idea of people owning their data and having it as a bank. For me, I would rather—I want to know how much my data is worth; it’s obviously worth a lot to some people for whatever reason. But I don’t think that there’s anything we could do to stop this from happening. It’s going to happen because companies are going to fight now to get data from individuals, and if they can’t get it freely, they’re going to have to find a way to be able to get that data from people.

Brian: Yeah. Yeah. Debbie, this has been a great conversation. I’m curious, just because I don’t know this space super well, are there any questions I should have asked you that I didn’t, or anything you would like to close with?

Debbie: I love your questions. I love—

Brian: [laugh]. Oh, thank you.

Debbie: —I love data folks. So, data folks are my favorite because you understand the intricacies of things, things that a lot of us take for granted about how things work in the world. It’s great that we can have a phone or iPad, we can press buttons and stuff, but it took a lot of thinking and work for someone to develop it so that it’s easy for you. So, I understand that mindset and I know the work that goes into that. So, Bravo to you.

And I like the way that you’re educating. I love your technical questions because you’re right, I think, in my opinion, when people think about privacy, for some reason they think about lawyers, which I don’t understand because I feel like privacy is a data problem that has legal ramifications. It’s not a legal problem that has data ramifications.

Brian: Yeah.

Debbie: So, [laugh], I feel like we’re thinking about it the wrong way. And I think people who design products can reduce the risk tremendously for companies or for products, or reduce the barriers to adoption tremendously, if you’re thinking about these things in the design phase, because you won’t have the same problems that other people have at the end of the tunnel, say.

Brian: Sure. And I mean, trust is so important, not just with—a lot of times in our space, we think about it with machine learning, and AI, and trusting the model predictions and all this kind of stuff, but trust is a brand attribute as well and it’s part of the reason I think design is important because the designers tend to be the most empathetic and user-centered of the bunch. That’s what we’re often there to do is to keep that part in check because the technology almost allows us to—I mean, we can do almost anything these days with the tech and the data, and some of it’s like, “Should we do this?” And if we do do it, how do we do it so we’re on brand, and the trust is built, and all these other factors go into that user experience. So, great content. I really enjoyed your ideas and thanks for coming on the show to share them with us.

Debbie: Thank you. I really appreciate it.

