For Danielle Crop, the Chief Data Officer of Albertsons, drawing a distinction between “digital” and “data” only limits an organization’s ability to create useful products. One of the reasons I asked Danielle on the show is her background as a CDO and former SVP of digital at AmEx, where she also managed product and design groups. My theory is that data leaders who have been exposed to the worlds of software product and UX design tend to approach their data product work differently, and that’s what we dug into in this episode. It didn’t take long for Danielle to share how she pushes her data science team to collaborate with business product managers for a “cross-functional, collaborative” end result. This also means getting the team to understand what their models are personalizing, and how customers experience the data products they use. In short, for her, it is about getting the data team to focus on “outcomes” vs. “outputs.”
Scaling some of the data science and ML modeling work at Albertsons is a big challenge, and we talked about one of the big use cases she is trying to enable for customers, as well as one “real-life” non-digital experience that her team’s data science efforts are behind.
The big takeaway for me here was hearing how a CDO like Danielle is putting customer experience and the company’s brand at the center of her team’s data product work, as opposed to solely focusing on ML model development, dashboard/BI creation, and seeing data as a raw ingredient that lives in a vacuum, isolated from people.
In this episode, we cover:
- Danielle’s take on the “D” in CDO: is the distinction between “digital” and “data” even relevant, especially for a food and drug retailer? (01:25)
- The role of data product management and design in her org and how UX (i.e. shopper experience) is influenced by and considered in her team’s data science work (06:05)
- How Danielle’s team thinks about “customers” particularly in the context of internal stakeholders vs. grocery shoppers (10:20)
- Danielle’s current and future plans for bringing her data team into stores to better understand shoppers and customers (11:11)
- How Danielle’s data team works with the digital shopper experience team (12:02)
- “Outputs” versus “Outcomes” for product managers, data science teams, and data products (16:30)
- Building customer loyalty, in-store personalization, and long term brand interaction with data science at Albertsons (20:40)
- How Danielle and her team at Albertsons measure the success of their data products (24:04)
- Finding the problems, building the solutions, and connecting the data to the non-technical side of the company (29:11)
Quotes from Today’s Episode
- “Data always comes from somewhere, right? It always has a source. And in our modern world, most of that source is some sort of digital software. So, to distinguish your data from its source is not very smart as a data scientist. You need to understand your data very well, where it came from, how it was developed, and software is a massive source of data. [As a CDO], I think it’s not important to distinguish between [data and digital]. It is important to distinguish between roles and responsibilities; you need different skills for these different areas, but to create an artificial silo between them doesn’t make a whole lot of sense to me.” - Danielle (03:00)
- “Product managers need to understand what the customer wants and what the business needs, how to pass that along to data scientists, and data scientists need to understand how that’s affecting business outcomes. That’s how I see this all working. And it depends on what type of models they’re customizing and building, right? Are they building personalization models that are going to be a digital asset? Are they building automation models that will go directly to some sort of operational activity in the store? What are they trying to solve?” - Danielle (06:30)
- “In a company that sells products—groceries—to individuals, personalization is a huge opportunity. How do we make that experience, both in-digital and in-store, more relevant to the customer, more sticky, and build loyalty with those customers? That’s the core problem, but underneath that, you’ve got to build a lot of models that help personalize that experience. When you start talking about building a lot of different models, you need scale.” - Danielle (09:24)
- “[Customer interaction in the store] is a true big data problem, right, because you need to use the WiFi devices, et cetera, that you have in store that are pinging the devices at all times, and it’s a massive amount of data. Trying to weed through that and find the important signals that help us to actually drive that type of personalized experience is challenging. No one’s gotten there yet. I hope that we’ll be the first.” - Danielle (19:50)
- “I can imagine a checkout clerk who doesn’t want to talk to the customer, despite a data-driven suggestion appearing on the clerk’s monitor as to how to personalize a given customer interaction. The recommendation suggested to the clerk may be accurate from a data science point of view, but if the clerk doesn’t actually act on it, then the data product didn’t provide any value. When I train people in my seminar, I try to get them thinking about that last mile. It may not be data science work, and maybe you have a big enough org where that clerk/customer experience is someone else’s responsibility, but being aware that this is a fault point and having a cross-team perspective is key.” - Brian @rhythmspice (24:50)
- “We’re going through a moment in time in which trust in data is shaky. What I’d like people to understand and know, on a broader philosophical level, is that in order to be able to understand data and use it to make decisions, you have to know its source. You have to understand its source. You have to understand the incentives around that source of data… you have to look at the data from the perspective of what it means and what the incentives were for creating it, and then analyze it, and then give an output. And fortunately, most statisticians, most data scientists, most people in most fields that I know, are incredibly motivated to be ethical and accurate in the information that they’re putting out.” - Danielle (34:15)
- LinkedIn: www.linkedin.com/in/danielle-c-b3565b17/
Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill and today I have Danielle Crop on the line from Albertsons. Danielle, how are you?
Danielle: I’m good. How are you, Brian?
Brian: I’m doing great. And I should just say Albertsons was a family grocery store for me growing up in Phoenix, Arizona. So, I know we have some international listeners who may not know this chain. Actually, we were just talking about this. It’s Albertsons Companies, which I didn’t know. What am I leaving out if it’s not just groceries?
Danielle: Well, it is mostly grocery, but it is a bunch of different banners all over the country. So, for example, Safeway, Vons, Haggen, Jewel-Osco, Star Market, Shaw’s, a ton of different banners for local grocery stores all over the country.
Brian: Got it. Got it. So, I’m sure I know some of those brands; I know the East Coast brands as well, et cetera. But yeah, so why are you here? There’s a reason I invited you on the show today.
One of the things that is a trigger for me when I’m researching guests is when I see someone in a data products leadership position who has had some interfacing with product design, product management, these kinds of things, maybe comes out of, quote, a more “digital,” a “pure digital” background, but they’re working in a data space. And I saw that you had managed some teams, I think it was at AmEx in your previous role, and maybe prior to that. I usually feel like there’s a different perspective that comes with that, and so that’s what I wanted to dig into today. And maybe there isn’t a different one. I don’t know. That’s what we’re going to jump into here.
First thing I wanted to throw at you as a question: is this whole digital versus data distinction relevant to you? I think it is generally not particularly relevant when we’re talking about developing data products that have humans in the loop. Whether we’re using pre-built tooling or making custom applications or dashboards or whatever, if we’re talking about making things that humans are still going to be in the loop to make decisions with, we’re still sort of talking about software here. It’s just there might be a heavier data component to that work. And some of the data crowd will say, “No, no, no. That’s software, that’s a different thing. Data science is not that, analytics is not that. That’s a separate thing.” Is this distinction even important to you? I think it is. [laugh]. What is your take on this whole divide here?
Danielle: I think that the distinction is somewhat irrelevant. And here’s why I would say that is that data always comes from somewhere, right? It always has a source. And in our modern world, most of that source is some sort of digital software. And so to distinguish your data from its source is not very smart as a data scientist, right?
You need to understand your data very, very well, so where it came from, how it was developed. And software is a massive source of data, right? So, I think it’s not important to distinguish between these things. It is important to distinguish between roles and responsibilities, you need different skills for these different areas, but to create an artificial silo between them doesn’t make a whole lot of sense to me.
Brian: So, I think part of where this comes into play is this idea—and this is probably more true, particularly in the data science areas—of operationalization of models and solutions as being a very distinct step that happens with a bunch of other people and has nothing to do with those of us who are actually working on the model. And so this idea that that’s different: it’s like, “Well, we don’t get involved with how the model is used. It just sits in GitHub and it’s someone else’s job to then go, quote, ‘deploy it’ to somebody.” I tend to find this approach is not going to lead to models that actually get used a lot of the time. It’s too decoupled from the humans that will use this downstream, let alone the collection phase and all the issues that go into it: where’s it coming in from? Is it coming in in the right format? Like, there’s a whole input side, right?
So, talk to me a little bit about this. Do you involve your data scientists in work that will directly impact the end-users if they’re going to be making decisions, for example, with this information, like, say, someone who’s deciding how much the carrots will cost this week in this region? How did we decide what price to put on there? Someone had to make a decision to print the stickers that said this, right? And maybe that’s fully automated at this point, I don’t know, but I would think there are probably still humans and touchpoints involved with this. Talk me through this. Like, how do you approach this?
Danielle: The team today—I’m actually pushing them—they’re very, very bright data scientists that come from, you know, academia, and they’re really great; they know their stuff, but I am pushing them to understand exactly what they’re personalizing, automating, customizing. So, for example, a presentation that we’re putting together for my leadership came to me with just, “Here’s all the models we built,” without reference to what they powered. So, I made the team go back to the table and actually do screenshots of everything they personalized so that they know exactly what their models were personalizing and what interactions happened with the customer for those models. I think it’s very important that they know where their data came from—toward the last question—but also where their data is going and what it’s powering. You can’t optimize things you don’t understand, right?
Brian: So, it sounds like that happened later, after the first swing at it. Is there a routine behavior that you try to encourage in terms of how the data scientists and analysts interface with actual people earlier in the phase of model development, or dashboard development, whatever the tooling is that’s being created? Is that something that you get involved in pretty heavily upstream?
Danielle: Yeah. I mean, this is all a very cross-functional, collaborative exercise. Data scientists need to be working with product managers. Product managers need to be understanding what the customer wants, right, and what the business needs, and passing that along to data scientists and data scientists need to understand how that’s affecting business outcomes. So, I think that’s how I see this all working.
And it depends on what type of models they’re customizing and building, right? Like, are they building personalization models that are going to a digital asset? Are they building automation models that will go directly to some sort of operational activity in the store? What are they trying to solve?
Brian: Got it. What’s a product manager in the context of a company that’s not building—I assume you’re not building SaaS tools that you’re then selling on the back end. Maybe you are, I don’t know, but what does a product manager mean, in this kind of internal enterprise business?
Danielle: It’s the same thing. It’s just software product management for internal business outcomes instead of for selling right to third parties. It’s still the same role; you’re still working on improving that product and making it useful for, in this case, many cases for the customer, right, the actual Albertsons customer. But it’s really the same role, it’s just slightly different outcome than say, you know, somebody who’s selling software.
Brian: You say this very matter-of-factly, like it’s pretty normal. I do not find that this is a normal thing—and for me, I have a very similar viewpoint on that. So, is this carried over because of previous work that you’ve done and the value that you saw in having that role? Was it foreign to bring this in, or were you just kind of given a green—do they even report to you? Maybe we should start there. Do these data product managers—or maybe that’s the wrong term—do they even report into your organization?
Danielle: They do. So, I have data management, data governance, data product managers, and data scientists.
Brian: Got it. Got it.
Danielle: And then they work with what we, you know, lovingly call the front-end product managers, right, who own the actual digital assets. They’re managing that aspect of it, and we manage what helps power that through data management, data products, and data science.
Brian: Was that a new thing at Albertsons or was that an existing structure?
Danielle: It was brand new.
Brian: And what was that like, bringing that in?
Danielle: It’s always… challenging to bring in a new paradigm, but at the same time, it was very well accepted because it was clearly needed. The company wanted to do this transformation, understood that data was at the center of that transformation, and understood that having an organization with the right skills and people to drive it was important. But yeah, before I came in, the term ‘data products’ or ‘data product manager’ was not a thing at Albertsons.
Brian: [crosstalk 00:09:06] you said, “It was clearly needed,” so what was the presenting problem such that bringing in product management, this even concept made sense to whoever it is that you needed to convince or support this? What was the problem?
Danielle: I mean, the biggest one, right, is that in a company that sells products—groceries—to individuals, personalization is a huge opportunity. How do we make that experience, both in-digital and in-store, more relevant to the customer, more sticky, and, you know, build loyalty with those customers? And so that’s the core problem, but underneath that, you’ve got to build a lot of models that help personalize that experience. When you start talking about building a lot of different models, you need scale, right? When you need scale, you need products that scale to do that.
And that’s the difference between, “Okay, I’m going to personalize, you know, two or three placements,” which I can do, forgive my language, with duct tape and paperclips, versus I’ve got hundreds of models that are helping people decide what to buy every day, all the time, and I need those to scale in real-time. And when you talk about scale in real-time, you have to build robust data products.
Brian: Right. Tell me about the model you have or maybe even the language that you use for your team in terms of who a customer is. Is a customer a front-end product manager, or is the customer someone that buys groceries? How does your team frame this conversation? Because we talk about, quote, ‘internal customers,’ some people I’ve had on my show, really dislike this terminology that a customer would ever be anyone besides someone who’s paying for something at the very far end. Other people don’t have an issue with it. But how do you talk about this?
Danielle: I think it’s ultimately everything that we do, whether it’s collaborative with other stakeholders internally or whether it’s something we build, you know, directly and independently, is ultimately for the customer and/or the shareholder. So, to me, there’s no distinction. It’s all about, like, collaboration to make a difference to the customer and the shareholder.
Brian: Got it. And so do you have your data teams interfacing with these frontline product managers at all? Like, for example, are they ever—have they ever been to a store before [laugh] and watched someone buy groceries? Like, how integrated is your team with the people who actually buy food and toilet paper? [laugh].
Danielle: We’re starting that, actually. So, some of the folks have been into the stores—I’m ten months into the role at this point, so we’re building a lot of stuff—but I’m passionate about my team being in the stores or, you know, working on the app and seeing people’s experiences, and understanding the problems that the frontline employees or our customers face, because I think it makes a huge difference to how you build and design products and models if you understand the problem you’re ultimately trying to solve in a very, you know, empathetic and direct way.
Brian: What’s the attitude, like, or the response to this? Do you hire people on the data sides [unintelligible 00:12:07] on the technical side with an understanding that this will be part of the work that we want you to do? Is this something you have to train? I find it’s not a natural thing for a lot of data people to want to go spend time doing that. It’s not natural unless you’re specifically hiring for people who have a different skill set in addition to their technology background or data—you know, statistics, or whatever it is. Talk to me a little bit about the culture of doing this work and how you see it.
Danielle: I think it’s just really important to have that expectation as a leader, right, of your people. And then they will rise, right? Even if they come in, and they’re not necessarily of that bent or background, once they actually understand why this is valuable to them, get them in the store for a little while, get them talking to people, helping them to understand where their data comes from and then they will see the value of it. They don’t necessarily, you know, come out of school that way, right? Because in school, your data sets are generally packaged for you.
Like, you get them in this, like, very clean way. When you leave, that’s, you know, that’s just not what the real world is like and so you kind of have to help them learn those skills because they don’t learn them in school. But I do think that once you get your team on that road, they value it and understand it very clearly. It just, maybe it’s that first step of enabling them, you know, enabling the relationship, getting them in the store. Like, you as a leader, you have to facilitate that.
Brian: Yeah. Yeah. Do you have any trained or professional either user experience people, researchers, designers who were involved in facilitating this or doing this with your data scientists? How does design or research fit in at all with that in a formal process, or is it very informal?
Danielle: Right now, it’s pretty informal. It will become more formal as we move along. So, my colleague—who owns the shopper experience of the digital assets—and I work very closely together, and our teams work very closely together. And the expectation is that these teams are partnering very closely on what outcomes they’re trying to drive. So we’re, for lack of a better term, podded, right?
So, this pod is dedicated to this pod and they work together to deliver on the outcome. So, [unintelligible 00:14:22] a certain area of placement, say like, you know, substitutions of products, there’s a team on her side that designs the substitution experience, both from a product management and a UX design point of view. And then you add the data scientist, right? So, it’s basically that collaborative activity together, working together with the technologies organization to drive that change.
Brian: This podding concept, this exists now, or that’s where you’re going?
Danielle: It exists now. And we’re continuing to scale it out. You know, adoption of basically kind of a scaled Agile format started last year.
Brian: A lot of data scientists say they don’t like Agile, it doesn’t work for what we do. What’s your take on that? [laugh]. Not that I want to get way off on Agile on this show, but I’m just curious. I hear both sides of this.
Danielle: I think that there’s some truth to that. The truth is that in a lot of places, data scientists are not given the tools or the capabilities to be Agile, right? Like, if you’re dealing with having to pull data out of a mainframe environment still—and there are some places that are still like that—it doesn’t lend itself to Agile very well, right? But I do think that if you get them on the cloud, you get them to a place where they can get access to data quickly, they can build models, and then they should iterate on those models. And that is Agile.
Brian: So, it’s more of the technical impediments to doing data science work that’s still in the way of Agile; it’s not so much that data science itself—if you’re working with—if you have data to work with, which is kind of that boilerplate starting place, then it’s possible.
Danielle: Yeah. And models need to be thought of as products, right—
Brian: [crosstalk 00:16:04]—
Danielle: —because then you continue to iterate and evolve on those models and add new data, right, over time. Whereas, you know, I think traditionally data scientists have thought, “I finished that model. It’s in production. I’m done.” Right? So, then they don’t think of it as being Agile, right? Because they don’t think of it as a product that they own, that they’re going to continue to innovate on.
Brian: Yeah. I mean, the other risk with that, too, is that thinking of product in that sense can also be looked at as: this was an output, it was a thing that was created, and now it’s gone. That’s different from an outcome that may need to change or grow over time, and owning the outcome instead of owning the output. Is this a challenge at all? Like, how do you address this outputs versus outcomes perspective?
Danielle: I think it’s a challenge for everyone. The Agile mindset can be a challenge because sometimes you just want to claim victory and move on, [laugh] right, like, just from a human perspective. But I think the lovely thing about product is you never do that. You’re always looking at it, and that’s why you need product owners. And in the case of data scientists, their product is their model and they own it.
Brian: Yeah. What do you look for when you’re hiring a product manager in this space—a data product manager, I should say?
Danielle: So, I look for three things in anyone: Kind, smart, and curious.
Brian: In that order?
Danielle: Not necessarily, but I look for those three things. And I think all three are important. It’s not one over the other, right? It’s all three, because I feel like if you’re kind, you’re smart, and you’re curious, you can learn and do anything.
Brian: Yeah, I like that. I like those adjectives. Tell us a little bit about where these data products end up manifesting themselves—and I guess we’ll focus on the shopper’s experience. Where does the machine learning and analytics work show up, whether it’s for internal stakeholders or the shopper, the buyer? Where am I experiencing your team’s work, either in the mail or when I walk into a store?
Danielle: A good example I gave earlier is substitutions. So, if we are out of stock on an item that you prefer, then we will use your purchasing history to determine what would be a good substitution for you.
Brian: We’re talking about online shopping, I assume here?
Danielle: [unintelligible 00:18:19] talking about online shopping. Most of our personalization at this point in time is in the digital assets.
Brian: Is there an example of a real-world personalization?
Danielle: There is. There’s a couple that I really like. They’re not in production yet, but hopefully, we’ll get there. One is relatively simple and the other one is pretty darn hard. One is basically sending to a point of sale, right, to an associate who’s actually at the checkout, that the person who’s checking out now, we haven’t seen them in the store for 91 days.
So, being able to tell them, “Welcome back, it’s good to see you again,” or something along those lines. They can use personalization to create a human experience, which I think is great. That’s a relatively simple one. I mean, the data can sometimes be a little tricky, but it’s a simple one.
Another one that’s really challenging is things like, if you’re like me, you carry your phone around in the store, right? Like, I’ve got my phone in my hand; it’s got my grocery list on it, right? I’m carrying it around. What I would love is for somebody to push to me what I bought the last time I was in the produce aisle so that I don’t buy that extra onion, right, that I don’t need. Or, like, “Oh look, I thought I had that, but I don’t. I didn’t buy that last time, so I better buy it this time.”
Brian: Oh, remind me what I—so I don’t buy extra. Not so much that, like, “I like carrots. Oh, I forgot.” It’s more the quantity of—
Danielle: Either way, it can work either way, right?
Brian: Yeah, yeah. Okay.
Danielle: And so that’s what I love about it. But that is a challenge, simply because this is a true big data problem, right, because you need to use the WiFi devices, et cetera, that you have in store that are pinging the devices at all times, and it’s a massive amount of data. So, trying to kind of weed through that and find the important signals that help us to actually drive that type of personalized experience is challenging. No one’s gotten there yet. I hope that we’ll be the first.
Brian: Let me play devil’s advocate for a second. “So, Danielle, you want to build something that will help our shoppers buy less stuff because you’re going to remind them that they have four onions at home? Is that what you’re saying?”
Danielle: Or buy more stuff because they realize that they didn’t buy that celery last time and they do need it, right? So, it works both directions, but the most important thing that we’re trying to accomplish at Albertsons is really to create that relationship with our customers, right? Customers are what matters, and having loyal customers for the long term—I mean, there are very few relationships that businesses have with people that are more personal than grocery, right? People are in the store two to three times a week buying what they need to feed themselves and their family. Other than schools and churches, there’s nothing that’s really more consistent or common. So, building that long-term, long-standing relationship with our customers is very important to us. That’s what I think is going to help us serve them and then help us win as a company and as a brand.
Brian: I have to be totally honest. I can’t say that I’ve ever felt like I have any type of relationship that’s non-transactional with a grocery store, to date, anywhere. I mean, I actually live across the street from a market, a small, you know, probably what’s considered a small-sized grocer, a nursery, really nice products and stuff. I’ve lived here 15 years, and I don’t think a single person there knows my name. I’m in there almost daily in the summertime, probably twice a day, have spent thousands of dollars [laugh] there, and it’s like they have no idea who you are.
I think it’s the right place to do the kinds of things you’re talking about. But tell me—I mean, this is a great example for listeners. So, the first example you gave about in-store personalization, about giving a piece of dialog or a conversation starter to the checkout person: that is a user experience change. That is something that has long-term value, but it can sound really squishy, and I could very much see some data scientists that I’ve met being, like, “This is my project? Ultimately, you want to display some text that says ‘Welcome back’?” Or, “Really? What do we get for that? That doesn’t sound very—that sounds like you gave me the crappy project, Danielle.”
Help me unpack how you talk about this with a team, because you’re really talking about something that’s a brand thing, a long-term experience thing that changes over time; I could see that being a hard sell, and maybe hard to measure what the value of it is in one month. Talk to me a little bit about how you approach those projects, get them prioritized, and make something like that important.
Danielle: Well, I think that it’s a simple one. So, of course, from a data science sophistication standpoint, the data scientist is going to be like, “Oh, this”—but it’s a good thing it’s a simple one. You can do it relatively quickly, so there’s not really an issue with that. And I think when you say these use cases, everybody understands that, like, oh, as a human being, I would like that, right?
Brian: Yeah, yeah.
Danielle: And I think that getting that prioritized is not difficult. I already have divisions that are like, “I want that and I want that now.” So, that’s [laugh] that’s not a challenge. Getting the data from where it is to where it needs to be at the right time—which goes back to that data products conversation we had earlier: scale, speed—that’s the challenge. So, building those capabilities is where we need to start, and then we can enable these things. But from a data science perspective, yeah, I mean, it’s not going to be the sexiest thing they do all year, but it’s valuable. And then we can test it. We can test stores against each other—ones that have it, ones that don’t—was there a sales uptick at all? Or let’s just say it’s a brand new customer that we haven’t seen before, right, and we engage them in that way: did we retain that customer more effectively?
Brian: Yeah. How do you go through the process—and I’m curious if your teams are involved in this or not—of deciding how we’re going to measure success for an initiative like the one you just talked about? When is that decided? Who’s involved with deciding how we would measure whether or not the greeting personalization project did something?
Danielle: I think the data scientists are the ones who are responsible for determining how they’re going to measure success. And that’s the expectation I have: for everything they do, when they present it to me, they’ve told me exactly how they’re going to measure success on it.
Brian: Mm-hm. Could you share a little bit about how you would do that in this particular case? For example, through my designer lens, I can see all the number crunching that could be involved in something like this. On the other end, I can also see why, huh, me as a checkout person, I don’t want to say this to Danielle when she’s in the aisle, or I don’t care. All these things could fail right at that moment where the data science work was correct—you did spit the right sentence out on the monitor for the checkout person—but they didn’t want to say it, or something else came up.
And when I train people in my seminars, these are the kinds of things I have them think about: that last-mile point. And that may not be data science work, and fine, maybe you have a big enough org where that’s someone else’s responsibility—a UX person or a product manager or whatever—but being aware that it’s a fault point and having a cross-team discussion matters. Because maybe there is a way data science could contribute something. Maybe the sentence could be customized better. I don’t know what it is. But knowing that—can you talk a little bit about that? Like, how do you look at that to make sure the value actually got there?
Danielle: Yeah, I mean, the store operations team has a role to play. If we deliver this to them, then they need to use it. And we won’t have control over that; that’s up to the store operations team. I think it’s one of those things where, when you’re doing something like this and you’re collaborating across teams, everybody has the same goal and you just have to assume good intent, right? When you measure it, assume that they did what you asked them to do.
Brian: You’re talking about the clerk?
Danielle: Yes. And if you compare a store where you have this in place versus a store where you don’t—in the same general geography, with the same type of demographics—you have to design the test properly. But once you design the test properly and then you do the comparison, you have to trust it.
Brian: Talk to me, though, about when and how you would get, say, the store operations team—someone who’s maybe not living in digital all the time—involved with a project like that. Because they’re the linchpin in this.
Danielle: They would get involved in—I wouldn’t even do it without the divisions saying that they were on board with it, right? So, I already have the division saying, “Build this for me. I want it.” Right? So, I know that I can go to the division presidents and say, “We’re gearing up to do this. Which stores would you prefer”—you know, like, let’s talk about which stores we’re going to test in.
And then we go down to the teams; the team would work in coordination. Once they know which store, they’d work with the store director and work out how to do this, right, and do the pilot. And then we’d scale it if it was successful.
Brian: Is it hard to get that engagement from those stakehol—like, say, a manager or a shift manager or whatever? Is it fairly easy to get them involved in these kinds of digital design discussions, service design?
Danielle: So far, it’s not difficult at all, and in my opinion, they own the last mile, right? We can give them all the tools, but the stores are the ones who know their customers best, so they’re the ones who have to decide, “Okay, I’m going to give you this information that this person is a brand new customer. I’m going to give you this information that this is someone we haven’t seen in, you know, 91 days, whatever. Now, you guys have to decide, from a user experience perspective, what is that [unintelligible 00:27:45] going to look like? How are you going to say it?”
I would actually—me personally, just because of my leadership philosophy—I would want the store directors to just say, like, “We’re going to tell you this, guys. How you decide to talk to the customer, it’s up to you.” Because I mean, you don’t know what the conversation was before they got that information or you don’t know whether or not they’re dealing with, like, a little kid or, you know, like, their groceries are rolling o—you don’t—I mean, only the person at the checkout understands what’s going on.
Brian: They’re not robots, like—
Danielle: Yeah. And that would be—
Brian: —and we’ll just say the sentence to everyone that comes, no matter what’s going on. [laugh].
Danielle: And that would be, I think, the magic and the beauty of this.
Brian: Yeah, yeah.
Danielle: Right? And I think that I’ve heard a story similar to this about the way Disney does stuff. They give people the information. You don’t tell them what to say.
Brian: Yeah, yeah. I totally agree with that. I mean, not to mention languages, like, “Oh, you’re speaking Spanish.” I mean, Phoenix, right? Like, if you hear Spanish in the aisle, and you’re bilingual, you switch. And it’s like, and you might talk differently and [laugh] you might say a different phrase, or whatever. The last thing you probably want is a script to be read. [laugh]. It’s the most impersonal personalization you could do. [laugh].
Danielle: [laugh]. That would undermine the entire point. [laugh]. No.
Brian: Yeah. This kind of gets at the idea of where these problems and ideas come from. With something like this, does it usually come from the store operations team requesting the service, and then you’re building it? Or how often does your team discover problems—latent problems that may not be articulated yet—and it’s like, “Oh, wow, we didn’t know we could do that, and that does solve an actual need that’s on our roadmap or part of our business strategy”? Talk to me about who owns the problem and the problem finding, and that kind of mindset.
Danielle: So, in this particular case, this problem was defined by our operations teams. You know, a lot of them have a wealth of experience in this industry. And they come and they say, you know, “Fifteen years ago—before smartphones, before all of this, when everybody was actually using their cards”—their old loyalty cards—“we knew more about them than we do now. Right? When they swiped that card, we actually knew the last time they shopped with us,” et cetera, right?
Well, now, you know, things are more tied to the device than to a card, and so we need to switch it up. So, this is basically making what was old new again, right, because things have changed. So, they came to us and said, “We used to be able to do this. Our people loved doing it. Our customers loved it. Can we do it again?”
And that’s where this one came from. But there are a lot of different ways it comes to us, right? What we like is for our business stakeholders to come to us and say, “I have this problem.” Take the supply chain scenario we’re in right now. They came to us and said, “We have this problem: we want to make sure that the right supply is going to the stores that actually need it.” Right?
We’ve got this huge supply chain everybody knows, right?
Brian: Right. Yeah. Yeah. Yeah.
Danielle: So, our DCs, our big warehouses, have to make sure that the stock is going to the stores that can actually sell it, and not to all of the stores that are ordering a lot because they’re not getting supply, right? So, the data scientists came in—it’s really quite a complicated data science problem—how do we make sure that we intelligently allocate supplies to the stores that can actually sell that supply? Our data scientists figured out how to solve that problem with the data we have, but the business came to us with, “This is our problem,” and then we figure out how to solve that business problem.
Brian: Do you always get a problem, or is the request sometimes a solution? Like, “I would like a machine-learning-based dashboard that will show me where all of my paper products are going in the next month.” That’s a bad example, but you understand what I’m saying here.
Danielle: Rarely do we get [crosstalk 00:31:56]—
Brian: You sound very fortunate. [laugh]. There’s a fair number of people I talk to who are getting a solution in the form of a problem—a problem expressed as a solution, with the assumed solution baked in—as opposed to a pure business problem. So, it sounds great that you’re getting that and can focus on the problem.
Danielle: Well, you know, maybe some of the other folks you talk to are, like, pure technology companies.
Brian: Albertsons is not?
Danielle: Albertsons is a grocer that is powered by technology and data. And so people come to me with business problems, generally, not technology or data problems. They expect me and my colleagues in the digital space to know how to solve them.
Brian: And given that culture, are there challenges around the adoption piece? Trust, believability—especially if you’re integrating new technology into an old business—pricing, personalization, whatever, especially when things seem strange: “That doesn’t make sense, and I’ve been here for 20 years.” Talk to me about that. Or is it not a challenge?
Danielle: No, I think that’s a challenge everywhere, [laugh] right? That’s my impression. But I think the challenges we face are mostly around the data itself. So, we get into these conversations of, like, “Hmm, that data doesn’t seem right”—conversations with people who really have been around and known their stuff for a long time.
And it could be that it’s just the transformation, right—moving from this old data source to this new data source—and then we have to reconcile those things. What I’ve been helping my partners understand is that whenever you do that, the data is never exactly the same, and you shouldn’t expect it to be. As a data person, I’m like, as long as the insights and statistical inferences you’re making from that data don’t change, then there’s a data loss or data difference that you can stomach, and that’s okay. So, basically, it’s helping people understand that it doesn’t have to be one hundred percent exactly the same when you move from one data source to another, as long as your business decisions don’t change.
Brian: Got it. Got it. Cool. Danielle, this has been really great, talking to you about Albertsons and the work you’re doing there. I did want to ask: are there any questions in this space we’ve been talking about that I didn’t ask you but should have?
Danielle: It’s a really good question.
Brian: Or something you would just like to share?
Danielle: I think, you know, what I’d like to share is that everyone is on this huge journey in the data space, and in digital in general, and we’re going through a moment in time in which I think trust in data is shaky. [sigh]. What I’d like people to understand and know, on a broader philosophical level, is that in order to understand data and use it to make decisions, you have to know its source. You have to understand its source. You have to understand the incentives around that source of data.
And the human aspect—so, back to your design point—the human aspect of how that data was collected, and what the incentives were. That will never change. The reason I’m bringing this up is that I’m having lots of conversations with folks about this, in both my personal and professional life these days. You can lie with statistics—we all know it, right—but you have to look at the data from the perspective of what it means and what the incentives were for creating it, and then analyze it, and then give an output. And fortunately, most statisticians, most data scientists, most people in most fields that I know are incredibly motivated to be ethical and accurate in the information they’re putting out. So, well beyond Albertsons, I just don’t want this moment to become a situation in which people diverge, and there are these people who say, “I’m never going to trust data again.”
And then there’s people, “I’m only going to trust the data and the experts.”
Brian: Yeah. Yeah.
Danielle: Right? I want people to understand that there are good sources of data, and there are good people who work with data, who want to come up with good outcomes that help serve people. So, I guess that’s what I would say.
Brian: Cool. Well, thanks for sharing that. Where can people follow you? And is there anything coming up you’d like to share with the audience? What’s the best way to get in touch?
Danielle: So, I have a couple things coming up. I’m going to be doing some work with Snowflake at [Shoptalk 00:36:28] at the end of this month in Vegas, so I’m going to be talking there. And then I’m also going to be doing a [women leadership event 00:36:40] here in Phoenix on Thursday, March 31st.
Brian: All right. Cool. Yeah, we can link those up. And what’s the best way to be in touch with you? Are you active on social media anywhere?
Danielle: LinkedIn is the best way—
Brian: LinkedIn? Okay.
Danielle: —to get in contact with me. Yeah, Danielle Crop. I think I’m the only Danielle Crop, so it makes it easy.
Brian: Excellent. Crops, Albertsons, food. It sounds like you’re in the right place.
Danielle: Yeah. [laugh].
Brian: [laugh]. Well, Danielle, thank you for coming. Danielle Crop is the Chief Data Officer of Albertsons Companies. It’s been great to talk to you on Experiencing Data, and I wish you well.
Danielle: Thanks, Brian.