046 – How Steelcase’s Data Science, UX, & Product Teams Are Helping Customers Design Safer Office Workplaces Informed by Covid-19 Recommendations w/ Jorge Lozano

Experiencing Data with Brian O'Neill (Designing for Analytics)

When you think of Steelcase, their office furniture probably comes to mind. However, Steelcase is much more than just a manufacturer of office equipment. It enables its customers (workplace/workspace designers) to help those designers’ clients create useful, effective workplaces and offices that are also safe and compliant.

Jorge Lozano is a data science manager at Steelcase and recently participated as a practitioner and guest on an IIA webinar I gave about product design and management being the missing links in many data science and analytics initiatives. I was curious to dig deeper with Jorge about how Steelcase is enabling its customers to adjust workspaces to account for public health guidelines around COVID-19 and employees returning to their physical offices. The data science team was trying to make it easy for its design customers to understand health guidelines around seat density, employee proximity, and other relevant metrics so that any workspace designs could be “checked” against public health guidelines.

Figuring out the what, when, and how to present these health guidelines in a digital experience was a journey that Jorge was willing to share.

We covered:

  • Why the company was struggling to understand how their [office] products came together, and how the data science group tried to help answer this.
  • The digital experience Steelcase is working on to re-shape offices for safe post-pandemic use.
  • How Steelcase is evaluating whether its health and safety recommendations are in fact safe and making a difference.
  • How Jorge’s team transitioned from delivering “static data science” outputs into providing an enabling capability to the business.
  • How Steelcase equips dealer designers to explain to their customers the health risks associated with current office layouts and plans.
  • What it was like for Jorge’s team to work with a product manager and UX designer, and how it improved the process of making the workspace health guidelines useful.

Resources and Links:


Quotes from Today’s Episode

“We really pride ourselves in research-based design” - Jorge

“This [source data from design software] really enabled us to make very specific metrics to understand the current state of the North American office.” - Jorge

“Using the data that we collected, we came up with samples of workstations that are representative of what our customers are more likely to have. We retrofitted them, and then we put the retrofitted desk in the lab that basically simulates the sneeze of a person, or somebody coughing, or somebody kind of spitting a little bit while they're talking, and all of that. And we're collecting some really amazing insights that can quantify the extent to which certain retrofits work in disease transmission.” - Jorge

“I think one of the challenges is that, especially when you're dealing with a software design solution that involves probabilities, someone has to be the line-drawer.” - Brian

“The challenge right now is how to set up a system where we can swarm at things faster, where we're more efficient at understanding the needs and [are able to get] it in the hands of the right people to make those important decisions fast? It's all pointing towards data science as an enabling capability. It's a team sport.” - Jorge


Brian: Welcome back everyone to Experiencing Data. This is Brian O'Neill, and I'm really happy to have one of Steelcase’s data science managers, Jorge Lozano on the show, who I recently met on a webinar that the International Institute for Analytics had asked me to do that covered design and product thinking as kind of the missing gap in analytics and data science work. And so we had Jorge come on to kind of talk as a practitioner on that webinar, and I really enjoyed the conversation. I thought we could go a little deeper on this conversation in the podcast on some of the stuff that we didn't get to in the webinar with IIA. So, Jorge, welcome, Bienvenidos to the show. It's great to have you here. I'm happy to kind of pick up our conversation. How's it going?


Jorge: Brian, thank you so much for the invitation. It's a real pleasure. Very excited to be here and very excited to chat with you and the audience about this topic.


Brian: Great, great. Well, so Steelcase: I immediately think of desks, furniture, chairs, industry—commercial workspace. Is that still the way to think about Steelcase today, and how does your role fit into that space?


Jorge: That's a great question. Given that we're mostly a business-to-business company, we tend to not be on top of everyone's mind when you think of Steelcase as a brand. We've historically done office furniture, yet our strength lies in really designing spaces for people to work. The furniture is just maybe a means to an end. But it's about understanding what type of spaces you need, how many meeting spaces, what should be the distribution, what type of spaces are conducive to what type of work. And so we really pride ourselves in that research-based design, to not only bring a combination of products that enable certain behaviors in the workspace but also make sure that the products that we're throwing out there are, from an engineering perspective, really, really solid.


Brian: Got it. And when we think about data science, is your work primarily in software, in terms of what I would call the design tools that commercial designers or workspace designers would be using to figure out what a company needs, say, an insurance company with 1200 people working in a downtown location, converting a space. You call those designers as well. We'll do some vocabulary to cover the space here, but do you [00:03:01 audio break]—


Jorge: Yeah, so historically, the main focus of the data that we use for our analytics is data that was used mostly for order fulfillment, and closing the books, and all of that stuff. Definitely not data that was designed for that second day of like, “All right, now I'm going to use this to provide analytics.” But data was created as a need to connect our systems so that we knew who had paid, who owed us, who ordered the pink chair and where did it have to go, and all of that. So, recently, we've explored different data sources, and I'm very excited, maybe, about sharing that a little bit more. But historically, there's been a lot of work just with traditional, okay, these are the different data sources that we use to run our business, now let's give the keys to the data scientists and see what they can do. And so it's definitely been a learning experience.


Brian: So, they handed you the keys, but what was the expectation set there?


Jorge: So, this takes us back to when the data science practices at Steelcase started formally. I would say that's probably about five or six years ago. And it was more around, “Okay, let's bring in a set of really smart people, really quantitative; we have this proprietary software that we use for analytics, so let's give them access to the data, and that'll solve our problems.” And obviously, there was at first this really big challenge of just understanding our data and making it suitable for second-day analytics. Bridging those data silos, right? I have this sales data, and this operations and manufacturing data; how do I bring them together? I've got the data about pricing, but then how do I combine it?


And so it took a really long time, longer than I care to admit, of us just really taking the time to become very savvy with the data. And the value that we brought back to our stakeholders was more about bragging, in the sense of, “Hey, check out what I can do. I can now tell you, this customer bought this, you built it on this day, and this is the roll of fabric that you used for it.” And so not particularly actionable, but insightful. And I think that was maybe the first stage of analytics at Steelcase: just building that momentum of, “Here's the art of the possible. I now feel like I'm in control.”


And then came, maybe, the second stage, which was, “All right, now I'm able to answer questions from the business because I know the art of the possible better, and I can pair with the business a little bit better to frame the answers, and understand what I can answer and what I can't answer.” And so maybe this became the era of static data science at Steelcase. And the reason I call it static is that it was very heavily focused on the business having a question that we frame in terms of a hypothesis, which we then try to prove or disprove. It was very helpful, very helpful to support decisions, to make sure that our senior executives feel comfortable about certain big decisions. But obviously, as our data grew, our capabilities grew, and the needs of the business grew, that started to become outdated, and we were challenged with making data science more of an enabling capability. By this I mean, “Okay, now I don't necessarily need a report that stops at a PowerPoint. Now, I need you to use a model to predict something, and then I need those predictions to make other decisions.”


So, it's not kind of the end of the road; it's part of a process. And that transition was particularly interesting. But it also challenged us to think a lot more about the user. And so I think there's a really big connection to a lot of the value that you bring to organizations around user-centered analytics, and this evolution really highlighted the need for that, which came as a surprise to us, maybe.


Brian: And I understand you can only share so much about some of the internal workings at Steelcase and the projects you work on, but can you tell me a little bit about this last phase of the work and having to become more user-centered? So, what did it look like in the before time, and then what was it like afterwards? And what was the journey like that you had to go through to be—what does it mean to be more user-centered in your work? What did you guys literally change over that period?


Jorge: That's an excellent question. And in fact, I have a very particular example that is a great way to reflect all of these different stages. And I think it'll resonate with the audience because it's something we're all experiencing right now: this pandemic. And just how have we chosen to react to it to better serve our customers.


And so the story starts like this. We've always struggled with not being able to understand, from our sales data and our traditional data sources, how our products came together. And the analogy that I like using is this: we could typically look at the receipt of what you bought. Say you went to buy everything that you're going to use for your dinners throughout that week; we would know everything that you bought. We didn't really know what meals you cooked.


And that was going to be something so insightful, because we would be able to say a lot more about your preferences, and which ingredients that we sell would be more effective. So, we weren't particularly effective in doing that. But by clearly defining this as an enabling capability that we wanted to go after, we started exploring different data sources that could allow us to go after this exact thing: understanding how our products came together. And so this required us to go after data sources from things like our design software, or the software that our dealer designers use. And we were extremely excited, but the end insight at this point was, “Oh, my God, the analytics that we're going to have in terms of understanding our customers is going to be so much better.”


And I'm talking about maybe early February when we kind of hooked all the pipes together and data started flowing on kind of a pilot, and then COVID happened. And so obviously, priorities across the board changed, and you're just trying to understand what of what you were doing is still valuable, and what of what you were doing is no longer valuable. But we immediately realized the potential that this [00:10:15 audio break] that we enabled would have to address some of the most important concerns that people are having around the pandemic. The first thing that happened when everyone went to work from home, and people were wondering whether the office was safe, was, “Okay, can we use this data to understand the current state of the North American office? Are workstations safe? Do they comply with standard social distancing metrics in terms of that six-feet radius? Or maybe having enough division so that if somebody sneezes, and that person just happens to be sitting in the workstation in front of you, is there a potential for you to get contaminated should this person have any type of disease?”


And so this data really enabled us to make very specific metrics to understand the current state of the North American office. And we were able to come up with really powerful metrics on, “Okay, here's the distribution of distance between workers. Here's a distribution of the height of screens and divisions that they use,” and came up with really interesting metrics that were even used to inform state governments and even the federal government. Things like well over 92 percent of workstations will not comply with some of the standard social distance metrics, like maybe enough division or enough distance between workers. And so that was really powerful. But obviously, that in itself is, again insightful, and it's that kind of moment of, “I’ve been able to connect the data and understand how to translate it into things that are interesting, but now how do I make it actionable? What's the next step?”


So, the next step is, okay, we need to understand: can we come up with the typical workstations? The ones that are most representative of what our customers would have in their office spaces. And then we're going to take the time to retrofit them so that we can either add enough distance or enough division, optimizing the use of products to maximize for privacy or maximize for density. So, how could we help our customers bring people back into the office by leveraging the maximum amount of furniture that they have, but adding a couple of hacks and tweaks there to make it safe, right? But in reality, we didn’t really know whether what we were suggesting was at all safe, or whether it created any sort of difference. And so what we did is we partnered with MIT and their department of epidemiology.


And using the data that we've collected, we came up with samples of workstations that are representative of what our customers are more likely to have. And we retrofitted them, and then we put the retrofitted desk in the lab that basically simulates the sneeze of a person, or somebody coughing, or somebody kind of spitting a little bit while they're talking, and all of that. And we're collecting some really amazing insights that can quantify the extent to which certain retrofits work in disease transmission. And so it's not about being completely disease transmission free. You probably need to be, like, in a hyperbaric chamber, but it's a [00:14:11 audio break] decisions can I take to mitigate the risk? And so that in itself has been very, very exciting, right?


Brian: How did you guys figure out who owns the problem space? If you already have an office full of furniture, I don't know who does this work, whether it's someone in HR, or whether they call up their design rep, or whatever, and they say, “What do we need to adjust to make our workspace compliant?” Initially, you could hear someone asking, “Well, are we ready to go back to work or not?” But I would really wonder, as a designer: okay, we can give you a report, and the answer is, “No, you're not; 92 percent of the desks are too close together,” and whatever. But then what's the next question? “Well, what do we do about it?” Right?


Jorge: Exactly.


Brian: So, I'm curious, how did you decide which problem you were going to work on? Are we going to tell people what's wrong? Are we going to provide them with alternatives? Are we going to, you know, whatever it is? How did you guys go about that process? Who owns that problem of figuring out what somebody needs? Tell me about that.


Jorge: Right, and that is a great question. And that is the challenge of transitioning from static data science into data science as an enabling capability. It's like, all right, we could probably pull off a really cool report of what we're seeing in terms of the current state of the North American office, and what we're hearing from MIT, but is that a competitive advantage? Is that enough?


And the answer is no, right? We can't take the time to email everyone. We can't take the time to go dealer designer by dealer designer, and analyzing every single case. And so this is when we said, okay, we need an enabling capability. What is that? Well, what if we could come up with a way to analyze a floor plan and provide recommendations of what are the retrofits that would optimize for something? Maybe it's optimized for distance, optimize for division, optimize for spread of a kind of virus or some type of pathogen?


And at the same time, okay, this is something that could really help dealer designers as they're engaging with their customers. So, we need to think about, what are they going to show their customers? They need to show what the current state is, they need to show the implications of doing nothing. Because everyone could achieve enough distance by just saying, “Okay, I'll ask half of my people or two-thirds of my people to work from home.” The challenge is, “How do I bring them back safely?”


And so we needed to think about an experience that our dealers could use to better serve the customers. And so, in a way, the experience in itself needs to be designed for the end-user, but it needs to be designed with the thought that it's our dealers who will use it. And so that is tricky. It's not as simple as it seems. And not only that, we also have to think, what is the… kind of how would it work? [00:17:47 audio break] or can we assume that all customers are going to call their office furniture dealerships and ask about how to bring people back into the office?


because, I mean, it might seem like the first thing that I can think of, but man, I've been working in an office furniture company for 10 years that I may be a little bit biased. But if you think about people, they might not necessarily understand that that could be a great way to figure out how to best support their employees. And so we also needed to think about, okay, we need to create leads for our dealers so that we can tell them out of your wallet of customers, these are the ones that are at most risk, either because of their type of workstations, their location, the nature of their business. And so we needed, again, to create this holistic experience.


But as you said, it was so challenging trying to iron out who owns this, because at first, we're just trying to get data out of a design software. So, you got data engineers, and you got some sort of guidance, and you have a data scientist that's kind of like exploring, is this enough for the analysis that we need to do? And then when you got the static component, I mean, data scientists can take it a very long way, and you might just need some business insights or partnering with a designer to iron out the details and make sure that you're kind of thinking holistically about the insights. But creating this experience, that… you need a squad. You need somebody that can see this as a product.


And so we took longer than I care to admit to rally the troops and form a squad to hit this initiative hard and fast. And the funny thing is, and I wonder if a lot of your listeners and practitioners experience the same thing: sometimes everyone knows there's value, everyone knows it can be done, but just aligning the resources sometimes takes longer than it should. And it's because we [00:20:11 audio break].


Brian: Sure, and I think part of that comes down to knowing who you need to serve the most. In your case (I just wrote about this to my Insights mailing list last week), it's what I call the two-hop challenge, or the two-hop problem. Just for people that don't know, and correct me if I'm wrong: Steelcase offers office products and desks and workspaces and all this kind of stuff, but the person that you work with closely, your customer in a lot of ways, is the dealer designer, and their customer is the office manager, or the HR person, or whatever. So, there's two hops. And ultimately, you really want to reach that final person in the last mile, right? That last person is really the one that you serve the most.


But day-to-day, in the tools and stuff that you create, the dealer designers are a lynchpin. They're really critical to getting this stuff right. So, I think that's part of the challenge, right? Who do we most need to serve, and how might their needs be different? One person might just want the answer; another person might need to justify the answer.


It’s like, “Well, why are you trying to sell us this new chair, or whatever, this new desk system?” And so they actually need all this evidence to back that up, so it doesn't sound like they're just trying to sell more product; they're trying to actually help the business get back to work safely. So, did you guys figure out a way to frame that? Did someone make a decision like, “We're really here to serve our dealer designers right now, and the tools and software that they will use to provide this kind of insight,” or were you always thinking of it more from the actual downstream customer: the business, so to speak?


Jorge: Well, it's funny that you mention that, and this was probably a game-changer in terms of us aligning in the right direction. We brought in a product manager who has historically been very aware of the needs of our dealer designers. And what we did is we said, let's draw the art of the possible, let's mock up the experience, and then let's go through some user labs. And we actually connected with a couple of dealer designers who volunteered to go through this kind of interactive lab experience, and that really tailored how we needed to show things, and to what extent certain capabilities were nice-to-have versus must-have. And that kind of created our backlog of analytics initiatives.


A lot of things were more about, maybe, design, or how we should show things, but a lot of it was like, “Well, what is more useful to you? Is it more useful for me to show you recommendations of compatible products, or maybe just to show you the numbers: if you want to keep enough distance, you can only bring back so many people?” And so that in itself was super useful. It's the first project I've worked on at Steelcase where we've done this, and I'm going to say that we've done it right. Not necessarily implying that everything else we've done, we've done wrong, but saying that we really took the time to understand the user because, as you mentioned, we were facing a multitude of users.


From my perspective as a data scientist, we had our main stakeholder, who's our marketing leader, and we also had our Steelcase interior designers, and that was pretty much the extent of the users that we were used to dealing with. But now we had to think about our dealer designers. And we also had to think of facilities managers who were making very important decisions on what they would need to do to bring their people back into the office.


Brian: Are you tired of building data products, analytics solutions, or decision support applications that don't get used or are undervalued? Do customers come to you with vague requests for AI and machine learning, only to change their mind about what they want after you show them your user interface, visualization, or application? Hey, it's Brian here. And if you're a leader in data product management, data science, or analytics and you're tasked with creating simple, useful, and valuable data products and solutions, I've got something new for you and your staff. I've recently taken my instructor-led seminar and I've created a new self-guided video course out of that curriculum called Designing Human-Centered Data Products. If you're a self-directed learner, you'll quickly learn very applicable human-centered design techniques that you can start to use today to create useful, usable, and indispensable data products your customers will value. Each module in the curriculum also provides specific step-by-step instructions as well as user-experience strategies specific to data products that leverage machine learning and AI. This is not just another database or design thinking course. Instead, you'll be learning how to go beyond the ink to tap into what your customers really need and want in the last mile, so that you can turn those needs into actionable user interfaces and compelling user experiences that they'll actually use. To download the first module totally free, just visit designingforanalytics.com/thecourse.


In that process, when you said that you had your user lab, it sounds like you either did some of what we would call participatory design, or you actually did some testing of the designs, of some solutions, to get feedback. Were those people coming from the facilities side downstream, or were they coming more from your internal colleagues?


Jorge: No, completely external. So, we reached out to dealers. By the time we chose to do these user labs and these mocked-up experiences, we had already formed a squad, a cross-functional team with different skill sets. You had people that are very good at programming, web development, data science, but you also have an experience designer there, and also a project manager. And so all those things had to come together. And that is the leap that organizations eventually have to make if they want to be able to create experiences that have a very heavy data science component, where data science in itself is not the end goal.


Brian: Right, right. The experience of working with a product manager and a designer: can you tell me more about what that was like, and did you have any particular learnings? It sounded like, broadly, something changed for you going through that, and I'm curious if there are any anecdotes that you could share about how maybe you went a totally different direction with something, or you perceived people needed this, but in reality, they had no interest in that. Any surprises or anything like that you can share?


Jorge: Well, I think sometimes, as data scientists, you can often feel that certain metrics are very relevant and that how you present things is clear. But in reality, people just want a yes or no. And sometimes in data science or in statistics, you're used to playing with probabilities, or showing the extent to which something can be true or not true. But that can be challenging sometimes for end-users. And so we need to make sure that, as an example, if you're trying to say whether a variable is significant or insignificant, you don't just say “yes” or “no,” you say, “Well, the p-value is 0.01,” whatever, and you're used to playing with that by saying, “Well, the likelihood of it being significant is very low.”


But when data science is part of a process, you kind of have to be more cutthroat and come up with rules, saying, “Well, if it's not going to be significant, then I'm not going to use it, so I need to set my threshold.” And so it just requires connecting a lot more with the business, being much more pragmatic, coming up with business rules that we often don't think of. But that's how we needed to evolve our thinking to make the flow work. You know what I mean?


Brian: Sure, sure. And yeah, I mean, it requires—I think one of the challenges is that, especially when you're dealing with probabilities and things like this, someone has to be the line drawer. Where do we draw the line between yes or no? And that's a really interesting challenge because I think people assume, “Well, data science will draw that.” You’re like, “No, this is a confidence question. This is partially an emotional question. This is a leap of faith—”


Jorge: Exactly.


Brian: “—it’s a business question, actually.” And I think it's really an everyone question because I think the data scientist can bring a level of understa—you can help them make the decision on what the thresholds should be by helping to understand the variables that go into the calculations and the predictions, but yes, I think that's an interesting challenge. And then you get into the, how much of this do we expose to the customer? Do they need that level of detail? Do they need a binary yes, no? So, I think those are really interesting challenges.


Jorge: Yeah. I mean, one example that comes to mind is, if we're going to make a recommendation of how to retrofit or reconfigure your space, we're developing an algorithm that would map the current space that they have with a template that would include the recommendation. And so, in order to do the matching, you need to do some type of similarity analysis. But the question is, how similar is too similar? How similar is similar enough for me to say that it's a match?


Because in my world, I'm measuring distances, and I'm coming up with metrics that can help me assess the extent to which one thing is more similar than another. But through their lens, they need to come up with a match regardless of whether that similarity was a 0.3, or a 0.9, or a 0.7. And so we need to take off our numbers hat and say, “Well, if you set the bar at 0.7, here are the things that could happen. Yet, if you set the bar at 0.9, here are the things that could happen. So, which trade-off is more important?”
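The threshold trade-off Jorge describes can be sketched in a few lines of code. This is a hypothetical illustration only: the similarity scores, the 0.7 and 0.9 cutoffs, and the `matches` helper are made-up assumptions, not Steelcase's actual matching algorithm.

```python
def matches(similarities, threshold):
    """Return the candidate layouts whose similarity score
    meets or exceeds the chosen threshold."""
    return [s for s in similarities if s >= threshold]

# Made-up similarity scores between one customer floor plan
# and a set of retrofit templates.
scores = [0.3, 0.55, 0.7, 0.72, 0.9, 0.95]

# A permissive bar (0.7) yields more matches, but risks
# recommending retrofits that don't really fit the space.
loose = matches(scores, 0.7)   # 4 candidate matches

# A strict bar (0.9) yields fewer, higher-confidence matches,
# but may leave some floor plans with no recommendation at all.
strict = matches(scores, 0.9)  # 2 candidate matches

print(len(loose), len(strict))
```

The point is that nothing in the math itself picks 0.7 over 0.9; someone has to weigh the business cost of a bad match against the cost of no match, which is exactly the "line-drawing" Brian raises next.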


Brian: Was that a difficult pill for your teammates to swallow? Or—


Jorge: I think maybe it's not that it was difficult; it's that we probably weren't used to that being the center of the conversation, and so we needed to come prepared to help people understand the implications of drawing the line in one area versus the other. You know what I mean?


Brian: Sure, sure. And just to give some context to make sure, because we're jumping into this conversation having talked about this before, so I want to try to draw a picture and you tell me if I'm wrong about what we're talking about in terms of software and a digital experience. So, my understanding is your dealer designers, the people that are kind of your direct one-hop customer, they use some design, like a CAD style design tool—I think it's called CET—and so this allows you to do 2D and 3D layouts of office space and to see the products placed in them, et cetera. And then this COVID-19, we'll call it a plugin or something, it’s effectively software built on top of that. It's like there's a Steelcase plugin, and now there's this analytics COVID-19 plugin that does this analysis. Did I get that right? It's kind of like an add-on to the CET software?


Jorge: Yeah, you're in the right direction. I guess I don't want to overshare some of the capabilities, but that's part of the experience that we're working towards.


Brian: Got it, got it. So, the overall goal here being that the person working these layouts can probably either know what to change about their layout, or they can see some alternate layout or something. You're going somewhere along those lines with the experience, is that correct?


Jorge: That is correct. And not only that, we wouldn't just be recommending anything; we would be recommending things that have scientific backing that emerged during our engagement with MIT.


Brian: Got it, got it. I'm curious, from a subjective standpoint, you're working with designers and creatives as well, so likeability is a factor when we're talking about design, especially—even with office workspace and all of that. People have to feel confident, they have to like the choices, whether it's the designer, their client, everyone's not just looking for function. So, I'm curious, is there a quality level on this where maybe the Steelcase designers had opinions about, like, “Well, don't ever show them a recommendation that has this layout because it's healthy, but no one's ever going to accept that as a recommendation from us, because no one wants 10 desks lined up in a row like a schoolhouse. Just don't show them that.” Did you have any of those situations—


Jorge: Yeah.


Brian: —where they’re like—okay.


Jorge: Absolutely. I mean, I think the aesthetics component of the physical space is something that is very hard to train a model on, and so you can come up with a proposal that is sound from a mathematical perspective but is not very logical from an interior design perspective. So, that's one of the reasons why we're very intentional and kind of limiting what we suggest, versus a go-out-in-the-wild, surprise-me kind of thing.


Brian: I can see how you can end up with—you don't have to tell us where it's going exactly, but you get into these hypothetical tools, where it's like your retirement planner. Well, save more, and assume this return on your investment, and then it runs all these simulations and you get some new thing. I imagine you had—you mentioned you had some kind of designer working with you on whatever this experience is, whether it's a hypothetical tool, or whatever it is, what was that like having design and product working with you? Was that a challenge? Did it make your work easier? Did it make it harder?


Jorge: It definitely didn't make it easier, but it definitely didn't make it harder. I think it made it more valuable. And I've had experiences where projects don't really pan out. They don't make it beyond the kind of proof-of-concept phase. And it can be because we didn't really connect the dots the way we should.


And I think the organization is waking up and understanding that these cross-functional teams are what you need to really be able to squeeze the value out of data science as an enabling capability. And if we want to maintain our status as an industry leader, we can no longer just rely on having really good products; we need really good experiences. And the market is just so polluted and contaminated—and I don't mean this in a derogatory way—by all these really cool experiences like Amazon and Google, where you get all these recommendations; it's just so easy to do business with all these companies. Customers obviously expect the same, and they often can't understand why it's so hard to do business with Steelcase when, with a couple of clicks, they can order something on Amazon. And so we need to step up, and I think it's the expectation of this more demanding consumer that digital experiences are just as important as the value of the product that you're making.


Brian: Mm-hm. Yeah, and I mean this gets back to the central theme of this podcast and the work. The last mile is where good data science work goes to thrive or die. If people can't take those insights and do something, put them to use, turn them into some kind of value, it doesn't really matter how accurate the modeling was, or any of the insights—even insights, it's like, “Well, what are you going to do with this information?” Sometimes the information can sound really interesting. Like, “Well, we can tell you how safe your office place is. You're an 82 out of 100. Good job.” Well, what does that mean? Does it mean it's safe? Should I invite everyone back tomorrow? Now you've put the question back on the user: “What do I do with an 82 out of 100 score? Or if you tell me I'm a 20 out of 100, what do I do about that? What is the risk to me? Do I need to redesign my office? Do I need new products?” That whole decision-making aspect is really what it's about. If you really push it out to the last mile, the whole experience comes down to that ability to take action on the insights that are provided.


Jorge: Exactly.


Brian: So, I think it's great that you guys are thinking about that. Sometimes people ask me what companies are struggling with—and non-digital-natives typically talk about this more than a tech company would—but it's cross-discipline teams, multifunctional teams: how do you do that? We really struggle to get all the people together. And sometimes I find it surprising that it's a challenge, because it's like, well, how would you possibly design a tool for workspace designers and not ever talk to a workspace designer? I don't know how you could possibly be successful not doing that.


Jorge: Exactly.


Brian: So, tell me, what was that like putting together this—you call it a swarm team or something—but a team of, like, you had a designer, you had a product manager, you had the workplace designer. Was this hard to get those people together? And why was it so unusual to bring them together? Or wasn't it?


Jorge: You know, maybe the way I've been telling the story can come across as if everything was perfect, but man, was it a bumpy ride.


Brian: Yeah. What was bumpy about that? Share with us.


Jorge: I think the bumpiness comes with who owns it, and where does this stop being a data science project—like, when does it stop being a project and become a product? And so when we said, “Well, we can quantify the risk of your setting,” that's something that data scientists can be very well versed in doing. And maybe we can kind of stretch our creative muscles and do a cute presentation and that might be enough, but taking this and transforming it into an experience goes outside of the scope of a traditional data science team.


And we've had other circumstances where we haven't been very effective in communicating that the reason why we can't deliver on something has less to do with the fact that it's not technically possible, and more to do with the fact that we just need other skills, from other people. And so, it took, I would say, a lot of conversations and framing: this thing that we want to do to support our customers and support the country, bringing their people back into the office—it's not that we can't do it. It's just that it requires more than data science, so let's sit down and understand what are the roles and the personas that would need to be involved for us to really pull off something magical here.


And then once you identify those, now you've got to find those people, either internally or externally, and bring them together and align. There are probably organizations that have organizational structures that are very conducive to this, but at Steelcase, we obviously had people saying, “Well, you're going to steal this person from my team. I'm obviously packed with work, and now you're giving me one less person to do this. And is this kind of sci-fi, or are you really doing it?” And so, things needed to trickle up until they reached an executive level where there could be those conversations of alignment: “Yep. We're all on board. This is it. We're going.” We then trickled that back down and set up that swarm team.


And I think this is a learning for us: to be able to identify opportunities where we might need something like this, set the story right from the start, and put it in the hands of the right people that need to make the right decisions, so that the team can form. And rather than taking three, four, five weeks to come up with this team after we already knew that this was possible, and we already knew that there was a lot of value, maybe we can have a situation where it takes us a week to recognize that and then boom, you're rolling.


Brian: Right. Did you feel like the challenge was that you had to convince someone that you needed some other skill sets? Or that you guys were coming to the understanding of what you were missing, and it was more the time it took to figure out what your request for resources looked like, and then to present it? Was it more of a battle with getting the resources approved, so to speak?


Jorge: I think it was both. At first, it was the feeling of falling short after doing something that you're very proud of, but saying, “Yes, but how is this going to help our customers bring their people back into the office?” And then the second one is when you're able to connect the dots: first there's this sense of—I personally felt uncomfortable, because you begin to understand, “Well, in order to pull this off, my team and I can't do it by ourselves.” So, it's not something where we can just rise to the occasion and save the day. We need someone else, and we need help. And that might be a vulnerable moment for a data science team that was used to just going solo and being the hero. But then it's about framing: “All right, so what do we need?” Well, let's frame our ask, and who do we need to get on board?


Brian: Sure, sure. Yeah, a lot of what you talked about is reinforcing the reason why I think the data product manager role—there are some different names for it—is a really critical one, because it's not a great use of the data science team, who have a very specific skill set that not a lot of people have, and whose time is probably best spent doing data science work and not doing resource planning work for digital experiences—


Jorge: Exactly.


Brian: —and this is exactly what, typically, a product owner would do—sometimes a product designer wears both hats, theirs and the product owner's—but whatever their original skilling is doesn't matter. The point is, they are a hub, and you can see how Steelcase could really look at this as a product, a digital product: it's not the back-to-school, it's the back-to-work experience, and this ties back to eventually selling furniture potentially, or developing long-term relationships for the next time there's an office refresh or whatever.


The point is, it's a concrete thing. You can put some boundaries on it, it's distinct, and it needs different skill sets: there's a user experience component, which has to do with making sure people can actually take the data science insights, take the research from the MIT people, and put it into play, whether it's through simulations, or just a readout that justifies something to a senior-level stakeholder to say, “We need some budget and time to go and fix the office. Here's the proof if you need it.” There are lots of different skills there, and so I think this product mentality is really important, even if you're not going to sell this thing as a standalone product—which is what I think we tend to think product management means, that you own this thing that has a commercial aspect to it—and it doesn't need to have that in order for the product mindset to help you get the focus and clarity that you need.


So, the whole way this [00:45:56 unintelligible] came through, I think that's normal. It's always like this. It's never a straight line from A to B. Companies go through their own organic way of arriving at these things sometimes. And so I think that's the normal way to do it, [laughs] actually, even if it's frustrating the first time you go through it. But—what would you do—


Jorge: But I mean, all roads lead to Rome. It's all pointing in this direction, so we know it's the right way. The challenge right now is, how do we learn from this to set up a system where we can swarm at things faster, where we're more efficient at understanding the needs and getting it in the hands of the right people to make those important decisions fast? But it's all pointing towards data science as an enabling capability; it's a team sport.


Brian: Mm-hm. Absolutely. What would you recommend to someone in your shoes going forward? Like, if they could avoid the quote-unquote ‘mistakes,’ or save some of the time that you had to spend, what would you advise people in your shoes?


Jorge: I think one of the most important things is, if you're a data science practitioner, you need to recognize and draw the line where you can take this and design an experience without the fear of saying, “Well, but I can't design that because I won't be able to do that.” Just focus on what the value is for the user, and then make sure that that connection and that partnership with your stakeholder or your business partner is there. And so you can design that together without the fear of saying, “Well, yeah, but that's no longer data science.” Which, maybe, was something that we feared for a little bit, of saying, “Well, you could do it this way and that way, but then that—”


Brian: You won’t need us.


Jorge: “—something different.” Exactly.


Brian: [laughs]. I understand. Jorge, this has been really educational for me, and I'm sure our listeners, so I really appreciate you being so willing to open up about your experience working on this new digital experience at Steelcase. So, thank you so much.


Jorge: Yeah, no. Thank you so much for the invitation, Brian. Again, had a blast. And I hope your listeners enjoyed some of this conversation.


Brian: Yeah, I'm sure they will. Where can people learn more about you? Do you have a website or LinkedIn, Twitter, some kind of way to just be in touch, and follow your work?


Jorge: So, I'm available on LinkedIn for sure. If you want to learn more about Steelcase, steelcase.com is a great resource. But I'm more than happy to connect if anyone wants to have a follow-up conversation.


Brian: Awesome. Cool. Well, I will definitely link up your LinkedIn profile, and thanks again, and good luck as you guys roll out this new experience. So, thanks for coming on the show.


Jorge: Thank you, Brian.


Brian: All right.
