083 – Why Bob Goodman Thinks Product Management and Design Must Dance Together to Create “Experience Layers” for Data Products

Experiencing Data with Brian T. O'Neill

Design takes many forms and shapes. It is an art, a science, and a method for problem solving. For Bob Goodman, a product management and design executive, design is best viewed as a story, a narrative that conveys the solution to the customer. As a former journalist with 20 years of experience in consumer and enterprise software, Bob has a unique perspective on enabling end-user decision making with data.

Having worked in both product management and UX, Bob makes the case for approaching product management and product design as parts of a whole, and we talked about how data products fit into this model. Bob also shares why he believes design and product need to be under the same umbrella to prevent organizational failures. We also discussed the challenges and complexities that come with delivering data-driven insights to end users when ML and analytics are behind the scenes.

In this episode we cover:

  • An overview of Bob’s recent work as an SVP of product management - and why design, UX, and product management were unified. (00:47)
  • Bob’s thoughts on centralizing the company data model - and how this data and storytelling are integral to the design process. (06:10)
  • How product managers and data scientists can gain perspective on their work. (12:22)
  • Bob describes a recent dashboard and analytics product, and how customers were involved in its creation. (18:30)
  • How “being wrong” is a method of learning - and a look at what Bob calls the “Spotlight Challenge.” (23:04)
  • Why productizing data science is challenging. (30:14)
  • Bob’s advice for making trusted data products. (33:46)

Quotes from Today’s Episode

  • “[I think of] product management and product design as a unified function. How do those work together? There’s that Steve Jobs quote that we all know and love that design is not just what it looks like but it’s also how it works, and when you think of it that way, kind of end-to-end, you start to see product management and product design as very unified.”- Bob Goodman (@bob_goodman) (01:34)
  • “I have definitely experienced that some people see product management and design and UX is quite separate [...] And this has been a fascinating discovery because I think as a hybrid person, I didn’t necessarily draw those distinctions. [...] From product and design standpoint, I personally was often used to, especially in startup contexts, starting with the data that we had to work with [...]and saying, ‘Oh, this is our object model, and this is where we have context, [...]and this is the end-to-end workflow.’ And I think it’s an evolution of the industry that there’s been more and more specialization, [and] training, and it’s maybe added some barriers that didn’t exist between these disciplines [in the past].”- Bob Goodman (@bob_goodman) (03:30)
  • “So many projects tend to fail because no one can really define what good means at the beginning. The strategy is not clear, the problem set is not clear. If you have a data team that thinks the job is to surface the insights from this data, a designer is thinking about the users’ discrete tasks, feelings, and objectives. They are not there to look at the data set; they are there to answer a question and inform a decision. For example, the objective is not to look at sleep data; it may be to understand, ‘am I getting enough rest?’”- Brian T. O’Neill (@rhythmspice) (08:22)
  • “I imagine that when one is fascinated by data, it might be natural to presume that everyone will share this equal fascination with a sort of sleuthing or discovery. And that’s not the case; it’s TL;DR. And so, often users want the headline, or they even need the kind of headline news to start at a glance. And so this is where this idea of storytelling with data comes in, and some of the research [that helps us] understand the mindset that consumers come to the table with.”- Bob Goodman (@bob_goodman) (09:51)
  • “You were talking about this technologist’s idea of being ‘not user right, but it’s data right.’ I call this technically right, effectively wrong. This is not an infrequent thing that I hear about where the analysis might be sound, or the visualization might technically be the right thing for a certain type of audience. The difference is, are we designing for decision-making or are we designing to display the data that does tell some story, whether or not it informs the human decision-making that we’re trying to support? The former is what most analytics solutions should strive for.”- Brian T. O’Neill (@rhythmspice) (16:11)
  • “We were working to have a really unified approach and data strategy, and to deliver on that in the best possible way for our clients and our end-users [...]. There are many solutions for custom reports, and drill-downs and data extracts, and we have all manner of data tooling. But in the part that we’re really productizing with an experience layer on top, we’re definitely optimizing on the meaningful part versus the display side [which] maybe is a little bit of a ‘less is more’ type of approach.”- Bob Goodman (@bob_goodman) (17:25)
  • “Delivering insights is simply the topic that we’re starting with, which is just as a user, as a reader, especially a business reader, ‘how much can I intake? And what do I need to make sense of it?’ How declarative can you be, responsibly and appropriately, to bring the meaning and the insights forward? There might be a line that’s too much.”- Bob Goodman (@bob_goodman) (33:02)

Links

Transcript

 

Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill, and today I have Bob Goodman on the line from Virgin Pulse. What’s going on, Bob?

Bob: Hey Brian. Good to chat with you.

Brian: Yeah, I’m looking forward. What the heck’s Virgin Pulse? Tell us about that, and what does the SVP of Product Management do at such a place?

Bob: Virgin Pulse is a global health and wellbeing solution for employers, as well as health plans, payers, and providers. And it centers on helping people be healthier and happier in their everyday lives.

Brian: And your background, you’ve led product management as well as UX teams. So, you’re kind of this hybrid type person. Can you talk a little bit about that in the context of the current work that you’re doing, or even just the past, how it might have been affected by that? But how does design and UX come into play when you’re thinking about product?

Bob: For sure. Well, I have about 20 years of experience in consumer and enterprise software, really coming from the web [laugh] onward, and sort of stumbling into this line of work. But I think looking back, I’ve always been very interested in products and destination experiences, especially design-wise. And today I’m really overseeing both product management and product design as a unified function.

And I think you’re asking how do those work together? Yeah, I think, you know, you can—there’s that Steve Jobs quote that we all know and love that design is not just what it looks like but it’s also how it works, and when you think of it that way, kind of end-to-end, you start to see product management and product design as very unified.

Brian: I’ve been surprised lately, even some guests on this show see those roles as not being overlapping, [laugh] [crosstalk 00:02:40]—

Bob: Yes, yes—

Brian: —at all.

Bob: —for sure.

Brian: I tend to see them being heavily overlapping. And when I teach my seminar, a lot of this stuff, some designers will say, “This is more like a product management thing.” Well, to me, really, it’s on a spectrum. It’s like autism or something, right? Like, “Where are you on the spectrum?” Like, “I’m a little bit more of a design-y product management type person.”

But I’m curious if data changes anything about this. So, we were going to talk a little bit about Virgin Pulse Health Analytics and there’s also this APH, this Advanced Plan for Health company that you were going to talk about, and I’m kind of curious, like, in the context of data, how does design come into play? How does product come into play when we’re talking about analytics, at least how you guys are rolling with it at Virgin Pulse?

Bob: Yes. So, I think that—I mean, I’m still sort of processing your point, which I have definitely experienced that some people see product management and design and UX as quite separate, and I think they might think that the product is saying really, like, what should happen and what sequence of—what’s most strategic, and the roadmap, and then they’re sort of specifying things tightly or loosely, and then design is maybe handling the UI. And it’s a little different if you think of design as a strategic function also, that’s shedding a light on kind of user motivation, or buyer motivation; then you might see it as more continuous, you might see design as a strategic end-to-end function. It does wind up as UI for sure and that’s part of a two-part stool—you and I were talking about, oh, there’s a three-part stool if you have design, product, engineering, and there’s maybe a four-part stool when you have data.

And this has been a fascinating discovery because I think as a hybrid person, I didn’t necessarily draw those distinctions. So clearly, I didn’t realize that people saw them as so separate. From a product and design standpoint, I personally was often used to, especially in startup contexts, starting with the data that we had to work with, but that we might want to go after, and taking stock of the system and saying, “Oh, this is our object model, and this is where we have context, and this is where we have kind of offering detail, and this is the end-to-end workflow.” And I think it’s an evolution of the industry that there’s been more and more specialization, more and more training, and it’s maybe added some barriers that didn’t used to exist between these disciplines.

And I guess in terms of analytics at Virgin Pulse, you know, there are many—we have analytics in a classic way around our reporting that is kind of client-facing. We also—we show a lot of data to our members as well, so that people are able to make sense of their daily lives and their activity and their sleep and their—and how that’s sort of tracking, but then we also are increasingly involved in kind of health outcomes and population health, and then you start to go into more of a big data mindset for kind of personalization and, like, risk-scoring and understanding how that can inform what’s recommended to people, and what’s presented. And so I think there are many facets to the sort of job to be done—to use a UX-y term—around analytics and data.

Brian: Yeah. Can you tell me a little bit about how those teams might work together? You talked about the, you know, risk-scoring; when you’re doing forecasting prediction, this is the domain of data science. Do you staff designers on the data science teams? Do you have data scientists as a role on a product team? Where’s the tip of the spear? Is it being led by data? Is it being led by products? Like, how do you see that? Or is there no—is it just a triangle that doesn’t have really a point, [laugh] like, any one point?

Bob: Right. I think that—and I’ve come to understand that this is a sort of common maturity cycle—we are in a process of really unifying and more centralizing our data expertise and data model, and from that centrality, if you will, to also really embed into our delivery teams, while still having a central team structure, and expertise, and practices to derive from, and really understanding how that can drive [velocity 00:07:10] from the standpoint of the sort of data platform that helps power our products and our services in a unified way. And, you know, I think that, you know, you ask a critical question about, like, how is it organized, and there can be sprawl in some organizations, right, around how they’re approaching data, and then the different discrete data disciplines—whether that is data engineering, whether that is data architecture, whether that is data science—and each discipline can have a slightly different take and approach and expertise, and that makes sense. But then when you also have an org, you know, itself not having clarity about even the different ‘Jobs To Be Done’ by data, that organizational design can be a barrier, really, to what you can achieve with a more central take. And that’s kind of where we’re headed today.

Brian: Got it. You talked about this Jobs To Be Done thing, and I think we talked about this when we first met and I’m very much on the same page with you in terms of understanding the why and having really good problem clarity in how, you know, in my opinion, so many projects tend to fail because no one can really define what good means—

Bob: Yes.

Brian: —at the beginning: the strategy is not clear, the problem set is not clear. You know, if you have a data team that thinks the job is—“Our job is to surface the insights from this data.” And you have maybe a designer UX person that’s like, “No, they’re these discrete tasks and objectives that someone wants to do in which data may provide a slice of that.” Like, “I need help making a decision about x, but I’m not here to look at the data set, really. I’m really here because I want to know, am I eating right? Am I resting enough?”

Bob: For sure.

Brian: “I don’t want to look at my sleep stats. The objective is not to look at sleep data; it’s to understand if I’m getting enough rest.” And I’m riffing here.

Bob: For sure.

Brian: I’m curious, like, do you find it’s a challenge to get—and most of my audience that’s listening here, they tend to come from the data side; I want you to talk to them about, like, your wish list. Like, what do data scientists need to hear from somebody like you about products, especially if they want to think about making data products and not just throwing data at people? What—

Bob: Right. That’s a great point. Yeah, and it is a sort of a common starting point to say, like, “We have all this data and so that is what we’ll show. And we need the screen to show it because that’s the data we have.” You know, and then I imagine—but you know, the audience is filled with people with more data expertise than I have—but I imagine that—

Brian: Me too. Don’t worry. [laugh].

Bob: —when one is fascinated by data, it might be natural to presume that everyone will share this equal fascination with a sort of sleuthing or discovery. And then it’s not the case, right? It’s a ‘too long; didn’t read’ TL;DR world. And so basically, like, they just want to have—often they want the headline, or they even need the kind of headline news to start, like, at a glance. And so this is where this idea of storytelling, and you know, telling stories with data comes in, and some of the research to understand the mindset that consumers come to the table with.

And this can be—this as business users as well, right? They don’t necessarily want to replicate all of the work of data analysis; they’d like as much of that as possible to come in with a rendering and with the screen and that for that to be ready to inform the decisions that their organization makes, or to share out. Because they too, are often, like, awash in data already from multiple sources. And then when you talk about, like, in our space, a space that’s so incredibly complicated as population health and wellbeing and conditions and health care, there’s just a tremendous need to make sense of things in an actionable way, in a data-driven way, but ‘data-driven’ means the meaning of it, and to try and take that lens.

And this, you know, Jobs To Be Done, or JTBD, which is from Clay Christensen, right? Like, it’s the great story in a nutshell, right, that they found that, I think it was McDonald’s, that you had consumers not buying—especially at the drive-thru, they found that for some reason, milkshakes were a top breakfast seller. Not the prescribed, you know, breakfast meal or breakfast Happy Meal; no, the milkshake. Why? Because people need something that is holdable in one hand, that fits in their cup holder, that lasts their commute, that gives them a sort of fun experience, so it just fit, functionally; it’s a functional, practical frame on things.

And I think that’s—that’s why this Jobs To Be Done phrase, you know, as to, like, the problem we’re trying to solve: it’s helpful to understand the practical constraints that people are under, and the context, then, that your experience, your offering, can fit in.

Brian: Are there any strategies or tactical things that you’ve learned that either your product managers or your design staff need to adjust to work better with data science colleagues, and also the vice versa, which might be more of a wish list for you? But what are the things I would like to see change on the data science side to better work with us? Can you give me a—what do we need to learn from each other perspective?

Bob: You know, so there are these rallying cries of “measure what matters,” right? And we’re talking about, you know, a context at scale, and kind of durability—and there’s some durability of the context. So, when we’re talking about—and there’s also challenges of data availability. And then on the product management side, I’ll say a kindred or corresponding, you know, challenge sometimes can be what’s viable from, like, an MVP standpoint? Like, what is it that’s really compelling as a market offering versus already spoken for? And there’s what’s viable, and then really, there’s what’s feasible.

So, feasible is like, what do you actually have capacity for within the timeframe resources? These kinds of things. And it’s helpful, when one can, to separate those and not have them be so intertwined together because you don’t want to decide to do something that’s not viable just because it’s feasible, right? [laugh]. And you don’t want to prematurely narrow in that direction.

And I think this premature narrowing can happen. Like, “Well, we have this data, so it’s the best we have.” And then it’s like, “Yes, we do have that data, but it’s not any data that someone could act on.” Possibly a company can’t act on it because they don’t know what the implications are, or possibly a person can’t act on it because they don’t know what the sub-factors are in terms of their behavior. Their sleep, their stress, their eating, their condition management, all these factors.

And so, I think that sometimes things could be, I’ll say, data right, like, the data is sound, but not, like, kind of user right, using users broadly. And vice versa. You know, feasibly you might aspire to do x and y and z for users and have this, you know, glorious—you might want this glorious eye candy of data visualization, that’s this wheel, but there isn’t really a logical basis to deliver that.

And so I think as to advice, useful, actionable advice, there are a couple different ways to come at this. One is to try and fast-forward to an end state, to sort of beam into the future and say, “This is the story that we’d ideally like to tell if we have the—presume that we would have the data to do this or a way to do it. And here’s why.” And to get from learning about that, really, like, if you looked at this, would it be meaningful to you, you know, at a glance? And then also see what you do have to work with [laugh] and can you meaningfully, like, you know, step towards that vision in a way that still has viability, but you’re still kind of moving in time; you know, you’re not idling.

And that’s a lot of product management; versus more clinical and longitudinal kinds of studies and things, it’s a lot about, like, continuing to make headway in time. And that’s a lot about, you know, the agile mindset and software development life cycles and these sorts of things.

Brian: Got it. Got it. Yeah, you were talking about this, like, it’s not user right, but it’s data right; I call this technically right, effectively wrong. And this is not an infrequent thing that I hear about where the analysis might be sound, or the visualization might technically be the right thing for a certain type of audience, but the difference there is, are we designing for decision-making or are we designing to display the data that does tell some story, but may not enable the decision-making that we’re trying to do, which so much of analytics is often about: making the next best decision based on previous information or possibly predictions. So, I wonder if you could tell me a little bit about that dance, maybe in the context of the Virgin Pulse Health Analytics. Did you have this, like, kind of battle between, like, “But can I make a decision?” And, “Well, that’s the data we have”? Like a tennis match? And maybe it’s not a match; that sounds like we’re against each other, but was there a back and forth that has to happen here to kind of slow down the data presentation part to let the user catch up, like, kind of keep the user in check? That’s kind of in my experience sometimes, but tell me about yours. How did you approach it—

Bob: Yeah, I—

Brian: —the design of that—

Bob: I think that—

Brian: —solution?

Bob: I think, you know, currently right now, as we try and, you know—we’re working to have a really unified approach and data strategy, and deliver on that in the best possible way for our clients and our end-users—for the most productized part of our data delivery, we’re really optimizing on the meaningful part of it, as you said. And, you know, there are many solutions for kind of custom reports, and drill-downs and data extracts, and we have all manner of data tooling. But in the part that we’re really, like, productizing and having an experience layer on top, we’re definitely, you know, optimizing on the meaningful part versus the display side, and the best meaning that we can deliver at this time and build from there, if that makes sense. And that’s maybe a little bit of a less is more, if it has meaning.

Brian: Can you tell me, like—give me an idea of kind of how that process worked. Like, if I load the dashboard, or whatever, the home screen of VP Health Analytics, how did they arrive there? And what were the roles of, like, the data science people versus the designers or the product? Like, tell me a little bit about who decided what went there, and what kind of chart it was and what—

Bob: Sure, definitely.

Brian: —decisions is it trying to drive towards? And how did the teams play together to get to that? Like, anything you can share there would be great.

Bob: Yeah, for sure. Well, without, you know, without overcomplicating, like, the VP portfolio, Advanced Plan for Health that you mentioned is a fantastic standalone population health and, like, data product. And that’s something that we are—you know, that’s a new acquisition for us, and that allows us to really intake populations for risk-scoring and opportunity—and what we call, like, the opportunity index of which parts of the population and which conditions are most susceptible to impact, intervention, treatment. But because, you know, Virgin Pulse has had many acquisitions, now that they’re part of the family, we’re really taking stock of how the parts of our offering fit best together.

Similarly, we have our long-standing kind of bread-and-butter Wellbeing and Health platform, and for that part of it, I started to weave a bit of a yarn on our portfolio of offerings. For that part of it, that’s the area where we’ve really had an aligned view across engineering, data, and design, working in tandem to set the stage for, kind of, a unified enhanced analytics approach. So, that is—that’s dashboards that we’re building, and a way to lens in, I would say, other parts of our data offerings. And I think we’re actually quite common amongst, kind of, enterprise scenarios where there’s a wide array of data tooling in play. It’s, you know, a number of third-party tools and homegrown tools, but I think that ultimately, as you were alluding to, you know, people would prefer not to navigate amongst multiple tools as a consumer of data when trying to make sense of it.

So, that’s really the purpose of our enhanced analytics portal. And so that’s a dashboard that tries to work almost like—so, I’m also a former journalist, so I’ll use kind of the inverted pyramid: you try and tell the biggest story first, and you sort of, like, work your way down in detail. So, you know, we try and have the headline news front and center, and then, like, additional reports down below. And so in our case, in our domain or category, when you’re talking about a population’s health, and managing that population health, and designing programming and solutions accordingly, you’re asking, really, like, you know, how are we trending, what’s the nature of the risks we face, and how much of it is affecting, and you start drilling down from there.

But first, you’re trying to get a top sight or a bird’s-eye view. So, that part, Brian, was a really design-driven approach, a bit more blank-sheet-of-canvas, to zoom out from this array of things that we could do, to see what’s most compelling from the perspective of people that are trying to use this to share or make decisions about, you know, how their population is trending and what they can do to drive the most positive outcomes. And on our platform, on the part that is data-driven but not analytics-specific, you know, we’re also a gamification and rewarding system, and we do virtual challenges, and we do habit tracking, and so there are a lot of things—levers—that we can bring to sort of solve for risk and engage a population.

Brian: Mm-hm. Can you talk to me about the involvement of customers in that process of working on these data products, but also I think I read something you wrote me on LinkedIn about, you know, “One of the most valuable lessons I learned from running hundreds of usability studies in my career is being wrong, the joy of being wrong.”

Bob: [laugh].

Brian: Can you talk to me about that blank slate, getting something out there, and then getting in front of users; how often are you doing that? And talk to me about the joy of being wrong, because for a lot of the data professionals I know, I think this is scary. Like, “I went to school, I have a PhD in math or whatever, and I’m pretty sharp, and I know my stuff. And the idea of being wrong is scary.” But talk to me about being wrong as a method of learning and, like, how fast can we rapidly learn? And it’s not about being wrong; it’s about the change in knowledge, the gaining of knowledge, right? And what—tell me about that in these products.

Bob: Yes. Yes. So, we have a very strong, you know, quant and qual research team, and you know, it’s well aligned to our product management and our design team. And so, you know—and there are many research methods, and these apply to very data-centric products as well as things that are just more, you know, a bit more consumer-y in nature and more task-oriented.

But, you know, so I mean, to call out some methods, and I can try and share some personal war stories from the olden days when it’s more in the trenches, but—

Brian: [laugh].

Bob: —[laugh], you know, one is basically, you know, creating prototypes—those can be fully-functional prototypes, they can be simulations—and showing those to people, one by one, live for kind of qual analysis, and asking them to, like, think aloud and see where they would click and express what they’re perceiving, what they’re thinking. What they’re perceiving is, in my mind, a really important thing bec—you know, and when I used to moderate these I’d say, like, “Okay, before you click there, what do you expect to happen next? And why—and what do you think you’re seeing? And why would you click there?” And one can even do this, by the way, with, like, paper [laugh] paper prototypes.

And you find when going through this time and time again that, like, what you thought the interface was intending to communicate is not coming across to them for a variety of reasons, or they’re really misinterpreting it, or something has kind of given them a false signal that you didn’t intend. And it could be so many things. Sometimes it’s your color choice; you might have a color choice that you think is just, you know, just on brand [laugh] and they think that color is a meaningful one—if it’s red, or green—having functional purpose, exactly. And so then also, in terms of just classic usability, like, where they click. And this is also especially sometimes in, you know, data visualization, it may not really be clear, like, what to do to activate.

It’s not always crystal clear, like, where you want to drill down, and what is it, and what changes the view and what changes the sort, and what persists the sort, and what persists the filters, and did you lose that view, if there’s a lot of—[unintelligible 00:26:02], you know, on, like, BI-type tooling, if you can kind of interrogate the database. So, all this cognitive walkthrough that people can give you. And of course, you can pick up these signals at scale on quant testing, and A/B testing, and behavioral analytics tools, which we also use. Like, we use Mixpanel, and this sort of thing.

So, those can tell you a lot about what is going on. They can’t always tell you why it’s happening. And so quant and qual can really pair together very well, right? The data kind of tells you what, and then the people shed a light on why when you’re in pursuit of it. And those together add up to a complete package in order to refactor, I think.

Brian: Did you have any particular anecdotes you recall on any of the data products where, like, “God, the whole team thought that this was going to, like, just totally kick,” and then we got in front of people and everyone thought, “Oh, this data means this,” or, “Oh, it’s telling me to go sleep more.” It’s like, “No, no, no. It’s not.” Any anecdotes you might share about, kind of, lights going on, especially for maybe the data team? Maybe they were very insistent that something was right. And it—

Bob: Yeah, for sure.

Brian: —wasn’t?

Bob: Well, this one, the story is a little bit more on the consumer side, but we have a challenge offering that we really like; it’s called the Spotlight Challenge. And it lets people—we have a lot of classic challenges that are very competitively oriented, and there are leaderboards, and there are teams, and there are, you know, challenge boards to, kind of like, shout out and—but we wanted to create something that would also have an aspect that would be inclusive to people that might be less inclined to highly competitive environments. And so we call it a Spotlight Challenge, and it has a personal goal dimension where you’re kind of competing against yourself, [laugh] your own goal and your own history in a certain, like, physical activity, to see how you’re tracking there. And then the leaderboard is a sort of novel visualization in that you’re seeing how other people are doing against themselves, against their own personal goals. Now, where we landed there was, I would say, a long journey to try and get to concepts that really take hold.

The idea of competing, like, against yourself for a personal goal was an elusive one, and we had to just be really explicit in the kind of callouts, in part because people are so inclined to read progress as just a very literal thing, progress against just a straight-up quant: “I ran x many, and that’s my progress.” Not kind of metaphorical progress. And so even for people to understand the nature of this challenge as unique and working differently than competitive challenges, yeah, we had many, many stops and starts on the data to signal that it was something different. And I actually think that this is—I think this is a common design in [unintelligible 00:29:05] that people are primed, that they bring to the table a certain expectation. And so even though you might think your display and your approach is signaling, like, “Oh, this is similar, yet different,” they’ll go with the similar.

And there is something in the UX and psychology world, right, and the human factors world—it’s sometimes called change blindness—where you don’t actually detect the differences; you glide over them because you’re just trying to make sense of things rapid-fire. So, in this case, we really had to accentuate what was different. We didn’t want to lose the fun [laugh] of the challenge, but we really had to bring people through stepwise on, like, “Okay, well, this is a unique thing.” We had to add a little bit of onboarding education at the start and really emphasize this new object that we were bringing to bear.

Brian: Got it. Got it. Thank you for sharing that with us. I’m curious, I know the Advanced Plan for Health, I think—

Bob: Yeah.

Brian: —that’s the one that does some of this forecasting and risk-scoring—

Bob: Yes, absolutely.

Brian: —all of that?

Bob: Yeah.

Brian: So, I’m curious, does anything change about your approach to the way your product management team or your design team works when they’re working with machine learning and AI—probabilistic solutions that are making forecasts and can’t read the future perfectly? Is there anything you’re changing about the way you approach that, how you staff for it, and the skill sets required to work with data scientists on these types of things?

Bob: Yeah, that’s a great question. So, we have data offerings that are really for decisioning and program design upstream, and then we have data offerings that are informing our AI and our personalization layer. And on that side, we have a new feature that we call VPIQ, and it’s based in part on deviation detection. But essentially, it can kind of intervene—almost like noticing a slowdown in your lifestyle and health—and sort of talk back and forth with you to see what’s slowing, how we could get you unblocked, and what could be recommended to you.

And so I know there was an aspect of your question around, like, the predictive analytics capability, and I could speak more to that, but I would say that what’s changing is just trying to have a more embedded, cross-disciplinary view, where the data, the capabilities, and the approach are part of the delivery team—including, you know, the data science and the basis and the ability to test and learn—and the metrics are really being designed in. And that’s different than thinking that there is a software delivery team and then somewhere else is, like, a reporting team, for example. So, I would say it’s this new embedded, cross-disciplinary approach, shared within our R&D team and spanning product and engineering, that is really an evolution of our data direction as well as our data-driven acquisitions.

Brian: Got it. Got it. Kind of related to this—just to close out this topic—I think it was on our screening call that you said you thought productizing data science is tough. What did you mean by that?

Bob: Yes. It’s a good question. Today we have data science that helps us in multiple respects: it helps us, you know, with our metrics; it helps us with our research. And then I think the thing that is challenging from an analytics perspective, when the end user is trying to kind of consume the data—

Brian: [unintelligible 00:33:00]

Bob: Yeah, exactly, in terms of delivering insights, is simply the topic that—to circle back—the topic we started with, which is just, as a user, as a reader, especially a business reader, how much can I take in? And what do I need to make sense of it? And how declarative can you be, responsibly and appropriately, right, to bring the meaning and the insights forward? There might be a line that’s too much—this is what we have, and then we need to talk it through. And then there’s a service layer that becomes really important, a consultative layer. And often that does need to be bundled with the most advanced kind of analytics and decision offerings.

Brian: Mm-hm. Got it. Bob, this has been really fun. Do you have any closing advice for data science and analytics leaders, or technical product managers, about making better data products that actually get used, believed, and trusted—and hopefully help people make decisions? I mean, ultimately, that’s what so much of this is about. Any closing thoughts you want to share?

Bob: Yeah, I think, from this conversation, it really comes down to a period of zooming out. We need to say, like, “Okay, who’s going to use this? [laugh]. What’s their context? What are they trying to do? And what’s the right level and layering of the information based on what we have available?”

And then can we imagine that end to end? Like, under what circumstances did they come here to look at it? Were they assisted by expertise, or was it self-service? What do they need to do with it, how do they share it, and what decisions get made? And do they have a basis, then, to also track whether they made the right decisions, the right calls, and see the impact?

So, I mean, I said some simple, kind of common-sense things, and I would say any of us, myself included, right, can have, like, a discipline [laugh] or domain barrier to imagining these things. It’s like we’re a tainted jury pool, all of us.

Brian: [laugh].

Bob: We really can’t—we already know too much. It’s very hard to [unintelligible 00:35:09] that which we know, and also to un-know, like, our own expertise, to then come back and take a fresh look at things. And there’s no foolproof solve for that. You really can’t subtract what you already know. That’s really the benefit of, like, research or market feedback—quant and qual—to reveal to you what you assumed or presumed was there, and where something was, you know, not fully factored in. Some missing context.

Brian: Sure. Sure. Well, thanks. This has been great, Bob. Where can people, like, learn more about your products that you work on, and you? Like, what’s the best place to go?

Bob: Oh, sure. Well, I encourage everyone to, you know, check out virginpulse.com; we have a lot of information about our platform, our products, and our services. And then you can find me on LinkedIn. There are other Bob Goodmans, but I’m the only one at [laugh] Virgin Pulse, so that could probably help you look me up.

Brian: We’ll put the link there, so people don’t have to guess.

Bob: Yeah, for sure. [laugh].

Brian: Absolutely, yeah. Yeah, this has been super fun. Thanks for coming in and talking to us about your work, and your teams, and all this. It’s been great.

Bob: Thanks so much, Brian. Great to chat with you. Have a good one.

Brian: You too.

