Healthcare professionals need access to decision support tools that deliver the right information at the right time. In a busy healthcare facility, where countless decisions are made daily, it is crucial that any analytical tools provided actually yield useful decision support for the target customer. In this episode, I talked to Karl Hightower from Novant Health about how he and his team define “quality” when it comes to data products, and what they do to meet that definition in their daily work.
Karl Hightower is the Chief Data Officer and SVP of Data Products at Novant Health, a busy hospital and medical group in the Southeast United States with over 16 hospitals and more than 600 clinics. Karl and I took a deep dive into data product management, and how Karl and his team are designing products and services that help empower all of the organization’s decision makers.
In our chat, we covered:
- How a non-tech company like Novant Health approaches data product management
- The challenges of designing data products with empathy in mind while working in an environment of physicians and healthcare professionals
- The metric Karl’s team uses to judge the quality and efficacy of their data products, and how executive management contributed to defining this success criteria
- How Karl encourages deep empathy between analytics teams and their users by closely investigating how the users served by the team make decisions with data
- How and why Novant embraces design and UX in their data product work
- The types of outcomes Karl sees when designers and user experience professionals work with analytics and data science practitioners
- How Karl was able to obtain end-user buy-in and support
- The strategy Karl used to deal with a multitude of “information silos” resulting from the company’s numerous analytics groups.
Resources and Links:
- Novant Health website: https://www.novanthealth.org/
- Novant Health LinkedIn: https://www.linkedin.com/company/novanthealth/
- Karl Hightower LinkedIn: https://www.linkedin.com/in/karl-hightower-4528123/
Quotes from Today’s Episode
“I tend to think of product management as a core role along with a technical lead and product designer in the software industry. Outside the software industry, I feel like product management is often this missing hub.” - Brian
“I really want to understand why the person is asking for what they're asking for, so there is much more of a closer relationship between that portfolio team and their end-user community that they're working with. It's almost a day-to-day living and breathing with and understanding not just what they're asking for and why are they asking for it, but you need to understand how they use information to make decisions.” - Karl
“I think empathy can sound kind of hand-wavy at times. Soft and fluffy, like whipped cream. However, more and more at senior levels, I am hearing how much leaders feel these skills are important because the technology can be technically right and effectively wrong.” - Brian
“The decision that we got to on executive governance was how are we going to judge success criteria? How do we know that we're delivering the right products and that we're getting better on the maturity scale? And the metric is actually really simple. Ask the people that we're delivering for, does this give you what you need when you need it to make those decisions?” - Karl
“The number one principle is, if I don't know how something is done [created with data], I'm very unlikely to trust it. And as you look at just the nature of healthcare, transparency absolutely has to be there because we want the clinicians to poke holes in it, and we want everyone to be able to trust it. So, we are very open. We are very transparent with everything that goes in it.” - Karl
“You need to really understand the why. You’ve got to understand what business decisions are being made, what's driving the strategy of the people who are asking for all that information.” - Karl
Brian: Welcome back to Experiencing Data. My name is Brian O'Neill, and today I have Karl Hightower on, and Karl is going to talk to us about data products. Karl is the chief data officer and SVP of data products and services at Novant Health. So, Karl, welcome to the show. How's it going?
Karl: Great. Thanks for having me, Brian. Really appreciate you having me on.
Brian: Yeah, what’s, uh… what the heck does an SVP of data products do?
Karl: So, my main responsibility, as you see by the chief data officer title, is really the responsibility around managing data. As we look at advancing that into decision-making, it comes into products or services that may help people make sense of all of the data in front of them and really get to informational decisions. Whether that's through analytics, reporting, or, as we start to move into AI, cognitive computing, it's making sure that people have the right product at the right time for them to make decisions on.
Brian: And tell us a little bit, just for context—Novant Health, is this a health tech company, so the product is a software application or a suite of them? Or is that not necessarily what the product is?
Karl: So, Novant Health is a hospital and medical group system in the Southeast United States. We have 16 hospitals, 600-plus clinics, and a tremendous number of employees across the Southeast. Part of this is making sure that across all the systems, people have consistency, and they have the right information to make clinical and non-clinical decisions as needed in order to bring remarkable care to our patients.
Brian: Can you talk to me a little bit about this word ‘product?’ I mean, it's in your title and you're a senior leader here, so I tend to think of product management as something that's—these days, it's a native role along with a technical lead and product design in the tech community. In the non-tech community, I feel like product management is often this missing hub; it's like these wheels are turning and there's often not a hub in the center of them. Can you talk to me a little bit about the context of product management in an organization like yours, especially data product management? What does that mean, and why is it intentional that you use this word?
Karl: So, a product may have a finite life. Having come from application development, and also non-technical roles in the past, the product, as we build and continue to iterate on it, may last a useful life of only six or nine months, or it may be there forever. So, it gets into a mentality around making sure that it is something that is delivered and iterated on as long as the usefulness is there for the people that need to make decisions off of it.
Brian: Do you see that as meaningfully distinct from, say, an analytics group or a data science group?
Karl: So, it's more of the evolution of those groups. As we take it from building to requirements for a report or a dashboard, we start to look at it as a portfolio of delivering products for people to make decisions off of. So, we take it to being able to build a product that may be, let's say, just a multidimensional cube. It may also be an AI-based decision product that is only needed for a certain amount of time until we move on to the next iteration. But by focusing on products and services, as opposed to more specific requirements, what we're trying to do is to get a cross-functional group to think about the decisions that are being made off of it. So, it moves from very crisp and clear requirements to everyone involved making sure that this product is successfully used to drive outcomes.
Brian: How do you ensure that it gets used and drives outcomes? Some people say, “Well, yeah, all of our work does that.” Or, “Yeah, that's what the analytics group does. We do that too.” I think people who tend to think in the product mentality have a more intentional focus on that last-mile experience, and on making sure that there's an outcome—not just an engineering artifact. A thing is produced, but the thing actually has to provide some type of value. So, can you talk a little bit about how you get that engagement, and how you drive for that?
Karl: So, that's a great question. As we start to look at it, I would take it as more than just, “What do you tend to get? What is the user testing?” I really want to understand why the person is asking for what they're asking for, so there is a much closer relationship between that portfolio team and the end-user community that they're working with. It's almost a day-to-day living and breathing with them—understanding not just what they're asking for and why they're asking for it, but how they use information to make decisions.
And there's a lot more—I'd say strategic alignment—around that, and helping to become a part of their team without necessarily becoming a part of their team, because we don't want them to get isolated within that group they're working with, without seeing the bigger picture. So, there is very much an empathy-driven approach to seeing how people are making decisions—living, and working, and breathing with them. And also creativity around how we make sure that the person has what's in front of them in order to actually drive that change that they're looking for.
Brian: And how do you bring empathy into the fold? Especially, there's a stereotype I think, particularly in the analytics community, in data science. It's kind of like, “We're very analytical, we're not creative, that's not what we're here for. We're here to do the hard math.” And so I think empathy can sound kind of hand-wavy at times, and it's soft and fluffy, like whipped cream.
But more and more at the senior level—and maybe this is selection bias—what I'm hearing from progressive leaders is how much these skills matter, because the technology can be technically right and you can still be effectively wrong. So, how do you bring empathy into an organization, to your team? Is it through training? Is it the type of people you hire, just natively? How do you do that?
Karl: So, it is a hard one because I come from outside of healthcare. So, I put my teams in positions where they had to use the products in front of them. And of course, when you start to use the products that you design, that brings in more of a level of parallel reflectiveness on whether or not it's a good product, or whether there are things that are useful within it. When you start to get into a healthcare clinical-based world, that is much harder to do. So, there's a lot of shadowing, there's a lot of working with, as much as can be expected.
But we've also tried to hire UX people, so that user experience-based group that you see in an application team is now starting to manifest itself in analytics. And as we produce products, we have, kind of, what I would say is more of a banter back and forth of asking, “Well, how are they going to use it? Do you understand how they're going to navigate? If they ask this question, what do you see them doing with it?” And then there's the over-complexity of information that you can put in front of someone.
So, you can put as much information, line graphs, charts, all the things of that nature, but what does somebody do with it? So, if they look at it, do they know the decisions they're supposed to make? So, there is kind of a devil's advocate approach with how you're bringing these products out. So, a lot of times, what we see is that it's actually a simpler product, a simpler visualization, which people can act off quickly and really start to then drill into executing off of operational decisions.
Brian: In the design field, we talk about design maturity—and this is more for organizations that have design as a core discipline—but there's a maturity scale there which can range from design as a centralized service, where you call them in and they do some work and then they go away; to design embedded in your team; to, at the far end of the maturity scale, design that's actually integrated into everybody's thinking. It's kind of everyone's job, but you might have a lead designer who's driving that aspect of it—it's not one person's job. So, is that the model you use, trying to spread that capability throughout the team? Or is it more that a body comes in and provides that lead role, and your technical people focus more on the execution and delivery of the technical aspects?
Karl: It is one of the things where you're going to have to seed it as you begin to go forward. So, we brought in a couple of people to show the way of, “This is a better approach.” Ultimately, we would like everyone—no matter what your job is, whether you're a data engineer or whether you're in the program management group—to start to think that way. So, this almost becomes: if we nurture it, if we show what good is, if we continue to reinforce that these are the things that are driving the positive outcomes, it then becomes more of a sea change for us to drive the benefit of what I would call just simplification of decision-making.
Brian: Is that tough to justify that type of hire or role in your organization? Or was it not?
Karl: No. So, it's always tough to make a change when you go through the justification because you're looking at the basic BI roles of report writer, or dashboard creator, or data engineer, and you start to get into, I need a team. Let me go pick my team of people who are, what I would say, just responsible overall for delivering the product. So, if you give that to the product delivery lead, and they can understand what types of people they're bringing to the table, that functional delivery of the UX, and making sure that that comes out, then becomes part of their hiring. So, they understand that need, and they start to interview accordingly so that they can get people more aligned to that actual deliverable.
Brian: Let’s talk—I mean, among the general audience I've spoken to, you're more the exception than the rule here, so I'm curious: what are the social dynamics like on your team, having designers and user experience professionals working with analytics and data science? Does that blend well? Or is it a learning process? How does that work? Those are new combos.
Karl: Well, it's definitely a learning process. So, as we go from the very traditional, just handing the order down the line to people doing their own individual part, you start to get a team together, and one of the things you've got to get out of the culture is, “It's not my job,” or, “Why is this person here? I don't really need them.” Part of it is to create a questioning environment to deliver a better product in the end.
So, as they bring their own expertise within the overall process, it starts to evolve a better process and a better product at the end, which the team then sees and goes, “Oh, okay, now I get it.” And they start to follow that pattern more and more because they believe in it. They've seen the outcome. So, it goes with a few early sea changes that you just have to get them into and get them started working with, and as other people see it and see the success of the cross-functional team, they really start to buy in and drive it home. But it has to be a sea change. Someone has to put that in play first.
Brian: What was the core way that you decided, “Hey, this is a core competency we need to have on our team”? Was there a before/after moment where you said, “Okay, we can't do this without this,” or, “Strategically, we're not going to meet our business objectives if we don't have this”? Was there a moment when this changed, and you went to, I don't know, the CEO or whoever you had to, and said, “Look, we're not doing this right; we need to have this kind of skill here”? Was there a story or experience that made you say, “We've got to start trying this differently”?
Karl: So, me coming from a non-analytics background, and really just leveraging data within decision support systems in different industries, you really got to see how people were using the systems, were using information to make those decisions early on, whether it was call centers or retail point-of-sale systems. So, as I came into this, I had a lot more latitude to bring the level of change needed. And there's a tremendous amount of data in healthcare. There's a tremendous amount of things at people’s disposal to make decisions, and as I worked with some of the senior leaders to understand what they had at their feet to make those decisions, it was evident that it was a lot of Excel—just a lot of information there. And as we started to work through some of the more, I would say, visually pleasing material—we took the exact same Excel, we put it into visualizations—what we could see was, “Oh, I get it. I see three things on this sheet right now that I need to go do.” And it became a visual representation, instead of people having to sift through the matrix of Excel sheet after Excel sheet.
Brian: Yeah—you can say, “I provided exactly the data they asked for,” but at the end of the day, if a decision is not made, that technical work doesn't matter. So, I think asking why, and how you're going to use that—those are critical skills, with the visualization being the formal response to all of those inputs and questions, and the why questions, and all of that. So, it's interesting to hear the approach you took there. When we initially talked on the phone about you coming on the show, you'd mentioned you had recently had, I think, an executive governance call, and there was kind of a mandate that came down. And I was wondering if you could share what that was around—asking people, were you able to solve problems with these tools and applications we create? Could you recount that?
Karl: So, the decision that we got to on the executive governance was how are we going to judge success criteria? How do we know that we're delivering the right products and that we're getting better on the maturity scale? And the metric is actually really simple. Ask the people that we're delivering for, “Does this give you what you need when you need it to make those decisions?” It's a straightforward, “Did you get everything that you needed in order to fulfill your strategy, and actually to drive change?” Black and white. So, it's a yes/no answer. What I don't want is to get into a level of over-analysis in metrics that—what I would consider more old school of how many rows, or is this 95 percent clean? What I want to get into is the perception and the reality of am I delivering exactly what you need to drive change in your business unit?
Brian: You talk about how involved those customers are—I assume most of your customers are—are you servicing internal Novant employees and a mix of actual end customers, like hospital care providers, or a nurse manager, or care practitioners? Can you talk about who that audience is, and how much they're involved in this process of creating these data products?
Karl: It actually runs the gamut of everyone involved, from care providers out to products that may be out at the mobile unit, as we start to look at delivering care not just within brick walls, but through the digital and the virtual experience. So, providing people with that information to make decisions. And the level of involvement really comes into understanding what they're trying to achieve, and sitting with them, working with them, to see if we can't get something that meets that need. And this is an evolving thing as we go through, because healthcare has been very requirements-based; it's been very, I would say, SDLC-driven.
And as we move towards the more agile approach of constantly iterating, and working with people to refine and to deliver, it's also taken some time for the people to get used to working in that environment. It's different on both sides of the equation because now the teams that would just throw something over the wall are actually engaged fully with the teams that are delivering it. So, it's kind of a, hey, we're behind the kitchen walls, and we're actually seeing how things are made and it's a little bit different. But they're also understanding what's possible, and I think that's a good thing because it makes the teams actually deliver more value, as opposed to trying to define things that aren't really within the scope of what they should be looking at.
Brian: Mm-hm. And do you have a point person, like on a—I don’t know if you call them, like, a project team or something—but is there a point person that kind of leads these discovery engagements and engagement with this end customer? Does that role float around? How do you structure those?
Karl: Depends on the portfolio, and we really try to have a portfolio lead who is a more senior-level person and has a level of respect within the different business units. So, they are also a technical bridge to that business experience. Now, they have to be willing to also bring in the people at the beginning of the process. They can't just say, “I'll grab you when I need you.” Being a very inclusive person is key to driving this. And that comes down to a personality of not just listening, but also bringing in people to make sure that they're all at the table; we all understand what's driving this. And they're also the type who will follow up and make sure that things are successful, constantly. It really has to be a special type of collaborative, team-building person.
Brian: Mm-hm. Do you rely on your UX people to do this work, or not necessarily them? How do you—
Karl: Not necessarily. The UX people are involved in this, we have solution architects involved in this, and engineers, but more of the portfolio lead is someone who's pretty experienced on understanding how decisions are being made.
Brian: And do they tend to kind of privately interface with these customers, and then relay information back, or are they simply facilitating group work with these end-customers?
Karl: There's a little bit of both. Obviously, they need to beat the bushes and figure out what's going on, and also understand projects that are being driven, but they also need to make sure that they're helping to facilitate and helping to get those group meetings. It just helps everyone hear what's going on, what the success criteria is, and what's trying to be achieved. I think everyone hearing it and not just being told their part drives a lot better product in the end because we're all able to question what's going on in a healthy environment to drive a better, more honed product.
Brian: And does this process change when we introduce cognitive computing, machine learning, these other technical methods for solving problems? Are you seeing a need to change the discussions or change the approach, or is that really just another tool in the toolkit?
Karl: I mean, I think it's another tool in the toolkit and it's interesting as we get into the machine learning and the natural language processing. I have always seen that as a means of being able to reduce what I would call just stressors. So, I can't process 1000 images—the human mind—that's just too hard for me to do. I can't process 2 million records and be able to do correlative statistics inside of that. That's what machine learning and the data scientists bring, but it's also—they need to hear how people are using it. What are the decisions that are being made when they're done, and what's the context, and the flow by which a person is using the tools or the information that's being delivered? I think that's important for them to understand how models are being used because otherwise there's a miss in the actual intent of the model and sometimes that may not be a good outcome overall if they don't hear the full story.
Brian: Can you give an example of that?
Karl: So, as we've gotten into, what I'd say, the COVID-19 pandemic, and we see all of the models and all of the things that are going on with this, it's not just the curve, it's not just the peak, it's not just the infections—it's really what you do with all of the information at your disposal. Because it comes down to taking care of a person, having the right supplies in place, having the right personnel in place. There are a lot of downstream activities that people have to trigger based off of what they see in front of them. So, all of the models, all of the information—that's great, but we have to understand how it's being used, what it's being used for, and then how we want to react to that as a group. And I think that's important because then, as we see more and more things that are tied to this, we bring in an appropriate amount of other information that can be used downstream.
Brian: Are you tired of building data products, analytics solutions, or decision support applications that don't get used or are undervalued? Do customers come to you with vague requests for AI and machine learning, only to change their mind about what they want after you show them your user interface visualization or application? Hey, it's Brian here. And if you're a leader in data product management, data science, or analytics and you're tasked with creating simple, useful, and valuable data products and solutions, I've got something new for you and your staff. I've recently taken my instructor-led seminar and I've created a new self-guided video course out of that curriculum called Designing Human-Centered Data Products. If you're a self-directed learner, you'll quickly learn very applicable human-centered design techniques that you can start to use today to create useful, usable, and indispensable data products your customers will value. Each module in the curriculum also provides specific step-by-step instructions as well as user-experience strategies specific to data products that leverage machine learning and AI. This is not just another database or design thinking course. Instead, you'll be learning how to go beyond the ink to tap into what your customers really need and want in the last mile, so that you can turn those needs into actionable user interfaces and compelling user experiences that they'll actually use. To download the first module totally free, just visit designingforanalytics.com/thecourse.
Brian: Is it ever challenging to get face time with the people you need? Because I know in some organizations, especially as you move into enterprise software particularly, there can be resistance within the organization to allowing what I would call customer access to the team, the tech team, or the product team, whatever you want to call it. There's gatekeepers, or simply they're just too busy. You've got, for example, whoever's managing the ER at this time, yes, they need some dashboard, but right now they're saving lives, and so they don't really have time to provide that feedback. Do you struggle with that at all, and how do you make useful products in that context, where access to the end-user is difficult?
Karl: So, that's one of the challenges I thought I would run into here. The organization has been really, I would say, thirsty for this type of stuff. So, when we started to deliver a few things upfront and got some successes early, what happened is that people have been really involved in the next project, and the next project, and you're getting a lot of involvement and a lot of buy-in. Now, granted, things have been extremely busy since the pandemic began in March, but people are still very willing to go above and beyond with their time in order to work together for an outcome. As long as they see an outcome that's really helping to drive benefit—that remarkable care for the patient and the providers, and better outcomes—there's been a lot more of what I would say is commitment of their time above and beyond.
Brian: Was that always the case?
Karl: You had to show early value. So, this is one of the ones where I spent the first three months that I was here going out and spending time listening. I really went on a listening tour because I wanted to understand what the experience was before I arrived; how did things evolve to where they are? And also, what did they want? What do they need to do their job? And also, what were some of the things and suggestions that they wanted to bring to the table on using information. And it was more of a figuring out culture, and how we were going to fit inside of that, and really drive across all of the teams that existed.
Brian: Can you share some of the wish list? What were they unhappy with? What did they want? And how did you change your team, the way they would address those things?
Karl: So, when I first arrived, there were many analytics groups, and a lot of silos of information, and we weren't able to piece together the whole story across the different groups and the different systems. So, a lot of the early days were spent patching that together, and I also used a couple of opportunities early on to force teams to work together. So, almost forced cross-functional collaboration—taking those opportunities to, one, take a team from all the different groups, put them in a hackathon, and send them off for a week to go solve a problem together. And surprisingly, they had not worked together, and they should have in the past.
So, by working together, what they did is develop a quick relationship, and they developed an understanding that this was going to be a lot better way, and that little success then led to other teams talking with each other and finding opportunities where we could do the same. I'd say that cross-functional work, and then starting to look at what other things we can do, and also showing a means of being able to deliver success from a visual standpoint. So, providing them tools that were just easier to collaborate on. Some gambles on my part with, kind of, providing open-access tools, and changing some of the access to data. But in the end, people wanted to do the right thing, so by offering these, we got a little bit more buy-in early on, and then those types of things led to what I would say are cascading wins.
Brian: Was there resistance from these teams, initially, to go do this type of work, or were they eager to jump in on it?
Karl: That's kind of a mixed bag. So, obviously, there are some people who just want to do that, and then there are those who will fall back on the old crutch of, “Well, this is not the way that we do it. This is the way that we've done it in the past.” That's where you hopefully want them to make a choice rather than having to force them. You'd rather they choose the carrot than have a stick, and if you show them the benefits overall, then they should be going and driving that change themselves.
Brian: Are you still—do you feel like you're still transitioning that, or that's a thing of the past at this point?
Karl: Well, it always comes and goes. We're a tremendous amount better than we were, but you always will see fallbacks into old behaviors, especially when stress comes up—the speed of having to do these things. But for the most part, teams work really well together. They talk all the time, so you don't have to force the conversations; the teams understand that those conversations between them drive better outcomes. So, for the most part, it's been a lot better, but obviously nothing's perfect. People are people.
Brian: Right, right. Is there a core product team or a product model that you send out, where there are certain hats that need to be worn? Whether it's two individuals or four, there are distinct roles, regardless of people's job titles? Like, we always put in a tech lead, a UX person, and a subject matter expert, or whatever. Do you have a recipe that you follow as a baseline for a project?
Karl: There tends to be a, I would say, standard set of roles that we're looking for. But this is also one where you have people who've worn multiple hats, and for this particular need, they have more expertise in how the data is formulated and how it's used within the workflow, and they don't need to be the data engineer on this piece, just because of their level of expertise here. It's also one of the ones where it depends on the subject area within the business and the type of product that we're delivering. The teams are fairly flexible on being able to move with these things, and it's also one of the ones where people are looking to grow their skill sets, so there's a lot of opportunity for them to do so as we bring in the different tools. As we look at the Databricks, the Python, R, and different visualizations into it. People are excited to learn and to grow.
Brian: Yeah. With the advent of machine learning and artificial intelligence, on the customer end, we still hear that low engagement is a problem, along with trusting what went into the model, particularly if they don't understand all the math that may be behind it. Can you talk to me about explainability of the models, being able to trust this technology? Is that a challenge? And if it's not, what did you overcome so that it isn't?
Karl: So, this is where transparency comes into play. The number one principle is, if I don't know how something is done, I'm very unlikely to trust it. And as you look at just the nature of healthcare, transparency absolutely has to be there because we want the clinicians to poke holes in it, and we want everyone to be able to trust it. So, we are very open; we are very transparent with everything that goes in it.
And it's also one of the ones where people have to not take that as criticism when people are asking questions. So, when you're being questioned on something as to, “Well, what does this mean? Well, what about X case?” It comes into, talk through it. The more that everyone buys into it, and everyone agrees on it, then that goes to trust, which translates directly to the people on the receiving end of the care. So, when everyone trusts what's going on and the transparency is there, what you find is that it starts to take off and that the teams really believe in delivering that value, and there is no, “Well, this is—I don't believe what I'm seeing.” No, there's a lot of, “No, we know what's behind it, and we actually really believe in it, and we're going to make the decisions accordingly.”
Brian: I would assume that's because they were involved early in the process of creating—
Karl: Absolutely. This is one where it’s a level of uncomfortableness if you're not used to having everyone there at the beginning. And that's where that team mentality comes in. It's not just the one person giving to the next person, and the order goes down the line. This is, that team is responsible—no matter the role—for the success of what's being delivered. And that means everyone's involved upfront. And there's a lot of trepidation when you begin a process like this because it's one of the ones where, “Just trust me. I know what I'm doing, and I'm going to go build it.” That's not—that's irrelevant. No, it's everybody together at the beginning, and that act of just working together creates a much better product.
Brian: Yeah, I think you got to get assumptions on the table. And I think—I don't know, the best designers that I know of, are able to take away the assumptions, and there's a lot of questioning going on because we don't want to bias things with our, what I call self-reflection. You want to almost assume you don't know anything about it, to try to surface all this stuff early on in this ideation process, so that you don't just go off and build what you think makes sense for yourself, based on all your past experiences because this is what builds risk into the solution, and then you find out later on, it's not resonating with somebody. So, I think that's a really important skill to have these voices at the beginning. Do you look at that?
When I teach my seminar and I talk about—the team is one of the earliest modules, and I kind of have this concept of the inner ring, and then you have the medium ring, and the outer ring, and you may only have one or two rings, or even three, it really depends on the company size, but you always start with a group. And the group may not be everybody, but it minimally is going to involve that customer, the voice of the customer, a technical person, a product or UX type person, and, obviously, someone who knows the domain, and you need to get buy-in from that small group, and it makes it easier to go out to the next ring of stakeholders. Maybe it's a secondary user who doesn't use it too much—or maybe a senior leader who's not going to use the tool, but they're blessing it, and their subordinates are using it, so they want to be involved. Do you also look at it that way where, maybe you don't start the initial project with 20 people, but there could be 20 people that matter?
Karl: No, absolutely. There's also something else that I use, and it's not necessarily starting an argument, but it's more creating a, when you're getting into what I call the 360 degree of collaboration, it’s who's going to be the VP of common sense? So, who's going to be the one who kind of disassociates themselves, and just asks what are just, kind of, ridiculous questions. But out of those ridiculous questions, what you find is people are going to have to dig deep and explain what's going on, and there's a level of having to explain things that also makes you look at what you're doing. The same reason that paired coding took off. It's, look, I have my view. Sometimes I get so caught up in my way of thinking about solving that problem, that I don't take a step back and go, “I actually missed the point.”
And that's where you got to have a team that can move quickly, can have that kind of trust, but they also need to understand by questioning, it's not a personal attack. This is part of how we learn and how we grow, and that's a different culture than people feeling like they're being attacked for questioning. One of the things I tried to bring to the table is, anytime that we're having a disagreement or a discourse around a problem, I want there to be an understanding that there are two possible outcomes: I learned something, or you learn something. But on the whole, we need to be learning, and growing, and continuing to foster that type of environment. And by doing that in the smaller context of the team, and then taking that outward, everyone within that team now has the ability to communicate out the point, the vision, and be able to handle those questions as we move forward.
Brian: Yeah. Do you use any type of design, or testing, or validation to settle those kinds of disputes about what's needed, and do people understand how to use it? Do you do any type of testing—usability testing would be a classic example, but some type of validation of the solutions?
Karl: It depends on what we're trying to deliver, obviously, because there are different things—whether you're talking about a machine learning model or a decision support system, they may have different SLAs, they may have different criteria. But I always want to make sure—the final test is, does this solve the problems that you intended it to? Data validation is interesting—it may look like the form that you created, but is this something that you're using to make decisions, and is this helping you to do your job without having to, what I would say is, overexert yourself mentally?
Brian: But is there a way to make that pass/fail, like, there's a way to measure that? Is it just self—is it just kind of a reaction from them, or do you actually put them through some type of protocol to actually validate, “Yeah, it's taking two hours to answer this question, and we set up 15 minutes—”
Karl: And in the past, that's part of the non-functional requirements that I've used. So, this goes into the evolution of how we get to that point from the rigors of testing. As we get more and more comfortable delivering things, then we'll start to get into more and more of the non-functional requirements of signing off on something. So, there's obviously a maturity that has to be built, and the end goal is to have the teams understand it, know how to deliver on it, and treat the non-functional requirements of fitting within those windows of decision-making as just part of how we operate.
Brian: Yeah, I think decision-making can be such a gray—it's not often—it's not always totally binary because it may be—I made an uncomfortable decision with this [laughs] information, that I wish it had had these other things—“Did I make a decision with it? Yes, because I had nothing else.” That's not the same thing as, “I made a relatively confident decision.” Do you think it's—go for the basics is kind of the mentality across everything, or is it, “No. It's not good enough to just say I made a decision.” It has to be, “I made a confident decision, or I made the decision within a small amount of time?” Do you qualify these things in each product or project?
Karl: So, that actually depends on what it is. So, let's say that I'm looking at maximizing appointments, or schedules, what I would consider inventory. So, as I look at that, all of the information may be there, but never forget that someone has to act on it. And within the windows that they have to act, that's an opportunity lost if it takes too long to compile and be able to work with. And that's where we really want to get to: yes, all the information is there, but does it reach the audience who's making that decision?
And how easy is it for them to, one, make the decision and then also be able to drill into it? So, for them to be able to justify their decisions, does that have a layering of being able to stack on top of it, to not only go from the easy decisions but also be able to justify the hard ones as they go through. So, that may be part of the, just, iteration, and also the understanding and working with that person. So, if you watch how they work, if you understand what they're being asked to do, that gets easier and easier. So, it evolves over time to become a much more cohesive team decision, the more that you understand how they work, and what they're trying to achieve.
Brian: Yeah. And it takes time to go do that. I think a lot of groups don't understand, if you're not doing any type of—you know, you're not interacting with your customers, doing the ride alongs, doing research of some kind on a regular basis, you're really going to go native, and it's really easy to build the wrong thing when you have no sense of what someone does in their day-to-day job and how much—you spent the last two months working on this dashboard. They're going to spend five minutes on a Monday at 9 a.m., and that's it. And their whole life is so different than yours, and if you can't pull yourself out of that, I think it's really hard to be successful with tool-building and product-building, so I think it's really important to have those ongoing conversations on a regular basis, so you don't go native. I don’t know [laughs].
Karl: Yeah. And that also goes back to—let me just preface this by saying I'm not a finance person. I get it, but one of my jokes is that they see the Matrix clearly, of Excel and all the information at their disposal. And part of that is, okay, so what do we do with non-finance, non-detail-oriented people? Do they see clearly the decisions that they need to make, and is it right there in front of them, and can they then drill into understanding the details? So, it's got to be a story that's told off the decision-making, and that's a very different thing for some people who are very data- and detail-oriented—to step back and help that story evolve.
Brian: Yeah, yeah. Karl, this has been super helpful. Thanks for sharing your ideas, and I'm just curious, kind of in closing, if you were picked up out of Novant Health and plopped in—you've moved, and you're in a new state and new job, and you come in to run an analytics group or data science group, and, generally speaking, they're answering tickets: “Oh, you need a spreadsheet with these columns? Sure, we'll work on that and throw it over the wall.” What do you change in the first month, first six months? Like, they're not putting out stuff people want to use or can use. What do you change?
Karl: One, you need to really understand the why. You got to understand what business decisions are being made, what's driving the strategy of the people who are asking for all that information. And then finding an easier way to deliver the same stuff to them, as opposed to relying on—my bet is going to be that they have someone on the other receiving end who's doing a lot of translation on this. So, going and talking with them, understanding why they're doing what they're doing, and figuring out an easier way for them to get out of what I call the data jockey business, and really into decision-making.
Brian: Got it. Good, good advice. How can people follow your work? Are you on social media? LinkedIn? If people want to stay in touch and just, kind of, learn about your experience?
Karl: Definitely LinkedIn. Novant Health is very active in talking about the things that we work on, whether it's the patient care, the new innovative processes, or the AI—we’re on LinkedIn. And always feel free to ping me on LinkedIn if you just want to talk about some of the things that are going on within analytics, not just in the healthcare space, but the trends overall within the growth of the industry.
Brian: Well, Karl, it’s been—that’s—and by the way, folks, that's what I did. I was just interested in Karl's title and background and profile, and I reached out to him, and he was happy to chat with me. So again, that's sometimes all it takes is picking up the virtual phone, so to speak. So, thank you for agreeing to do that, and letting us go into this deeper conversation here on the show. So, it was great to talk to you.
Karl: Thank you for having me, Brian. Really appreciate it.
Brian: Yeah. My pleasure. Cheers.