Does your data product enable surgery, or healing?

I recently started playing percussion in a new Celtic ensemble in Boston called Ishna, and we were invited to be guest artists with Symphony NH (New Hampshire). After our concerts concluded, the executive director invited Ishna to a dinner with some of the symphony's staff and board members. This is pretty typical: board members of performing arts organizations often like to meet the guest artists. At the dinner, I sat with a pediatric orthopedic surgeon and his wife.

So, what does this have to do with analytics and data products?

Bear with me 😉

When my wife asked the surgeon more about what he does for a living, he replied, "While yes, I am a surgeon, what I often end up doing is trying to convince families of my recommendations, especially when the best solution is to do no surgery and let the body heal naturally. Time and patience, not surgery, is often the best course for children, but that's not always what they want to hear."

So then I said, "Maybe you should change the sign on your door or tell parents that you're a healer of children's bone and joint problems. You can then tell them that the tactics for healing sometimes vary, ranging from patiently doing nothing up to surgery in the worst case."

Looking a little perplexed, he then said, “That's it! I've never heard somebody explain it like that, but that's exactly what I do!"

I then added, "Surgery is a tactic that may lead to a fix...or not. Healing is an OUTCOME. A result."

Maybe he can explain it that way to the next patient/family.

Back to you.

When you look at the data product or custom decision support tool you are developing, are you keeping outcomes in mind while you're engaged in the tactics (model choice, data wrangling, feature development, integrations, UI design, engineering, etc.)?

Data products may be required to generate specific results (e.g. predictions, prescriptions, root cause analysis, answers to specific questions) or to provide a platform for "self-discovery" in the data. Ultimately, though, "analytics" isn't really an outcome, and the final interface, UI, or report isn't the "end." A fully designed user experience seeks to ensure that a downstream user and business outcome is actually realized. Design considers both the business objective and the human involvement required to lead to that outcome.

As an example, let's say you're working in an internal analytics capacity and have been tasked with creating a model that predicts customer attrition. In the end, if nobody does anything downstream to retain those customers, then you provided a model or analytics (i.e. tactics), but you did not deliver an outcome ("prevent/reduce the attrition"). A good product and service design approach would look at the entire workflow, and all parties involved (even if they're not on the data or engineering team), to maximize the chance that people act on the model's results and the outcome of reduced attrition is achieved. This might also mean you have to involve several departments in the design process so that the entire end-to-end process is understood. I understand that maybe your product/staff/service is not directly responsible for retaining the customers who land on the model's generated "watchlist." That's fine, but ultimately, somebody needs to keep asking: are we doing data science and software engineering, or are we helping to stop customer attrition? This is important because the software alone may not be enough to achieve the reduction in attrition.
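
To make the tactic/outcome distinction concrete, here's a minimal, hypothetical sketch in Python (the article doesn't prescribe any tooling; the scikit-learn-style model, the column names, and the 0.7 threshold below are illustrative assumptions, not a real implementation). Notice how little of the "outcome" lives in this code: scoring customers takes a few lines, while everything the design work has to uncover (which fields the retention team actually needs, and where this watchlist shows up in their day-to-day workflow) sits entirely outside of it.

```python
# Hypothetical sketch: turning churn-model scores into the retention team's "watchlist".
# Assumes a trained scikit-learn-style classifier and a pandas DataFrame of customers;
# the column names and the 0.7 threshold are made up for illustration.
import pandas as pd

def build_watchlist(model, customers: pd.DataFrame, feature_cols: list[str],
                    threshold: float = 0.7) -> pd.DataFrame:
    """Score customers and return the high-risk subset, sorted by churn risk."""
    # P(churn) for each customer, per the model (the tactic).
    scores = model.predict_proba(customers[feature_cols])[:, 1]
    watchlist = customers.assign(churn_risk=scores)
    watchlist = watchlist[watchlist["churn_risk"] >= threshold]
    # The extra context columns are the kind of thing design research surfaces:
    # what does a retention rep actually need to see before picking up the phone?
    return watchlist.sort_values("churn_risk", ascending=False)[
        ["customer_id", "account_owner", "renewal_date", "churn_risk"]
    ]

# The model and this list are tactics. The outcome depends on what happens next:
# does the list land inside the tools the retention team already uses, and does
# anyone act on it?
```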

Fortunately, this is a place where the practice of design and UX can help. In fact, some in our industry call this form of design, "service design," as it takes the principles of design and applies them to a problem that may be larger in scope than a single software application, report, or artifact.

As many of you probably know from being on my list, data product design is not just about data visualization and visual polish. A design strategy looks well beyond user interfaces to help define what is required to enable the desired user and business outcomes.

  • Design gets the stakeholders and team aligned around the specific business outcomes that are desired (in this case, "reduce attrition by X over Y time [etc.]"). Don't overlook this: in many cases, goals are clouded in tactics, such as "we need a design/report/UI for the model we are building to predict attrition." The UI design is an activity and a tactic; a report is an asset/output. Neither is an outcome, and simply putting a nice interface on top of your data science investment may not be sufficient to achieve the desired outcome. We keep hearing about "getting a model into production," but this is also not really the "end point" of the project. The right people have to take action on the model's insights, or it doesn't matter that the model is in production.
  • Design maps all the assumed touchpoints and roles/depts/people involved, so that the business understands everything that might be required for this new tool to actually influence/generate the desired business outcome.
    • Example: if there is already a retention process / SOP in place inside the department responsible for this, and your model is allowing that team to be "proactive" for the first time (e.g. "call these customers before it is too late"), then a successfully designed experience would ensure that your decision support tool is modeled around how this team already does, or wants to do, its retention work. Unless your software introduces a brand new business procedure, you should not expect your users or customers to conform to your design/tool or to accept it at face value. The tool should sit quietly within your customers' workflows and SOPs as much as possible. Don't disrupt; just be a super handy, useful, beautiful helper along the journey. Most of all, realize that you cannot just "look" at your tool and pass judgement on whether it meets these criteria. You have to get out there and evaluate the software with real people by conducting research such as a usability test. If you don’t know what their journey currently looks like, it’s super hard to design a service that falls neatly into their current workflow.
  • Design looks beyond the software/model/analytics to identify other obstacles that may be preventing attrition reduction. Examples:
    • Org problems, such as poor or nonexistent communication between departments
    • Lack of incentives (e.g. the retention team doesn't care about your model's results)
    • Poor user experience across apps/services/data (e.g. your model's predictions are not surfaced in a useful tool, at the right place and time, with little to no effort required).
      • Example: Perhaps the retention team has a software app they use all day long (e.g. a CRM), but your model's predictions were prototyped in some UI that lives at a cryptic intranet address nobody can remember, behind a special login credential they don't have (and no reset-password button or method to contact any human responsible for building the tool). Come on: we've all been through this before, right!?
    • Poor staffing/resourcing (e.g. somebody allocated budget for the data science and product/software creation, but they did not factor in what changes might be required to get the sales/retention team on board with putting the software's predictions into practice), etc. If nobody uses the service, then it didn’t facilitate the outcome.
      • Example: let's assume we have produced a predictive model for attrition. Who is going to take the model's report/UI/analysis and do something with it? If another department is in charge of "retention," do they know about this project? What tool, organizational, time, or incentive problems might prevent these downstream users from using your model/product/software application? Can we involve the retention team in the design process to increase their engagement with the resulting software/report and to help them understand how it will help them do their job or achieve the desired company goal?
  • Design helps identify possible model improvements simply by having all the stakeholders and departments involved in the design process. For example, because the retention team was included in our example project above, they may directly or indirectly inform what improvements could be made to the software and the model in the future. They might also identify a need or desire to "leave feedback," especially if, early on, the model isn't as accurate as hoped. Their feedback, while qualitative, can provide valuable insights that help inform future software iterations.
  • Design creates champions for the data science/product team: by making the retention team a key stakeholder in the process ("we're here to make your job on the front line with customers easier"), you're more likely to get their involvement in the future. I might also add that it can be refreshing and FUN. It's fun to get into other people's worlds, see what their jobs are like, and figure out how you can help them, even if you know nothing about "retaining" an unhappy customer.
  • Design saves money and time. Why? In our example above, you may not need more than a prototype, even a low-fidelity non-software prototype, to "exercise" the end-to-end process. A list of actual customers isn't required to figure out how the retention team wants or needs to interact with the list. By taking an agile, MVP-oriented approach, you can avoid becoming one of the 75%+ of big data projects that fail. You might even save money by telling your executive team that "through our design and prototyping work to date, we think this data science project may fail: not because of technical reasons, but because we don't think we're going to get the right people to engage or change their behavior at this time."

Personally, I’d rather be a healer than a surgeon. And while I hope surgery isn’t in my future any time soon, if and when the time comes, I would much rather go to somebody who sees themselves as a healer first and a surgeon second.

Wouldn’t you?

I hope you’ll go forth and design with outcomes and healing in mind first. If you need some help using design to generate better business outcomes with your software, you can set up a free call with me on my website.
