
Metrics toilet? Or indispensable data product?

Today, I want to share a story about how human-centered design helped a very technical product team turn their analytical monitoring app into an indispensable decision support tool, largely (as I recall) without introducing any new data into the infrastructure.

Instead, they changed the experience.

They were able to:

  • Reduce the time their data product "asked" users to put in before getting value out
  • Make the value of their software and data analytics immediately self-evident (less customer time investment)
  • Simplify the experience
  • Help an enterprise end user, who had no idea why his company was paying for this product, realize its value simply through participating in the research

So, what happened?

A software architect had called me in to fix a large data visualization in the product that was supposed to help customers detect problems within a monitored ecosystem of things. They often sold the product in part by showing how it could "see" and monitor all these things, which led customers to believe that the tool was "keeping an eye on everything" for them and would alert them when something went wrong.

Well, it did just that.

Except, you had to open this topology visualization, which could have thousands of items in it, just to see what was wrong.

And you know what?

That didn't help them understand why something was wrong with a given object, what caused it, or what to do about it.

The experience was wrong.

Users did not need to see the entire ecosystem to deal with a problem. Instead, they needed the analytics to provide actionable decision support:

  • What is wrong?
  • When did it happen?
  • What's the severity to our business/ecosystem?
  • What's the remedy?

Customers didn't need to see or interact with the entire ecosystem, but the original design of the product forced them to. From there, the UX was "view this thing, view every stat we have about it, then view the next related thing, and every stat, and repeat." The tool did not provide any useful conclusions aligned with the direct work of the customer.

A giant visualization helped "sell" the product, but it had nothing to do with how a real end user (somebody working in an enterprise data center) did troubleshooting work—which was a core value proposition of the product.

So what did the client do?

After doing some research with real customers, we learned how they actually did monitoring and troubleshooting. What evidence did they need? When did they need it? How much supporting evidence was "good enough"?

When we applied the CED Framework, the client eventually ended up with a solution that significantly reduced the number of screens, charts, and data the end customer had to consume. This translated into immediate business value: the monitoring application was now a useful decision support ally to the end user. It provided a clear story ("what's wrong, and how long has it been a problem?") and remedy steps, all packaged into a single dashboard, tightly designed emails, and a few supporting screens.

How did we know?

We tested it with users.

The salespeople and CEO? They wanted me to test it with their design partners and friendlies. While testing with any valid end user is great, I wanted them to test it with their most dissatisfied customers. One of their accounts was unlikely to re-up at the end of their license, so they let our team engage with that customer. What harm could come, right?

Well, during our test of the new design, the customer perceived us to be "training them," even though we weren't. We were just doing speak-aloud usability testing, with some relevant tasks this customer could relate to. As the customer worked through the sample tasks we gave them with the new design, they started to see the value immediately. "I wish somebody would have given me this training when I came on board!"

Now, an interesting design consideration here is that your enterprise customer may have staff turnover. So, if you're selling your data product to a business, you have a much better chance of retaining the "next" employee to encounter your service in their company if the UX is simple, easy, and shows value early and often. If they open up your data product and end up saying, "uh, what is all this? what's it for, and why are we paying for it?", then don't be surprised if you lose that account, or if your product just ends up in a heap with the other data reports and things that people can't make sense of.

You can have all the right data, but if people can't make sense of it to make useful decisions in the context of their work or play, then it doesn't matter.

If you want to avoid their first mistake and learn how to produce a data product or solution that is engaging from the start, there are a couple of ways: my mailing list, my training seminars, or learning how you can work with me one on one.

 

Photo by David Kovalenko on Unsplash

