The (5) big reasons AI/ML and analytics product leaders invest in UI/UX design help

Summary:

  • The core (5) reasons enterprise product managers and software company founders approach me for help with their analytics SaaS, ML/AI application, or enterprise data product
  • How to identify the slow creep of UI/UX problems in an enterprise data product—especially if your background is in data science and analytics and you've not worked with professional designers before
  • (20) symptoms related to UI/UX design that, left untreated, may lead to the five core problems listed earlier
  • How to think about these issues as a product leader and when to deal with them
  • Where this ridiculous picture came from and why it's relevant

One of the problems with "bad design" is that some of it we can see, and some of it we can't see.

Unless you know what to look for.

Over many years of designing UIs/UXs for enterprise applications that lean heavily on ML and/or traditional analytics, I've learned to spot a variety of "red flags" that tell me early on where a design may be falling short of its intended goals.

When the design/UX suffers, the users pay for it—and when users incur wasted time, friction, confusion, or trust issues, the result is negative business impacts that stakeholders can feel. So, if you're in a product management or business leadership role at a commercial enterprise software company that relies on ML/AI or analytics to deliver value to customers, this one's for you!

Even if you primarily deliver internal data products and solutions, you should be aware of many of these problems too—because even if your users don't swipe a credit card or sign a contract to use your data product(s), the stakeholders funding your work will expect an ROI on their investment. They might say "what's our AI strategy?" or "we need a dashboard that uses AI to predict X," but buried in that ask is an actual business problem they're trying to solve by telling you what the solution is. It's a friendly lie, and you'd do well for yourself, your team, and those stakeholders by getting in front of the UI/UX issues below.

Let's jump in.

The most common business problems that lead data product leaders and founders like you to call me for design help with their SaaS analytics and enterprise ML applications are:

  1. Sales/demos seem too hard or take too long to close, despite the team feeling that the product's value should be significant to the customer. Often this shows up as probing customer questions, muted interest from prospects, and no customer/user interest in changing the status quo.
  2. There is a spike in customer churn / low adoption due to a competitor product with better UX…or a significant customer renewal on the horizon is seriously at risk. This can be a "slow burn" problem, or a whale customer threatening not to renew their contract, putting significant renewal revenue at risk. The product team "got comfortable," and the lack of investment in good UX is now a major risk as customers look at alternatives or demand easier solutions.
  3. The product's IP is technically solid, but customers/users cannot see how this IP translates to value for them without a lot of explanation, training, or time investment. If the product is B2B and a key champion inside the customer org leaves, the vendor's business is at huge risk.
  4. Dashboard/UI bloat => customer frustration => fiscal impact: what used to be a relatively small and simple product has grown more and more complicated to use as the number of views, interactions, features, and use cases it attempts to address grows. The head of product/business stakeholder has gotten that "we need a redesign" itch, but isn't quite sure where to start.
  5. The product manager/founder/stakeholder's vision/strategy for the data product is simply not manifesting in the UI/UX that customers are asked to use. They handed the vision/strategy to a technical team, but that team hasn't delivered a solution the business feels confident bringing to market. "It could/should be better," but they need a fresh vision for what that looks like.

Now, these are all fairly high-level problems.

Let's now look at how some of these problems manifest down at the product strategy and UI/UX layer—so you can begin to spot poor strategy and design choices and get ahead of them before they grow out of control.

The issues below ultimately surface in a UI/UX, but the causes can be many: people, process, communication, a lack of leadership, inexperience, and the like. As such, not everything here will be a "UI problem" per se, but that's often where a problem shows up in the end.

Finally, before I give you the (20) symptoms I've seen repeated in client design engagements, don't take this list personally. If you're a product leader, you've got a difficult job to begin with: little power, maybe few resources, but tons of responsibility to deliver value.

Design is usually not top of mind for an enterprise software product, let alone one in the ML/analytics space. So, you have to decide: "is addressing this worth our time and resources?" Whether you address these yourself with training, outside consulting help, or internal resources (if you're lucky enough to have them!), you first need to recognize some of the symptoms that may tell you it's time to invest more in design.

So, with that, let me tell you some of the symptoms I see in client engagements that should be triggers for you:

  1. The product has bloated and grown extremely complicated as it tries to satisfy multiple users with differing needs. The classic example is the "operational user" vs. the "downstream stakeholder" in a SaaS analytics solution. You want the solution to be "really easy for a non-technical user," but at the same time give a ton of flexibility to the more technical users who "want access to the data...especially export-to-Excel!"
  2. There are no clear, quantifiable progress or success metrics for the product work currently going on. Nobody knows how to measure progress except "ship the thing by date X." Don't look up—there's no time, and they need it yesterday.
  3. You offer dashboard/product customization as a crutch for not deciding what the product/design should be. It sounds better to everyone to offer it, and it'll be hard to remove later, but secretly, you know it's an escape from making some hard decisions about "who is this for? And who is it not for?"
  4. Your product's data viz choices might be correct in theory, but despite picking the right chart types, the product isn't delighting customers or generating the value you hoped for. The UI is right, but the UX isn't—and the lack of adoption and delight is telling you something is off.
  5. There is team agreement that the current design is not good, but it's unclear what a quality redesign would look like. The engineers were given no time to design the solution properly, so they picked a GUI library / viz tool and threw up their best attempt at charting/data visualization/UI in the time allotted. You, the product owner, and your technical teams know something is wrong—but nobody is sure how to fix it "properly" without guessing again, or simply copying a UI/UX seen elsewhere and praying it works in your product.
  6. There is a rush to throw AI (GenAI, now, of course!) into the product wherever possible, leading to the "AI Button" phenomenon. What used to be "Click here to do X" is now "Click here to do X...with Holly, our GenAI assistant!"
    1. On this topic: GenAI solutions, while initially feeling like magic, in reality often require significant effort from users. The fact that "prompt engineering" is a thing, and that there are guides showing "how to get the GenAI to do what you want," is a design problem—but to a tech team, it feels like a data science problem first.
  7. You can't provide me with a specific, short list of "benchmark use cases" by which the quality of your product is (or should be) constantly measured to ensure it stays on course with customer needs. Furthermore, if the product involves predictive models, because there is a "range of responses" possible, you're not sure how to evaluate the UX of these use cases beyond recall vs. precision (see the short sketch after this list).
  8. Your sales org is effectively running the product org by telling them what to build so they can close deals faster at all costs—after all, more sales is the current biz mandate. "Gotta show revenue of $X by Y date." End users who aren't also fiscal buyers don't stand a chance—unless you stand for them and champion the long game.
  9. Your product/eng team is working backwards from the data/analytics/IP they have to create "solutions" they think should/could be useful to users. However, they cannot actually describe a day in the life of the target user, or how the data/charts/visuals would naturally fit into the customer's work such that they would change behavior and adopt your solution. Their hearts and minds are in the right place—no question. However, it's simply not a reliable method for delivering value repeatedly with lower risk.
  10. The fiscal buyer of a solution is not representative of the end user who'll use the analytics for decision making. The staff/users have no say in the solution, but the solution promises the fiscal buyer a financial ROI that sounds logical and looks obvious when the tool is demo'd to them. The reality? The "head of operations" you sold into doesn't know what a day in the life of, say, an ad campaign manager actually looks like. A year down the road, the fiscal buyer has perhaps moved on, nobody in campaign management is using your product, and everyone at your customer's company is wondering "why are we paying for this?"
  11. There is a relentless focus on delivery speed at all costs—but nobody can define what quality means beyond the technical parameters (data governance practices, privacy, SLAs, speed, etc.). Aside: this is also one of my biggest beefs with many "data product definitions" in the internal enterprise context. Most of them focus way too much on technical benefits ("scalable," "reusable," "data contracts") and other aspects that are all secondary benefits to a customer/user. A customer won't buy something or change the status quo until a primary need is satisfied—and this is largely absent from many data product definitions!
  12. The team cannot produce, on the spot, a colorful, descriptive example of their core users and how the data in the product fits into their work life. This usually means there is little to no ongoing user research, nor assets that could be shared with other team members (video recordings of user interviews, transcripts of customer conversations, highlight reels, usability testing results tied to specific outcomes, etc.).
  13. You or your staff cannot get routine, direct access to end users, so the UI/UX is mostly based on guessing and on proxies telling you what somebody else wants or needs. Ironically, you're supposed to be delivering an insights product rooted in "facts and data," but when it comes to design, there's not a whole lot of data and facts informing your choices.
  14. You or your stakeholders think that training and change management should address product usability and utility issues. I think you know where I'm going with that.
  15. Your product's visual design looks like it was designed by engineers...or an engineering POC—but that was OK because "enterprise customers—especially analytics users—don't care so much about how it looks." At the same time, you want the solution to feel like "Apple designed it," because that means "simple, high-value, and useful." You think how it looks shouldn't matter that much to that finance user, but secretly, you know it does...you're just not sure how much.
  16. You did invest in visual design—but it didn't work out so well the (first?) (last?) time you tried it. You liked how the new version looked "cleaner," but somehow, your goals around adoption, retention, usability, and utility didn't seem affected by any of that investment. The designer asked few questions about your data, your insights, the limitations of the data, and other factors you felt would have to be understood to make any real improvement. Now you're worried, your data science and engineering team has a lot of questions about the feasibility of the design and the schedule, and you're wondering: is implementing this redesign really going to pay off?
  17. Your AI/ML solution demos well, but end users aren't finding it quite so usable, explainable, or trustworthy. You started to expose more knobs and controls so they could customize the outputs to their needs, but now what was supposed to be a simpler, ML-driven approach to your product seems to be getting more and more...technical.
  18. The UI surfaces an underlying data model / object model in the application navigation. Got a major entity called "customer" in your data model? Of course there should be a Customer tab in the UI too, right? And of course users should be able to drill down on that...right? (Analytics solutions love to afford users "drill-down-for-detail"!)
  19. The UI, especially key dashboards, is based on a template, product, portfolio piece, or something else you or your team copied. However, when asked "how did you arrive at this design?", the answer is effectively "we filled in the template with our data." At the time, it made sense—but now you're seeing that, somehow, copying what you found in a Google image search isn't working. Ever seen one of those "3-donuts-across-the-top" UIs? Oh yes you have—and that's no judgment against my favorite breakfast item (it's not dessert, damnit!). The developers hate one-offs, so they're telling you that "we need to be consistent" and reuse components, styles, and patterns wherever possible—but somehow, end users aren't so happy despite the donut party you've thrown for them.
  20. You or your stakeholders have a high expectation for good design and user experience...but you've underinvested in the resources that can deliver that. Why? Maybe you got burned in the past by a firm or a staff designer—after all, that portfolio looked good on the surface, but your product doesn't seem any better off in hindsight. You got back UIs with fresh paint...but few questions from the designers about constraints, the data you're working with, use cases, or engineering/data science considerations. Now your engineers have a ton of questions about the feasibility and lift required to build the "new version"—and you're not even sure it's much better (beyond the paint job). The engineers/data team thought that getting outside UI/UX design help would make things easier, not harder—and a part of you got really worried when the conversation turned to fonts, colors, and look-and-feel so soon.
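
A quick technical aside on symptom #7 above: the model-quality side of a "benchmark use case" is the easy, well-trodden part. Here's a minimal sketch in Python (with invented labels, purely for illustration) of what most teams already measure:

    # Hypothetical example: scoring one benchmark use case's predictions.
    # The labels below are made up for illustration only.
    from sklearn.metrics import precision_score, recall_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]  # ground truth for the use case
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # what the model showed the user

    print(f"precision: {precision_score(y_true, y_pred):.2f}")  # 0.75
    print(f"recall:    {recall_score(y_true, y_pred):.2f}")     # 0.75

Notice that nothing in that snippet tells you whether a user understood, trusted, or acted on those predictions. That has to be measured at the task level: task success, time-on-task, and trust or confidence ratings gathered by running your benchmark use cases through usability testing. That's exactly the gap symptom #7 points at.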

Are there more? Yes. Is today the day for them? No! 😉

Look, you wouldn't be on this list if you weren't committed to beginning to make some changes. Keep your head up, even if you found yourself nodding along to many of the 20 symptoms I just described. If you're not quite sure how to address these issues, or it seems daunting to deal with them on top of everything else, I might be able to help. Learn how to work with me directly and set up a free discovery call.

Oh, and by the way—the prompt that generated that image at the top of the article?

"Bad dashboard UX." Courtesy of Google's ImageFX.

Exactly what you expected, right?
