Reasons your next sprint, product, or project might fail

Good design happens at the intersection of discovering real user needs/wants and business goals that are ACTIONABLE (by design and engineering). Yes, a little magic/instinct creeps into good design too, but you can get far without it. It's really more about nailing the problem set and having clear goals. But what does that mean?
When I am talking with a prospect about a new project, I ask them to tell me their business goals for the project/app/product. More often than not, I get a reply like this:

"an elegant user interface"...
"easy to use"
"supports our data infrastructure"
"like Apple/Iphone"
"easy for engineering to build"
"a new dashboard/visualization"

This usually requires a follow-up conversation, as none of these are really business objectives at all.

They are design objectives...the desired end state of the user interface. And guess what?

They're ones that almost EVERYONE wants for their application. It's almost like saying, "we want funding for our project so we can move it forward."

#metoo !

Most of you on this list are stakeholders: PMs, execs/founders, analytics leaders, engineering leaders, and data scientists. Whether it's your role or not, demand or help create clear, design-actionable goals for the project. Without them, you're simply at higher risk of failure because ultimately, by the end, your customers, users, and superiors are definitely going to be passing judgement in retrospect. Why not establish the means for "passing judgement" at the beginning and share it with the team?

Here are some other warning flags to keep in mind when creating or evaluating your project's goals: 

  • Engineering implementation requirements masquerading as business goals ("The app will use API xyz v2.0.1")
  • UI implementation detail masquerading as business goals ("will have a button that allows XYZ filtering and mapping") 
  • Little to no discussion of the goals prior to execution work commencing (whether it be engineering, data science, design, etc.)
  • Lack of a UX person or knowledgeable facilitator who can guide the maturation of the goals from vague to design-actionable. (No offense to resident UX folks, but as a general rule, I find that internal resources aren't usually going to push as hard to get the goals stated clearly. Why? They usually feel like they have less skin in the game.)
  • The belief that because we use "agile," it's less important to establish design-actionable objectives

You can always adjust the objectives as you move forward and learn, but when you get the core ideas defined up front, EVERYBODY involved wins and EVERYBODY can usually come together on course corrections.

And one final note: you can still have goals, even with projects such as machine learning where you may not know ahead of time what the outcome is. A business goal can still be stated for this, e.g. "Perform a minimum-effort lab-style experiment to determine a set of future business problems/opportunities [along the lines of X, Y, and Z] that may be possible to satisfy using [data set x]." In this case, our outcome is not a solution, but a set of possible future problems. The point is, we tried to make it CLEAR and actionable to the stakeholders, and the team, from the start. 

Dashboard Design: Is Maximum Simplicity the Goal?

Should maximum simplicity dictate success?

We all love usability these days right? "User experience is important."

Of course it is!

But, it doesn't mean that every user you show the design to is going to, or should immediately be able to, understand fully what you're showing them.


Most valuable things in life take a little investment/effort. Your dashboard can probably push boundaries and comfort zones a little bit as well, especially if it's a service that will have somewhat repetitive use by your users.

Before I continue, a caveat: this advice is largely for operational dashboards, used somewhat regularly, by people looking to take action on insights.

I'm not advocating for sloppy work, or making your dashboard needlessly complex. However, don't necessarily design with a goal of making the first impression as easy as possible. There is only one first-time UX, but there will be many more "Nth"-time uses of your service, right? As such, if you're designing primarily for the "Nth" visit, don't necessarily expect that users will understand everything on the first viewing. It is possible to design a high-quality dashboard that doesn't get great user feedback on the first viewing.

Also, if a user hasn't yet put in a little effort to understand the value of the data/information you're showing, and you've done a good job presenting it as elegantly as possible, then don't necessarily water the design down to accommodate the newbies. It may be OK as-is, and you can set up your test plan/protocol to validate whether you've pushed it too far, or done a good job of providing rich, relevant information density.

For many of my clients, their assumed path to dashboard simplification is "remove stuff" and use "drill down" to provide detail. Don't get me wrong; I love using subtractive techniques over additive ones when it's the best choice, but there are times when you need to add data to increase the value and insights. The drill down may not have the "summary" information that is helpful in knowing whether you should even *bother* to drill down.

As a product or analytics leader, your job is to find the balance between too much/too complex and too simple/too low in information density. There is no single clear-cut definition of what too much or not enough looks like, but don't just worry about the design being "too complex." You might be making it "too simple." It's possible to design a product/service that is super easy to use, but with very low value or utility.

Maximum value, on the other hand, is obtained when an analytics solution has the trifecta of utility, usability, and beauty/desirability. It is hard, though, and usually quite expensive, to get a service to that level without intentionally designing it to be that way.

Video Sample: O’Reilly Strata Conf

This is a recording of my presentation at the O'Reilly Strata Data Conference in New York City in 2017.

Do you spend a lot of time explaining your data analytics product to your customers? Is your UI/UX or navigation overly complex? Are sales suffering due to complexity, or worse, are customers not using your product? Your design may be the problem.

My little secret? You don't have to be a trained designer to recognize design and UX problems in your data product or analytics service, and start correcting them today.

Want the Slides?

Download the free self-assessment guide that takes my slide deck principles and puts them into an actionable set of tips you can begin applying today.

UI Review: Next Big Sound (Music Analytics) – Part 1

Today I got an interesting anomaly email from a service I use called Next Big Sound. Actually, I don't use the service too much, but it crosses two of my interests: music and analytics.

Next Big Sound aggregates music playback data from various music providers (Spotify, Pandora, etc.) and also, apparently, tries to correlate changes in music plays with social media events happening in the real world (probably so you can see if a given Event X generates a change in plays). In addition to design consulting, in my "other life," I lead and perform in a dual-ensemble called Mr. Ho's Orchestrotica, which has a few albums out that are available on Pandora.

Pandora is one of the data sources that Next Big Sound monitors. In this case, Next Big Sound apparently detected an abnormal increase in plays on Pandora and alerted me via email. Check it out below:


On first glance, I like the supporting evidence here (chart), and the lead-in text tells me they are tracking what "normal" plays are such that they can alert me to abnormal increases. At first, I wondered why they were showing me some of my own social media posts, but then I realized they were trying to help me see whether any social media activity may have corresponded with the increase in plays. This is a great example of where they know their software probably cannot draw a literal causation relationship, but they can help users correlate and potentially find a causation. Incidentally, I don't care much about how many playbacks the Orchestrotica gets on streaming services, as it's not a KPI for my group, but I found this a nice way to help artists and labels–especially artists working more in the pop/entertainment world–understand what is going on with fans of their music, what is getting traction, etc. In this case, there is no correlation: the social posts from my personal/artist social media accounts had nothing to do with Orchestrotica activities for the most part, but I still liked the UX.

So, what do you think "Create a Graph" does?

I wondered too. Stay tuned for Part 2 to find out.

Does your app/product provide any anomaly notifications like this? I would love to hear about it. Can you send me an example? Email it to

Here are (5) dashboard design suggestions you can start to apply today

If you're in the analytics space, then you almost certainly have at least one "dashboard" customers use.  I generally define dashboards as the landing page for your product when people log in, and so keep that in mind as you browse today's design suggestions:

  1. Your dashboard is probably too low on information density. 
    Dashboards with just a few large graphics/charts/KPIs are often a sign of low information density. Most of the time, a design looks misleadingly clean and simple because we're evaluating it on the surface. A huge donut chart with 3-5 values, no calls to action, and no comparisons may not be helping your customers understand the value or insight behind the analytic. Chances are, your design needs additional comparison data in order for the primary analytics to be meaningful. One notable exception here is something like an "always-on" dashboard; the kind of UI that is projected up on a monitor in a room for all to see (in theory). (Although...when was the last time you saw anyone actually look at one of those omnipresent dashboards that's always up on the monitor?) These types of dashboards are best left for another lesson.
  2. Most charts and data graphics can be shrunk and retain the same information density. 
    As a general rule, you can, and probably should, shrink your charts and data graphics down to the point where they are still legible. This allows more space for comparison information to be added, or for neighboring components to be visible within the user's viewport (the visible area of a given page or screen, as constrained by the device's resolution/size).
  3. A good dashboard usually will promote information that requires user attention, is relevantly curious or insightful, and facilitates completion of frequently repeated tasks. 
    Instead of thinking about the dashboard as a dumping ground to display the latest values for each of your widgets, consider modeling the design around "What would drive somebody to come back here? What new information can we surface here that can help them? Did we provide links/buttons/affordances to drive people to the things they come to the app/site/product to do on a regular basis?"
  4. Consider usage frequency within your design.  
    One of the easiest things you can do to determine how dense, insightful, and rich your dashboard can be (without overwhelming the customer) is to understand the dashboard's usage frequency. In general, the more routinely your dashboard is used, the greater the information density you can provide (and probably should). Alternatively, if users are only peeking at it monthly/quarterly, you probably need to balance both information density, and how much you can assume about the users' "given knowledge" when they're taking in the latest data. If you have particularly technical information, a dashboard that supports infrequent visits may need reminders about terminology, accepted/normal ranges of important KPIs, etc. A simple example of this would be something like a credit score: if you can count on the customers knowing what a credit score is, and they understand the qualitative ranges, then you might be able to get away with just showing the current score number (e.g. "790"). However, if you know the design will be used infrequently, then displaying qualitative ranges next to the quantitative values helps make the design more useful (i.e. "790 - Very Good").
  5. Consider emailing your dashboard.
    ...but heed this advice carefully!  First, if you already have a killer dashboard, then it may make sense to routinely email some or all of the dashboard information to users instead of asking them to log in. Turn that dashboard into a report card your customers can rely on. Additionally, if you have actionable insights in the email, then let users link directly to the place that needs their attention, even if that means deep-linking into your product and bypassing the current dashboard screen displayed in the browser/software by default. On the other hand, if your dashboard design is lacking or immature, then don't waste the time and resources building an email version of it hoping to see better results. Designing and coding rich format emails that render your design intent properly is still very taxing, and it is even more complicated when it comes to UIs that involve charts and data graphics. Wait until you have a great design/UX before investing in email delivery, or consider altering the design you'll deliver via email. Sending out a low-value dashboard every month just reminds customers to question why they're in business with you in the first place.
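The credit-score example in point 4 above can be sketched as a tiny lookup. This is a minimal illustration, and the score bands below are illustrative assumptions, not an official scoring standard:

```python
def credit_score_label(score: int) -> str:
    """Map a numeric score to a qualitative range label.

    The breakpoints are illustrative placeholders; a real product
    should use the ranges published by the scoring provider.
    """
    bands = [
        (800, "Exceptional"),
        (740, "Very Good"),
        (670, "Good"),
        (580, "Fair"),
    ]
    for floor, label in bands:
        if score >= floor:
            return label
    return "Poor"

# For an infrequently visited dashboard, pair the number with the label:
print(f"790 - {credit_score_label(790)}")  # 790 - Very Good
```

The same pattern applies to any metric whose "accepted/normal ranges" the infrequent visitor can't be assumed to remember.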

A Visual Introduction to Probability and Statistics

If you're working on data or analytics products, then you'll want to check this out whether it's a refresher, or something to share with your team:

Seeing Theory is a great visual introduction to probability and statistics from Brown University, created by Daniel Kunin. Enjoy!


Here’s a fast way to evaluate the utility of your dashboard design

Here's a super easy thing you can do today to evaluate your data product's dashboard. If you're displaying quantitative data of any sort, especially trends, then this will probably help you come up with opportunities to improve your design. Of course, testing the design with real users is always the best way to evaluate your product! With that, let's jump in.

It's quite easy, and I borrowed the technique from Tufte:

For any and all of your key UI widgets that sum up conclusion data, read the data aloud and then ask yourself, "compared to what?"

If you cannot answer that question easily without leaving the dashboard or report, then you know you probably have room for improvement.

If you're going to tell the customer in a donut chart that the distribution of (3) values over some time period was "3, 17, and 80," then the question is, "as compared to what?"

Keep digging further:

  • Do I need to know what the previous values were? Over what period?
  • How likely is the customer to know these values as given knowledge? (e.g. I bet you know what your typical home temperature is, but do you know what the barometric pressure at home typically is? Don't assume one design pattern always works for all the data points.)
  • Is the absolute value of the data interesting, or is the change (delta) in these values what is interesting?
  • Could the data be presented in a qualitative way (e.g. "3 = great, 17 = so-so")?
  • Do I have to read or view a lot of ink to figure out, "hey, there's really not much new here to look at from last month/week/etc."?

If you're still stuck once you've asked this question against your data, here are some ideas you can use to inspire your design. Try comparing your metrics to:

  • My average, min, or max
  • Team/group/industry/competitor average/min/max/movement
  • My change since last period
  • My typical deviation / pattern
  • My business's cycles
  • A unique benchmark in your company, product, or the industry
  • An index you created
  • A SMALL, relatable unit people can grasp. In other words, showing $26,981,230.12 might be the real number, but printing $26.98M is easier to read.
  • Even better: showing that $27M as something like "2,000x the average value and 45x the #2 earner" puts that huge number into a relatable context.
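Those last two comparisons can be sketched as a small formatting helper. This is a hypothetical sketch: `humanize_dollars` and `relatable_context` are illustrative names, and the average/runner-up benchmark values are made up:

```python
def humanize_dollars(amount: float) -> str:
    """Round a large dollar amount down to a small, readable unit."""
    for threshold, suffix in ((1e9, "B"), (1e6, "M"), (1e3, "K")):
        if abs(amount) >= threshold:
            return f"${amount / threshold:.2f}{suffix}"
    return f"${amount:,.2f}"

def relatable_context(amount: float, average: float, runner_up: float) -> str:
    """Express a big number as multiples of familiar benchmarks."""
    return (f"{humanize_dollars(amount)} "
            f"({amount / average:,.0f}x the average value, "
            f"{amount / runner_up:.0f}x the #2 earner)")

print(humanize_dollars(26_981_230.12))  # $26.98M
print(relatable_context(26_981_230.12, 13_490, 599_583))
```

The point is not the rounding itself, but that the formatted value arrives already paired with a comparison the reader can grasp.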

Now your turn: what useful comparison did I leave out that you've found useful to customers?

What happens when you use templates for dashboards and analytics products?

I hear it all the time on Quora, in real life, and from clients:

"What BI tool should we use to visualize our data? Is there a good dashboard template you know of?"

When it comes to designing analytics products and dashboards, templates and libraries aren’t necessarily bad, if you have spent the time to ensure the template designs actually satisfy the tasks and goals users have for your product. Most people don’t take the time to do this; they start with their data, and try to pipe it into the provided design, assuming that fancy charts and tables will take care of the UX. They *look* great out of the box, and they do feed you a quick hit.

It's kinda like that first free dose from a dealer. But, then the high goes away after you aren't sure what to do with those numbers.

"Ok, so we sold 984 last month, and 876 the previous month." Now what?

Learn more about templates and basing your design on a competitor

Maybe your product’s problem is information *underload*

We've all heard about information overload, and the paradox of choice.

Don't you love those Thai menus with every type of sauce, noodle, and protein, all written out as separate dishes? "I'll have item D132 with no water, I mean the one with chicken on page 12....yeah, that one."

There's a ton of data on that menu, but not a lot of information about what tastes good.

Over the years, I've worked on some very data-intense software applications, with the IT software for data center hardware (storage arrays, networking, computing hardware) probably being the most data-rich.  Many of the management interfaces I helped redesign started out as "where should we display all the metrics we're collecting on all these objects the software is managing?" Tons of index/list screens, with the occasional hub/spoke information architecture breaking up massive numbers of metrics into categories. A step better, but much of it was raw data desperate to be informative to somebody.

Ironically, some of the dashboards for these products tended to be information deserts. A few pie charts, desperately trying to convey an overall score of the entire system, but failing miserably. Or worse, a list of "every resource that is yellow or red" sorted in some table, trying to be a to-do list for the poor IT support staff that had to manage one of these resources. Ouch.

In the end, the issue with both situations was information underload, not overload. You can learn more about assessing information overload (underload?) in your design with my free guide.