Dashboard Design: Is Maximum Simplicity the Goal?

Should maximum simplicity dictate success?

We all love usability these days, right? "User experience is important."

Of course it is!

But that doesn't mean every user you show the design to will, or should, immediately and fully understand what you're showing them.

Why?

Most valuable things in life take a little investment/effort. Your dashboard can probably push boundaries and comfort zones a little bit as well, especially if it's a service that will have somewhat repetitive use by your users.

Before I continue, I will caveat that this advice is largely for operational dashboards, used somewhat regularly, and for people looking to take action on insights.

I'm not advocating for sloppy work, or making your dashboard needlessly complex. However, don't necessarily design with the goal of making the first impression as easy as possible. There is only one first-time UX, but there will be many more "Nth"-time uses of your service, right? As such, if you're designing primarily for the "Nth" visit, don't necessarily expect that users will understand everything on the first viewing. It is possible to design a high-quality dashboard that doesn't get great user feedback on the first viewing.

Also, if the user hasn't put in a little effort to understand the value of the data/information you're showing, and you've done a good job presenting it as elegantly as possible, then don't necessarily water the design down to accommodate the newbies. It may be fine as-is, and you can set up your test plan/protocol to validate whether you've pushed it too far, or have done a good job of providing rich, relevant information density.

For many of my clients, the assumed path to dashboard simplification is to "remove stuff" and use "drill downs" to provide detail. Don't get me wrong; I love using subtractive techniques over additive ones when that's the best choice, but there are times when you need to add data to increase the value and insights. The drill down may not have the "summary" information that is helpful in knowing whether you should even *bother* to drill down.

As a product or analytics leader, your job is to find the balance between too much/too complex and too simple/too low in information density. There is no single clear-cut definition of what too much or not enough looks like, but don't just focus on concern that the design is "too complex." You might be making it "too simple." It's possible to design a product/service that is super easy to use, but has very low value or utility.

Maximum value, on the other hand, is obtained when an analytics solution has the trifecta of utility, usability, and beauty/desirability. It is hard, though, and usually quite expensive, to get a service to that level without intentionally designing it to be that way.

From a data scientist on Medium: “It’s easy to not understand your customer’s needs.”

So a guy walks into a bar and starts talking about unsupervised learning...

Ok, not quite. Well actually, where I live in Cambridge, MA, that's not really so improbable 😉

I found this article on Medium interesting: it was written by a data scientist talking in part about why data science projects may not be working, and one of the reasons was "solving the wrong problem." He had come up with a new tool/method to solve an old problem, it didn't go over well with the users, and he concluded:

...**It’s easy to not understand your customer’s needs.** Pick your favorite large company: you can find a project they spent hundreds of millions of dollars on, only to find out nobody wanted it. Flops happen and they are totally normal. Flops will happen to you and it’s okay! You can’t avoid them, so accept them and let them happen early and often. The more quickly you can pivot from a flop, the less of a problem they’ll be...

Dr. Jonathan Nolis, Data Scientist

I'm not writing this insight today to pick on Jonathan; after all, it looks like maybe he's based in AZ (my birth state) and hey: we're a friendly bunch! However, that first sentence (my bolding) isn't really accurate, and this is almost certainly part of what contributes to the 85% failure rate on data/analytics projects: no clear idea what the users need/want, and/or vague business objectives. That said, as stated in my comment to Jonathan on Medium, I would agree that this is not the primary responsibility of the data scientist.

If you're a biz stakeholder, product manager, or analytics leader, then you should be giving your product development team much clearer objectives. It is not difficult to understand customers' needs; you just need to regularly go out and talk to them. While there is definitely skill involved in conducting great research and extracting business objectives from stakeholders, you don't need heavy training to get started. You may not discover every latent need or problem to solve, but it's definitely better than not talking to customers at all, or taking wild guesses. After all, at some point, too many "flops" start to add up, financially and otherwise.

I understand that with certain types of machine learning, there is inherent ambiguity in what might come out the tailpipe. However, you can spend a little more time, and probably a lot less money, doing a little research before committing resources to implementing something that may have zero value to anyone.

Here's to a few fewer flops, even if we can't stop them all!

It’s the translation that makes your data sing – what song are you singing?

Some of you probably know by now that I'm also a musician, and this includes composing for my instrumental jazz/chamber music quintet. What does this have to do with translation of your data?

When I talk about data on the DFA mailing list, I'm usually talking about the raw ingredients that may (or may not) be translated into useful information or meaningful output via good design.

Data translation examples:

  1. The 1s and 0s of image files aren't useful to humans in their raw format. No information is conveyed to the viewer until the binary is translated by the computer into a picture that can be seen by eyes and processed by the brain into meaningful output. (Sure, some exceptions with ML may challenge this notion, but hear me out.) See the short sketch after this list for a concrete version of this idea.
  2. Got audio data? Same deal. The playback device has to translate the data into audible sound. Ears then translate the sound into meaningful output (perhaps music!)
  3. When I compose a new work for my quintet, it's also in "code" at first and requires translation to become music, or meaningful output. It's messier than this in real life, but it goes something like this:
    1. Vague ideas in the composer's mind materialize into notated musical composition over days/weeks/months. This is a translation of abstract concepts into concrete ideas the composer can revisit, mature, and rework until the final composition materializes. While meaningless to the audience/listener at this stage, codifying my ideas into a musical score is a translation of abstract ideas into meaningful output for me, the composer.
    2. My composition's individual part assignments for each instrument are extracted from the full score, and only then do they become meaningful output for the individual musicians. While I could put a full score in front of every player, this would introduce tremendous noise and difficulty for the players and would not benefit the listener. (How much noise surrounds the information your end users seek?)
    3. Performers translate individual parts of the score into audible music, or meaningful output. The scores are effectively meaningless to the audience/listeners.
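
If it helps to make "translation" concrete for the image example in #1 above, here's a minimal Python sketch. It assumes the Pillow library is installed and a hypothetical file named chart.png exists; the point is simply that the same data is meaningless until something translates it.

```python
# A minimal sketch, assuming Pillow is installed (pip install Pillow)
# and a hypothetical image file named chart.png sits next to the script.
from PIL import Image

with open("chart.png", "rb") as f:
    raw_bytes = f.read()

# The raw "data": technically complete, but meaningless to a human viewer.
print(raw_bytes[:16])

# The "translation": the computer decodes the bytes into pixels.
picture = Image.open("chart.png")
print(picture.size, picture.mode)

# The "meaningful output": something eyes and brain can actually process.
picture.show()
```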

So, my questions for you are:

Have your analytics translated the data into meaningful output for your end users?

Where could you be doing better translations?

Are you giving your musicians the full score, or just the parts they need to enable their contribution to the ensemble? (Remember: design is as much about what to take away as it is what to add).

Good luck!

The Easiest Way to Simplify Your Product or Solution’s Design

Ok, you probably know this one, but let's dig in a little further.

I recently started to explore using the TORBrowser when surfing on public wi-fi for more security (later finding out that using a VPN, and not TOR, is what will enable safer surfing). However, in the process of downloading and trying the TORBrowser out, it provided me with a golden example of what you should not do in your product.

The very first screen I saw when I launched TOR was this:

[Image: the TORBrowser first-run network settings screen]

So, what's the big deal here? First, I will share the answer to today's subject line with you:

Remove everything that is not necessary.  

Yeah, yeah, you probably have heard that before. Famously, the pope asked Michelangelo how he knew what to carve while creating the statue of David, and his response was along the lines of, "I removed everything that wasn't David." Nice.

Are you removing the cruft and noise from your product?

If we take this thinking further, I would say that today's core takeaway for you is to "remove choices by making good assumptions in your product, wherever possible." You might be wrong sometimes, but you'll be right a lot of the time.
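
Here's a tiny, hedged sketch of what that can look like in practice. The function names are hypothetical and purely illustrative (this is not TORBrowser's actual code): try the setting that works for most people automatically, and only surface the configuration choice if it fails.

```python
# Hypothetical sketch of "make a good assumption, offer the choice only as a fallback."
# These function names are invented for illustration; this is not TORBrowser code.

def connect_with_standard_settings() -> bool:
    """Stand-in for the connection attempt that works 'in most situations'."""
    return True  # pretend the common case just works

def show_network_setup_screen() -> bool:
    """Stand-in for the manual configuration UI, shown only when needed."""
    print("Showing advanced network settings...")
    return False

def connect() -> bool:
    # Good assumption first: most users never see a choice at all.
    if connect_with_standard_settings():
        return True
    # Right information, now at the right time: only on failure.
    return show_network_setup_screen()

if __name__ == "__main__":
    connect()
```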

Jumping back to the TORBrowser UI example above, there is more you can learn from their design choices:

  1. This UI says, "This [option] will work in most situations." Well then, why isn't it automatically selected as the default choice?
    Does this screen now seem necessary to you? Why doesn't the product just "try" that setting by default, and present the other option as a fallback if the default fails? Nobody downloaded TORBrowser with a goal of "setting it up" with the right networking settings. This entire step is not necessary; the UI itself tells you the default will work "in most situations."
  2. Right information...at the wrong time. 
    I haven't needed to use this pane yet as the default setting worked (surprise!), but it's an example of the developers trying to present helpful information. That's good. The design problem is that it's appearing at the wrong time in my experience. I don't need this right now, and I don't even want to think about how the networking is configured. It's completely irrelevant. Are you presenting choices at the right time in your product?
  3. Most users don't care how your software works; don't expose the plumbing. 
    There are sometimes exceptions to this for certain technical products, but even when there are, once most users have "learned" what they need to learn about the plumbing, it quickly becomes irrelevant. The value has to shine, or people stop paying for the service. That includes products built for technical audiences.
  4. This UI and UX is not fun at all...especially as a first impression.
    It's a needless distraction, it's not fun, and it's got me focused on, "how hard will it be to get this app working?"
  5. The visual design attention (or lack thereof) is undermining the mission of the product. 
    This is the hardest one to teach, but a combination of graphic design choices (probably unconscious ones) here contributes to this UI not feeling particularly safe, secure, and careful. The goal of TORBrowser is to "protect" the user. If you think of words like protection, precision, stability, and safety, then the visual design should reinforce these ideas. The topic of graphic design is hardly something to be captured in an email, but I can leave you with a few suggestions and considerations. Refer to the diagram for a detailed analysis:
    [Image: annotated analysis of the TORBrowser UI]

    1. What could be removed from the TORBrowser UI sample?
    2. Are the invisible things (like padding/margin whitespace choices) consistent, meaningful, and deliberate?
    3. While a single graphic design choice sometimes has the power to impact usability or even the financial bottom line, typically it is the sum of numerous small design choices that accounts for the overall perception of your product's quality and aesthetic.
    4. It's possible to follow "all the rules" and still not have a great product aesthetic or utility. (That's why we have designers.)

8 invisible design problems that are business problems

Today's insight was originally inspired by a newsletter I read from Stephen Anderson on designing for comprehension, and I felt like this could be expanded on for analytics practitioners and people working on data products.

One of the recurring themes I hear from my clients is around the topic of general engagement (or lack thereof) by the end users/employees/stakeholders who are supposed to be benefitting from the insights of SAAS data products or internal analytics solutions. There are a lot of possible reasons why your engagement may be low, but there's a good chance that the design is one of them. Unfortunately, not all design issues are immediately visible or visual in nature, but you can learn the skills to begin identifying them.

So, why are they business problems?

For internal analytics practitioners, if your customers/employees/users are "guessing" instead of using the tools you're providing them, then ultimately, you're not improving their productivity or professional growth, and the company's investment in analytics is not returning a positive result overall.

On the other hand, if you've got a revenue-generating SAAS product, lack of engagement has a direct bottom-line impact: renewals. How long until somebody of importance notices they're paying for a service they never use? Do you really want to bank your business success on auto-renewal alone? The long-term value play is creating an indispensable service.

Here are some problems I frequently see when designing for analytics that go beyond standard data visualization issues. You should be examining and resolving these on an ongoing basis, in a proactive manner. (If you're sitting around waiting for passive feedback, you're unlikely to ever "see" many of these issues.) Most of these are not "once and done" problems with simple tactical fixes. Discovering these strategic issues requires ongoing behaviors your organization will need to develop if you want to consistently deploy meaningful value to your customers:

  1. Usability issues: getting the value from the service is too difficult, takes too long, or is not worth the effort. The only way to spot this and really understand how to fix the real issues is via 1:1 testing of tasks with customers. There are tons of tutorials on how to facilitate usability studies, and you can outsource the testing as well.
  2. Utility issues: while the user can "operate" the design properly, there is low value. This can be a result of vanity analytics, or displaying the evidence before displaying the practical value stemming from the evidence. This sometimes presents, in customer-speak, as "I get what it is showing me, but why would I want this?"
  3. Timing or context issues: your analytics, while useful and usable, are not coming at the right time in the user's lifecycle of use.
    1. For example, you may be presenting information that is perhaps only useful at year-end, yet your tool doesn't know this and continues to persist the information in the UI as if it were meaningful signal mid-year. Right info, wrong time. Perhaps your tool should adapt to business cycles and anticipate time-sensitive needs (see the small sketch after this list).
    2. Another example may be a situation where a customer needs a cyclical (e.g. monthly) readout, but your tool requires them to log in and fetch the data instead of just notifying them of the information at the time it is needed. This doesn't mean you need to run out and create a scheduler for every aspect of your solution; in fact, that can lead to other issues.
    3. A third example goes like this. Ever heard from a customer, "This is great stuff, but I'm [in my truck] by that time and don't have my computer with me, so I don't use your tool very much"? In this case, perhaps a mobile experience would have led to more engagement by the driver of the truck, and therefore more value for him and for the company. When was the last time you did a ride-along with your drivers? Did you even know you had drivers? The point is, the context of use [while driving a truck] was not considered at the time the design was provided [a desktop solution].
  4. Domain knowledge issues: the information presented contains technical jargon, or advanced domain knowledge that customers do not have yet. You can't reliably know this without talking to customers directly, and you'll need to hone your interview facilitation skills to acquire this type of information. This is in part due to the fact that it can be embarrassing, or perceived to be a risk, for customers/end users to admit they don't know what certain things mean. Your job is to help them realize that you're testing your design, and it is the design that failed, not them.
  5. Ambiguous Correlation/Causation relationships: is your design declarative or exploratory? If it's declarative, did you provide the right evidence for this? If you're trying to show correlation, is it clear to the user what relationships you're delineating?
  6. You're building a framework instead of a solution. I see this one a lot. Every UI view on every page shares the same "features," and over time, the design becomes dictated by the backing APIs or the reusable code snippets engineering doesn't want to rework on a case-by-case basis. The reality is that you shouldn't be forcing patterns on people too early, and if you're not rigorously validating your designs with customers, you have no idea which aspects of the design should really be "stock" UI features. A simple example is table sorting/filtering: your default control/view for this, while seeming "uber flexible," may actually cause UX problems because the customer cannot understand "why would I ever want to sort this table by X? Why would I want to filter this?" In your attempt to provide flexibility by automatically allowing every table view to be filtered and sorted, you actually just increased the complexity of the tool for no good reason. You might have shipped more code faster, but you didn't provide more value.
  7. "We're using agile." Agile is not the same thing as agility, and while this could be an entire post on its own, using agile doesn't guarantee successful deployments of value to users. A lot of the time, agile is a buzzword for doing incremental (not iterative) development, and more often than not in my experience, there is little, if any customer design validation (usability testing or discovery work) being done. The other thing with popular Agile methods (e.g. modified scrum) is that there is no formal design phase, and the assumption is that all design and coding can always be done simultaneously. This is not always true, and it's even less true unless you have a seasoned design practice within your organization that has properly integrated itself. It's also *definitely* not true if you're conceiving a brand new service or product. 
  8. Knowledge gaps or distributed cognition issues: the best way I can think to explain this is with an example. Let's pretend we have an analytics service that allows employees to make projections/predictions about things such as bulk purchasing decisions of some good for the next fiscal year. In reality, the person who is going to make the final business decision using your analytics doesn't rely solely on the information in your tool. Through observation of their use of your service (not just asking them!), you might find that your customer is accessing two or three different systems before making the purchasing decision, none of which share data with each other. In short, your analytics solution is really just "part" of their overall workflow/process, and you haven't mapped the way they actually make a purchasing decision to your software solution.
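
Circling back to the timing example in #3 above, here's a small illustrative sketch of what "adapting to business cycles" could mean. The panel names and the November cutoff are assumptions made purely for the example, not a prescription.

```python
# Minimal sketch: only surface a year-end projection panel when it is
# actually meaningful signal. Panel names and the cutoff month are
# hypothetical, purely for illustration.
from datetime import date

def panels_to_show(today):
    panels = ["weekly_throughput", "open_exceptions"]  # relevant year-round
    if today.month >= 11:  # approaching fiscal year-end (assumed calendar year)
        panels.append("year_end_purchasing_projection")
    return panels

print(panels_to_show(date(2018, 6, 15)))   # mid-year: projection stays out of the way
print(panels_to_show(date(2018, 11, 30)))  # year-end window: projection appears
```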

Remember: you cannot just "look" at your tool and consistently identify these design issues. Even with tons of design training, an expert cannot just "see" all of these issues either. You have to go into the field, observe users, and run structured usability studies. Asking customers what they want or think is also unreliable, because end users are not always aware of their behaviors and actions, and you're likely to get an incomplete (or inaccurate) depiction as they try to answer your questions "intelligently."

Focusing on what people are doing is much more truthful and enlightening for making good design decisions.

Good luck!

What internal analytics practitioners can learn from analytics “products” (like SAAS)

When I work on products that primarily exist to display analytics information, I find most of them fall into roughly four different levels of design maturity:

  1. The best analytics-driven products give actionable recommendations or predictions written in prose telling a user what to do based on data.  They are careful about the quantity and design of the supporting data that drove the insights and recommendations being displayed, and they elegantly balance information density, usability, and UX.
  2. The next tier of products is separated from the top tier by the fact that they focus only on historical data and trends. They do not predict anything; however, they do try to provide logical affordances at the right time, and they don't just focus on "data visualization."
  3. Farther down the continuum are products that have made progress with visualizing their data, but haven't given UX as much attention. It's possible for your product to have a *great* UI and a terrible UX. If customers cannot figure out "why do I need this?," "where do I go from here?," "is this good/bad?," or "what action should I take based on this information?," then the elegant data viz or UI you invested in may not be providing much value to your users.
  4. At the lowest end of the design maturity scale for analytics products are basic data-query tools that provide raw data exports, or minimally-designed table-style UIs. These tools require a lot of manual input and cognitive effort by the user to know how to properly request the right data and format (design!) it in some way that it becomes insightful and actionable. If you're an engineer or you work in a technical domain, the tendency with these UIs is to want to provide customers with "maximum flexibility in exploring the data." However, with that flexibility often comes a more confusing and laborious UI that few users will understand or tolerate. Removing choices is one of the easiest ways to simplify a design. One of my past clients used to call these products "metrics toilets," and I think that's a good name! Hopefully, you don't have a metrics toilet. *...Flush...*

What level is your product at right now?

Failure rates for analytics, BI, and big data projects = 85% – yikes!

Not to be the bearer of bad news, but I recently found out just how many analytics, IOT, big data, and BI projects fail. And the numbers are staggering. Here's a list of articles and primary sources. What's interesting to me about many of these is the common issue around "technology solutions in search of a problem." Companies cannot define precisely what the analysis or data or IOT is supposed to do for the end users, or for the business.

And, it hasn't changed in almost a decade according to Gartner:

  • Nov. 2017: Gartner says 60% of #bigdata projects fail to move past preliminary stages. Oops, they meant 85% actually. 
  • Nov. 2017: CIO.com lists 7 sure-fire ways to fail at analytics. “The biggest problem in the analysis process is having no idea what you are looking for in the data,” says Tom Davenport, a senior advisor at Deloitte Analytics (source)
  • May 2017: Cisco reports only 26% of survey respondents are successful with IOT initiatives (74% failure rate) (source)
  • Mar 2015: Analytics expert Bernard Marr on Where Big Data Projects Fail (source)
  • Oct 2008: A DECADE AGO - Gartner's #1 flaw for BI services: "Believing 'If you build it, they will come...'" (source)

There are more failure-rate articles out there.

Couple these stats with failure rates for startup companies and...well, isn't it amazing how much time and money is spent building solutions that are underdelivering so significantly? It doesn't have to be like this.

Go out and talk to your customers 1 on 1. Find a REAL problem to solve for them. Get leadership agreement on what success means before you start coding and designing. There's no reason to start writing code and deploying "product" when there is no idea of what success looks like for both the customers and the business.

Skip the design strategy part, and you'll just become another one of the statistics above.

If you want to learn how to leverage design to deliver better UX for analytics and enterprise data products, subscribe to my DFA Insights Mailing list. 

My reactions to the Chief Data Officer, Fall 2017 conference summary

I ran into an article about the Chief Data & Analytics Officer, Fall conference that summarized some of the key takeaways from the previous year's conference. One paragraph in the article stuck out to me:

...
The Great Dilemma – Product vs Project vs Capability Analytics Approaches
Although not one of these approaches will provide a universal solution, organisations must be clear on which avenue they’d like to take when employing enterprise analytics. Many speakers discussed the notion of analytics as a product/service, and the importance in marketing that product/service to maximise buy-in and adoption. However, analytics executives may look to take a capability-based approach, but **one cannot simply build an arsenal of analytics capabilities without a clearly defined purpose and value generated for the business**...

(Bolding added by me)

For companies pursuing internal analytics solutions, or creating externally-facing data products or solutions, the situation is basically the same: you cannot start with a bunch of data and metrics, visualize it, and then hope that you have a product/solution somebody cares about. The data isn't what is interesting; it is the actions or strategic planning one can take from the data that holds the value. You have to design the data into information in order to get it to the point where customers can grok this value.

I have found that engineering-led organizations tend to operate in a "build first, find the problem second" mode, looking at design as something you bring in at the end to "make it look all pretty and nice." A good UX strategy is a good product strategy is a good analytics strategy: by spending time up front to understand the latent needs people have for your analytics/data, you're much more likely to generate a solution that solves for a need on the other side.

Video Sample: O’Reilly Strata Conf

This is a recording of my presentation at the O'Reilly Strata Data Conference in New York City in 2017.

Do you spend a lot of time explaining your data analytics product to your customers? Is your UI/UX or navigation overly complex? Are sales suffering due to complexity, or worse, are customers not using your product? Your design may be the problem.

My little secret? You don't have to be a trained designer to recognize design and UX problems in your data product or analytics service, and start correcting them today.

Want the Slides?

Download the free self-assessment guide that takes my slide deck principles and puts them into an actionable set of tips you can begin applying today.

Getting confidence in the value of your data

(As shown to customers in your UI)

I'm talking to a prospective SAAS client right now, and they're trying to expose some analytics on their customers' data so that the customers can derive ROI from the SAAS on their own. The intent is that the data can also be useful to the SAAS sales team, as a tool to help prospects understand what the possible ROI might be.

I had a question for Dave (my contact there) around whether the project would be successful if we talked to the users, designed a bunch of interfaces, solicited feedback on the design outputs, and found out that the data, while interesting, didn't really help the customers derive ROI. Would the design engagement still be productive and a success in the end? Ultimately, I didn't want to take on a project if we had hunches that the data we had, while being the best available data and elegantly presented, might not help the end user or buyer calculate ROI.

Here's what Dave told me:

Yes, the design engagement would still be a success. It provides us a punchlist of what else we need to do, which is in-and-of-itself useful; and presumably defines what the analysis/reporting needs would be once we get that data. Less of a success, or more of a delayed-gratification one, but still useful.

I thought this was interesting to share, and I had hoped Dave would say this, because it shows that sometimes you have to do some design to figure out what the final design needs to be. You can't always plan ahead what the right solution is, and moving from designing on assumption to designing on fact gives you powerful information to inform your product.

Conversely, you can also spec out the entire project, including all the data/queries that customers said would be useful, write it into a spec or backlog, code it up, skip design, and then still have it not be successful because customers couldn't actually experience the ROI that your data was supposed to convey. A product backlog does not equal a viable product; it's just a bunch of user stories or features. The glue holding them together, and what helps customers realize the ROI, is design.