Low engagement is a common challenge for many of my clients and the people I talk to in the analytics world. In fact, it’s the subject of my talk at this year’s IIA Symposium next week.
One of the design concepts I like clients to think about with analytics is whether your software is more of a place to hunt for insights (i.e. “exploratory”) or one that is supposed to alert users to the interesting signals in the data (i.e. “declarative”). It’s probably not a surprise, but most of your users want the latter, as the tool time required by the former may not be fun, simple, or repeatedly valuable to them. Most of the time with these analytics tools and goodies, the customer just wants to know, “What’s interesting? Why should I care? What should I do about it?” The more you can offload the “labor” to computers, the better. The more work you put on customers to extract this with your tool, the greater the chance you won’t deliver value, and the lower the engagement will be. Why?
It’s fairly difficult to design simple, elegant, valuable solutions for people–even when you know what they need and can build software to provide it at the right time (e.g. “declarative” analytics). It’s MUCH harder to build a well-designed, general-purpose analytics platform that will be usable for all the various use cases out there.
These days, I hear lots about “training” and “data literacy” as the solution to getting end customers–especially less technical ones–to use these expensive data products and analytics platforms.
I tend to be suspicious when a tool is so complex it is “dependent” on training. Bear with me for a second.
While there may be valid edge use cases that require help documentation and maybe even training, a reliance on training is usually a sign that the system is not well designed around the tasks and goals of the customer.
It’s way easier, and cheaper, to change the tool than it is to change the people using it.
That’s what design helps us do.
What’s the problem with training?
It’s easy to go out of date – especially if you are shipping software that is in “always beta” mode (e.g. the team behind it is constantly improving it and releasing regularly). Not only might the training not be particularly effective, but you also have all this process, documentation, staff, and resources that need to stay up to date. It’s like putting a giant rock inside your hiking backpack. On top of that, you will always have employee turnover, and training assets like copy, screenshots, and videos get out of sync with the production version, requiring you or the vendor to keep all of that material updated. It’s a major tax. Finally, training takes staff away from their actual job; the list goes on.
More importantly:
- If the tool sucks or just doesn’t work right, training isn’t going to fix the underlying problem.
- You haven’t made the customer’s life or job any better or happier by training them to use a poor solution.
- You’re probably not making any “friends,” or colleagues who want to spend more time using your current (or future) tools, if they mostly see them as a “tax” on their time and job.
- Every time you bring on new users, you incur the repeated cost of training.
- When you add features to an existing, poorly designed tool or platform, you’re likely just going to make the UX even harder, requiring more training, etc.
So, back to design. What do you do?
Well, for one, you can look at which areas require the most training. What workflows are being “trained”? Can you reduce the effort? Probably. There is probably great information in the heads of whoever receives the “help requests.” Which features or workflows generate the most confusion? What are the high-volume complaints/requests? Use this information to decide where to invest in better UX and design.
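If your help desk or ticketing tool can export requests with even a rough feature or workflow tag, a quick tally is often enough to show where the training burden is concentrated. Here’s a minimal sketch in Python, assuming a hypothetical CSV export with `feature` and `request_type` columns – the file name and column names are stand-ins for whatever your support tool actually produces.

```python
import csv
from collections import Counter

# Hypothetical export from a help desk / ticketing tool. Assumes each row
# has a "feature" column tagging the workflow the request was about and a
# "request_type" column (e.g. "how-to", "bug", "access").
TICKETS_FILE = "support_tickets.csv"

def training_hotspots(path: str, top_n: int = 10) -> list[tuple[str, int]]:
    """Count how often each feature/workflow shows up in how-to requests."""
    counts: Counter = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Only count "how do I..." requests; bug reports point at quality
            # problems rather than training dependence.
            if row.get("request_type", "").strip().lower() == "how-to":
                counts[row.get("feature", "unknown").strip()] += 1
    return counts.most_common(top_n)

if __name__ == "__main__":
    for feature, n in training_hotspots(TICKETS_FILE):
        print(f"{feature}: {n} how-to requests")
```

The specific script doesn’t matter; the point is that the workflows at the top of that list are usually better candidates for design attention than for another round of documentation.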
So, all of this is generally good advice for almost any product. But, how do we relate this back to analytics?
Let’s revisit the topic of “exploratory” vs. “declarative” analytics experiences. As you recall, “declarative” means the software is generating conclusions and exposing them, hopefully at the right time and place. “Exploratory” means that–intentionally or otherwise–users feel like they have to “piece together” the experience themselves just to get the desired output. They may have different levels of knowledge about the domain, data, and technology–but the onus of operating the tool is largely on their shoulders. This is a ripe area for “low engagement” because the system is not purpose-built, and it doesn’t know what data/conclusions to surface for each particular user. I get it: there aren’t resources to design experiences and views for every single use case, so instead the analytics team provides a toolbox, not a “solution.”
Why aren’t there resources?
Because most people think that solving analytics problems starts with data, data science, and engineering. Building stuff.
It doesn’t, unless you’re ok with the potential financial and time risks that come with approaching the problem that way.
Just today, I interviewed a CEO for an upcoming analytics-in-IIoT themed episode of Experiencing Data, and he told me how not understanding his end customer cost his software company six months of engineering time. As a rough estimate, in the US, that’s probably $10–20k per engineer per month in wasted cost, plus the “dead weight” the product now must carry, and the lost opportunity from pursuing the wrong path. All because they didn’t have any design plan and thought they knew what the customer needed.
“I won’t make that mistake again,” he said!
You shouldn’t either.
And, if you think $10–20k/month/engineer is a high cost, it’s not.
I recently worked on a large enterprise data product/platform that had 500+ engineers working on it. It was probably two years behind schedule the day I arrived. Now you’re talking about millions of dollars in wasted expenditure.
So, how does this relate to declarative and exploratory analytics?
In general, the rule is this: the more data points, stories, and metrics you have, the more care you will need to give to design in order to ensure all of the analytics are actually usable and useful to your customers.
Declarative analytics are easier to design around because they put much more of the analysis on the computers. Exploratory analytics effectively “offload” that work onto humans. As such, more design care is needed for exploratory analytics, especially when you can’t offer predictive and prescriptive capabilities.
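To make the distinction concrete, here is a minimal sketch (the metric, threshold, and data are hypothetical examples, not taken from any particular product). The exploratory path hands the user raw numbers and leaves the analysis to them; the declarative path has the software decide whether anything is interesting and surfaces only that conclusion.

```python
from statistics import mean
from typing import Optional

# Hypothetical weekly email open rates, most recent week last.
open_rates = [0.42, 0.44, 0.41, 0.43, 0.29]

def exploratory_view(rates: list[float]) -> list[float]:
    """Exploratory: expose the raw data and let the user hunt for the signal."""
    return rates  # the burden of noticing the drop is on the human

def declarative_alert(rates: list[float], drop_threshold: float = 0.25) -> Optional[str]:
    """Declarative: the software decides whether anything is worth the user's attention."""
    baseline = mean(rates[:-1])
    latest = rates[-1]
    drop = (baseline - latest) / baseline
    if drop >= drop_threshold:
        return (
            f"Open rate fell to {latest:.0%}, about {drop:.0%} below the recent "
            f"average of {baseline:.0%}. Consider reviewing last week's subject lines."
        )
    return None  # nothing interesting; don't interrupt the user

if __name__ == "__main__":
    print(declarative_alert(open_rates) or "No alert")
```

The threshold math is trivial on purpose – the real product decision is whether your team takes responsibility for answering “what’s interesting and why should I care,” or leaves that to the user.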
If the customer’s job is not to be a data analyst, then don’t expect them to be able to “design their own” UIs successfully and repeatedly. Most users assume that any tool with a GUI was deliberately designed to be effective and useful already; they don’t assume they need to bring those requirements into the tool in order to do their job.
Software products with exploratory analytics are harder to design effectively. You really need to understand the customer’s problem space intimately in order to know what experience to craft, and what the resulting UIs should be. Your goal should be to reduce tooling effort as much as possible, especially when the end user is non-technical or may not know your domain well. This is especially true for internal analytics systems where you may be on a mission to “arm the employees with data-based decision making.” If they aren’t using data now, and perhaps have never thought about how a customer’s “response to email marketing” has anything to do with their job, you can’t expect them to know how much to value, care about, or utilize “open rates,” “click tracking,” “lead scoring,” and CLV. That’s Customer Lifetime Value if you aren’t familiar with it, and I bet some of you weren’t. And while you can quickly Google the term, that doesn’t mean a user–who doesn’t work in email marketing–can quickly understand how, all of a sudden, CLV metrics will help them do a better job at work.
Their goal may be to “decide how many widgets to buy next month” and they’ve been told the company strategy is “use data to back up decisions.” Do you see the gap? While we all have the capability to learn, the business may want this person’s expertise in procurement to be the focus of their job. That’s their domain, and they know it well. If CLV can help them, great, but your job isn’t to just “expose the CLV stats to them” and then hope it works.
In my opinion, companies that invest more in design to reduce analytics tooling overhead and cognitive load on end users will surpass those that do not. It’s simple: if the tools are more valuable, useful, and engaging, they will be used more and–gasp–maybe even depended upon. If you don’t have the resources to design better UIs and experiences for all the different users of your data products and analytics, then I might suggest you pick from the following:
- If you’re new to design, choose a high-frequency customer/user who has shown interest in trying to use data, but perhaps is struggling to be effective. Capture a benchmark if you can. Get to know this person, their job, their routines, and what would make them feel successful. Design some prototypes that seem to complement this user’s tasks/workflows, and test those with this person before you write more code. Involve them along the way, and then test again when you go into production as well. Follow up with them later to see if they’re still using the tool successfully, and then see if you’ve improved on your benchmark, whatever that was. (Be careful with “time spent using the tool;” more time is not necessarily better – in fact, what you probably want is “the least engagement required to provide meaningful insight.”)
- If you have more experience with design and can afford to tackle a harder project, consider deploying more design resources on an *expensive problem* to solve really well, giving these users’ pains and problems more design love so that you can really move the needle with your data and analytics solutions. This is in contrast to “spreading a little attention around to everyone.” It will help build credibility for your team, and you may be able to show significant ROI on the effort if you track some metrics before and after. More design love does not mean “use AI and ML.” In fact, I am specifically talking about using design to improve the experience of systems that still rely on “exploratory analytics,” requiring the user to do more eyeball analysis and tooling. You don’t have to solve the entire problem for the customer, but it could mean, for example, that you design a much more customized workflow for these users that other departments may not get [yet] because there is a greater potential for gain if this department is enabled with better tools. While you can always look at “savings” metrics as the goal of design (e.g. reduced labor costs, engineering costs, etc.), to me, the real goal is to show “growth.” What growth did you enable that wasn’t previously possible or easy to do with the old analytics solution?
One final note: not all design and UX outcomes are going to be economically quantifiable, and you have to create, or be in, an environment that lets you practice, fail, and try again. A customer goal such as, “Give me confidence to set our company’s pricing for next quarter more efficiently,” is not economic at its root. To the business, this goal is economic, but the human customer is likely thinking about other things–such as their job security, reputation, bonus, and credibility. In other words, for this user, the goal is very much emotional and psychological.
While later on you may be able to correlate this customer’s past-quarter pricing decisions back to the analytics that informed them, in the moment, this customer is looking for confidence while they’re using your service. I’m not talking about statistical confidence intervals here; I’m talking about the overall confidence, trust, and value they place in the software/service you crafted for them.
For many of you, this may sound like fluffy, gray, squishy stuff that is not just math, engineering, and data points.
You’re exactly right: it is very much that fluffy, squishy stuff.
It’s about human beings trying to make decisions.