Heads up! Registration for my March Seminar ends March 14
The public version of Designing Human-Centered Data Products only happens twice a year, and the next session begins March 15, 2021. Details/Register

Intergalactic data infrastructures != customer value

Ok, let's dive into another reader question. This one comes from Loris, who originally reached out via LinkedIn:


Hi Brian,

I hope you are well and don't mind me reaching out directly.

Human-centered design resonates with me and my experience with data so far. You have been an inspiration for me over the past few months - thank you!

In my last role as a Data Architect I worked with a team of engineers to migrate the analytics platform to Snowflake. I feel I was the only one in the team advocating for data discovery sessions with the users, to really understand their needs and use cases before writing any code. Eventually that led me to change course, and now I run my consultancy company here in Sydney focusing on enabling analytics (data architecture and analytics engineering).

The pattern I see is that at some point a company (mostly startups and scaleups so far) decides to use their data. The CFO or CTO is put in charge of hiring and leading a data team, typically following the Spotify model to build a data platform.

In part because of lack of vision, in part because this is considered "technical" work, teams tend to skip the design part and start working relentlessly to ingest and model data into the warehouse. Engineers are under a ton of pressure to deliver something tangible for the business, which is hard when progress is measured in number of lines of code. Eventually the business loses trust in the project and everyone feels miserable.

I wonder, have you found a similar trend in SMBs and data-immature organisations?

Does it get better in larger organisations?

What type of organisations resonate more with the type of work that you do?

Just a curiosity.

Cheers,
Loris


My take?

Just buy Snowflake - it will fix everything.

Just kidding.

Actually, one thing I think I'm hearing more from leaders (which is GREAT) is that it's not the technology that's the problem. "The technology part is easy." Well, it is -- if you're focused on outputs instead of creating outcomes. Every year, it gets faster and easier to deploy more tech and tools. The question is, are we getting better at delivering value within data-driven products and applications? Are our software/data products being used more (or even less, which may actually be the real goal)?

I think part of the problem here is that some technology leaders still see things like design, storytelling, and user experience as "soft skills" -- something secondary and "nice to have." They may have come up through a business or engineering background and see the world through the lens of technology or numbers. "Individual customers" seem too granular—not enough "data." So, the customer gets abstracted.

The sharper data leaders see non-technical skills as essential, particularly if they're trying to change a culture from the old way to the new way (as is the case with many technology companies that disrupt or create new categories).

Within this category of "sharper," I'd say there are a couple of levels of maturity:

1) Basic: leaders see the importance of design and design thinking as a risk-reducer for engineering/technical work. They've danced the dance enough times to know, "hey, this matters. If we don't get this right, everything else fails. We need some of that to get this right."

2) Advanced: these technology company leaders—typically the ones who are digital natives and have led a successful software company before—understand that design can be a source of innovation and strategic advantage. They're generally accelerating their learning through rapid, smaller iterations, they try to show customers value earlier, and they try to design for easy sales/adoption and retention. They see design as everyone's job—and they want their technical leads thinking about customer value and experience too. They may be leveraging design for both end-user facing services/products/applications as well as internal tools and enabling technology. They're thinking about things such as design thinking applied to ML Ops (link courtesy of Ahmer Inam, the Chief AI Officer of Pactera Edge who was on my podcast for episode 20).

One other takeaway.

When your data-driven software IS the company, or it has to produce revenue, I think the story changes and you're playing a more advanced game.

This is partly why I encourage internal teams to think about their [free] internal data-driven solutions as "data products" and not projects.

I ask them, "what would you change if customers had to pay for this service and you were being scored on how much revenue this application brought in?"

As to Loris' question about company size:

I think smaller companies tend to get this.

Leaders at larger organizations are often quite disconnected from the products and solutions the people under them are tasked with creating. Many are NOT in the mindset of people like Elon Musk and the ideas he shares here—like this one: "Most leaders spend much too much time on things like finance and management and too little time on the actual product or service they're providing."

So, in general, I think smaller companies—particularly those who have product-oriented leadership—tend to "get this stuff right" earlier. But, it's really about WHO the leaders are and what the culture is. Is the culture about technology? Or about the outcomes the technology can enable?

I've had Fortune 500 clients who built software with a relentless focus on engineering and very little regard for the customer value or UX quality.

Why?

Part of it is that it's easier to measure the easy-to-measure things, and people don't want to hear that an initiative may be WAY off the mark.

They don't want to hear it's a year behind.
They don't want to hear that early technology choices made 6-12 months ago are now impediments to the user experience and value that they want to create "on the front end."
They still see the front-end, last-mile part as "something small" relative to the massive plumbing and architecture required to enable the human experience in the last mile.

The larger the company, the easier it is to measure the wrong things as project success metrics. Lines of code, number of models or reports created, Salesforce releases, Scrum/agile/project mgmt team metrics, and the like.

Why do we count these things?

Because the leadership has not yet defined what "good job" and "home run" actually mean from a customer perspective and how the org will measure progress against those criteria along the way. Like literally down at the level of individual humans—since after all, governments and companies are really just collections of people. There is no "sales department." There are just sales people.

So what to do about this, Loris?

Here are two ideas:

1) If you're upgrading or redesigning a data-driven application and worried it's not headed in the right direction, try to get your leadership exposed to the struggles of users using the old solution (or your new design prototype if that's in the works). Yes, individual users. Don't focus on quant data for now, such as analytics on your analytics adoption. "27 out of 52 people in marketing used the dashboard at least once last week" might sound informative, but it's probably not going to have the same impact as showing Sam from sales using your data product and struggling to get the promised value out of it. You can also conduct your own assessment/audit of your service and share the findings—relating them back to the overarching goals your leader cares about (download my free DFA Self-Assessment Guide)—or get outside perspective and help.

2) If you're creating a net new data product/application, get the leadership team to establish some benchmarks for UX quality that relate to the business objectives. In other words, "how will we know if we did a good job?" What will we measure, when will we measure it, and how will we measure it? What's the difference between a base hit and a home run for Sam in sales? How will Sam's life be improved if we get this right? This is what I do with clients in my Design Strategy Roadmapping service, but you can do it yourself too. The main thing to get right is to get a commitment that the team is actually willing to make changes if UX feedback captured along the way is saying, "this isn't what we want/need. We don't get it. Why would I want/use this?" A strategy roadmap is a map, but it is not the territory. It's a direction to head in—and validation is needed along the way.

And finally, regarding the question about which orgs resonate most with the work I do: the answer is at the top of my work-with-me page.

Hope that helps!


Photo by Michael Dziedzic on Unsplash

More Free Insights: