Scott Friesen’s transformation into a data analytics professional wasn’t exactly linear. After graduating with a biology degree as a former pre-med student, he switched gears and managed artists in the music industry. After that, he worked at Best Buy, eventually becoming Senior Director of Analytics for the company’s consumer insights unit. Today, Scott is the SVP of Strategic Analytics at Echo Global Logistics, a provider of technology-enabled transportation and supply chain management services. He also advises the International Institute for Analytics.
In this episode, Scott shares what he thinks data scientists and analytics leaders need to do to become a trustworthy and indispensable part of an organization. Scott and I both believe that designing good decision support applications and creating useful data science solutions involve a lot more than technical knowledge. We cover:
- Scott’s trust equation, why it’s critical for analytics professionals, and how he uses it to push transformation across the organization
- Scott’s “jazz” vs “classical” approach to creating solutions
- How to develop intimacy and trust with your business partners (e.g., IT) and executives, and the non-technical skills analytics teams need to develop to be successful
- Scott’s opinion about design thinking and analytics solutions
- How to talk about risk to business stakeholders when deploying data science solutions
- How the success of Scott’s new pricing model was impeded by something that had nothing to do with the data—and how he addressed it
- Scott’s take on the emerging “analytics translator” role
- The two key steps to career success—and volcanos 🌋
Resources and Links
Quotes from Today's Episode
“Good analytics is highly structured. You might think of it like classical music, but truly great analytics is more like jazz.” — Scott
“If I'm going to introduce change to an organization, then I'm going to introduce perceived risk. And so the way for me to drive positive change—the way for me to drive adding value to the organizations that I'm a part of—is the ability to create enough credibility and intimacy that I can get away with introducing change that benefits the organization.” — Scott
“I categorize the analytic pursuit into three fundamental activities: The first is to observe, the second is to relate, and the third is to predict.” — Scott
“It's not enough to just understand the technology part and how to create great models. You can get all that stuff right and still fail in the last mile to deliver value.” — Brian
“I tend to think of this in terms of what you called ‘intimacy.’ I don’t know if you equate that to empathy, which is really understanding the thing you are talking about from the perspective of the other person. When we do UX research, the questions themselves are what form this intimacy. An easy way to do that is by asking open-ended questions that require open-ended answers to get that person to open up to you.” — Brian
Brian: I'm super happy to share my conversation with Scott Friesen, the SVP of Strategic Analytics at Echo Global Logistics. Scott and I originally met at a conference, where we found out we both have backgrounds in the performing arts. Scott's a really interesting dude, and his approach to making successful data products and analytics solutions is very much rooted in relationships, empathy, trust, and intimacy with stakeholders and the people his team is building solutions for. I wanted Scott to come on and share how he does what he does in order to get people to use the predictions and models his group is putting out, because at the end of the day, the technology itself is not enough to create value in the business. There has to be engagement in the last mile with the people he's creating these tools for.
So, Scott is going to share some of his journey and his approach to creating value with analytics and data science, and I think you're really going to enjoy this conversation. So, again, this is Scott Friesen, the SVP of Strategic Analytics from Echo Global Logistics. Next on Experiencing Data.
Welcome back to Experiencing Data. We have Scott Friesen on the line today from Echo Global Logistics. Scott, are you there?
Scott: I am here.
Brian: Sweet. How's it going? It's Friday.
Scott: It's going well. It's a beautiful day here in Chicago. It's a gold cup.
Brian: Nice. Excellent. I was watching an old SNL. My wife is Polish, and so I was like, "Oh, you have to watch Bill Swerski's Superfans from Saturday Night Live." She totally didn't get it. The sausages and beer are funny, but I'm like, "This is what American people think about Chicago?"
Scott: It wasn't culturally authentic, is that what you're saying? We didn't hit that target?
Scott: I actually went... my son had hockey tryouts last night, and I actually went to the Little Goat Tavern, or the... sorry, the Billy Goat Tavern, which is home of the famous John Belushi "cheeseburger, cheeseburger, cheeseburger, Pepsi, Coke" skit. That whole skit from the late '70s, early '80s comes from that actual place in Chicago. And I'd never been there before, even though I've lived here several years. So, I got a cheeseburger.
Brian: Excellent. Well, I would love to talk about sausage and grilling and all of that, but I think we have some cooler things to talk about with you-
Scott: That's a different podcast.
Brian: That's a different podcast, and let's narrow in on some analytics. So, we met each other, I think, at the International Institute for Analytics symposium.
Scott: That's right.
Brian: You have a music background like me and I thought it was interesting when we did the planning call for this, you talked about something that you tell your graduate students in one of the courses you teach. So tell people who you are and then I'd like you to repeat what you told me about how you think music relates to the craft of making data products and analytics solutions.
Scott: Sure. So, full disclosure, my undergraduate degree is from a liberal arts college. I went to Swarthmore College as an undergrad, and that definitely informs a lot of this holistic point of view that I have. I was a biology major and was actually originally pre-med, so I took a lot of natural science classes, but also took Shakespeare and so on. So, I had that sort of educational background. I then moved to New York City and got into the music industry, and I was an artist manager. I worked for Bobby McFerrin's management company. I started my own management company, represented a terrific band and a wonderful singer-songwriter, and had some successes there, but ultimately not enough success that that's what I'm still doing, because that also would be a different podcast.
But then I decided that the business side of the music business was more appealing to me than I had anticipated, and so I went and got my MBA at Columbia University in New York, and then joined the big corporate game. And I've had largely a background in retail. I worked for Best Buy for quite a while, then Ulta Beauty, the cosmetics retailer. I did a stint in consulting, and now I've found myself over in the logistics side of things. But, yeah, what you referenced about the music is something that I have this belief about: that great analytics is a lot like great jazz. Good analytics is highly structured. You might think of it like classical music, but truly great analytics is more like jazz, in that you've got a structure, a meter and key and lead lines and so forth.
That's the business problem that's presented to you. There's something that the business needs to accomplish, and then you've got your ability to play your instrument, which is your techniques, your skills, your bag of tricks that you've developed over time and that you're comfortable with. And then when it comes time to apply what you know against the backdrop of the business problem, you're trying to exercise, not only technical mental muscles, but also creative ones and come up with new solutions, new ways of seeing things.
So, yeah, I probably took that metaphor to an absurd extent when I taught business analytics to MBA students at Loyola, because I actually opened the class showing them a jazz chart, and then I used to play jazz music at the opening of every class to try to reinforce that message. But I think it also helps with making sure that the analytics industry, or segment of the world, doesn't come across as too dry, too unapproachable, too academic, because ultimately that impression actually impedes our success in terms of fact-based decision making.
Brian: Sure, sure. I agree with some of the parallels there. Like, you can learn all the technique and the theory behind jazz or any form of music, and you can then apply all that stuff with a lot of technical precision, and end up putting out a performance or record that still doesn't connect with people, so it's not enough. And I just heard this at a conference I was at yesterday: it's not enough to just understand the technology part, how to create great models, how to make sure the data is the right data. You can get all that stuff right and still fail in the last mile to deliver value. And that's what I wanted to talk about today: your work. So you're the SVP of Strategic Analytics, correct? At Echo.
Scott: That's right. Yup.
Brian: What are some of the things you're doing to ensure that your department's work matters, in the sense that it's actually getting used, and that instead of you pushing tools on people, maybe they're coming to you and saying, "Hey, come to us"? I assume that's the case, that there are some kind of relationships you've formed there. Tell us about how you do that.
Scott: Yeah, so a couple of things. I'm going to start at the most philosophical, and then try to work my way towards more tactical.
Scott: So, I just had a new guy join our team, and he just started at the beginning of this week, and so I did my orientation with him as I do with anyone who joins the team. And one of the things I shared with him, that I've been doing with every orientation for many years, is this framework called the trust equation. And I have to give credit where credit is due. My first boss at Best Buy, Kal Patel, was the one that taught this framework to me, but it's extremely helpful, especially for an analytics professional, to have an actual equation to diagnose change management and human interaction circumstances. So the equation is: trust equals credibility times intimacy, divided by risk.
And so if you dissect that a little bit and say that credibility comes from doing what you say you're going to do, doing it skillfully, being reliable and being useful, then that's how you gain trust: by showing up with credibility. The second piece is intimacy, and the thing about intimacy is, to me, there are sort of two parts. One is you're not just the name at the bottom of an email. People have to actually understand that you're a fully formed human being. There's a reason why business trips still exist. There's a reason why you and I met in person at the IIA symposium instead of just attending via web conference. That interaction matters. And the other part of it is showing the empathy to understand other people's needs and what's motivating them.
And this is where I think some of the last mile starts to fall apart, is the ability to really listen deeply to whoever your internal client is. If it's a business person or an operations person or a finance person or an HR person or whoever it is that has a need, putting yourself in their shoes, understanding what's motivating them, is about building the intimacy of that connection and that relationship and makes a massive difference in terms of the success of your work and the development of that trust. So those are sort of the things that are driving it up, driving trust up, is increased credibility, increased intimacy.
And then underneath credibility times intimacy, you have risk. And the thing about risk is there can be real risk and there can be perceived risk. I find that more often, what you're really working with is perceived risk. Sometimes the risk is real and you're changing something fundamental or something important, but other times you're asking someone to do things a little bit differently or operate in a different way, consider things differently, and those can come with a perception of risk that you have to be aware of. I had a finance professor years ago who said, "The nature of value creation has changed." And that always stuck with me.
And so what that means in the context of the trust equation is, if I'm going to introduce change to an organization or to a group of people or to an individual, then I'm going to introduce perceived risk. And so the way for me to drive positive change, the way for me to drive adding value to the organizations that I'm a part of, is the ability to create enough credibility and intimacy that I can get away with introducing change that benefits the organization.
Because if that equation flips and it becomes less than one, then the risk perception has become more than the buildup of my credibility and my intimacy, and now people aren't going to go along with whatever it is that we're trying to do. So it's a little heady, a little philosophical, but I do find that when I'm coaching members of my team, or when I'm diagnosing different situations where something's not advancing, trying to understand which of those puzzle pieces is not where it needs to be is a very, very helpful way of figuring that out.
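The trust equation Scott describes lends itself to a quick sketch. In this illustration, the 0-to-10 scoring scale, the function name, and the example numbers are all our own assumptions; Scott uses the framework qualitatively, as a diagnostic lens rather than a literal calculation.

```python
def trust_score(credibility: float, intimacy: float, perceived_risk: float) -> float:
    """Scott's trust equation: trust = (credibility * intimacy) / risk.

    A result below 1 suggests the proposed change feels riskier than
    the relationship can currently support.
    """
    if perceived_risk <= 0:
        raise ValueError("perceived risk must be positive")
    return (credibility * intimacy) / perceived_risk

# High credibility and intimacy can absorb a risky change...
print(trust_score(credibility=8, intimacy=7, perceived_risk=20))  # 2.8
# ...but the same change stalls when credibility hasn't been built yet.
print(trust_score(credibility=2, intimacy=7, perceived_risk=20))  # 0.7
```

The second call mirrors the analyst story Scott tells: intimacy was fine, but credibility wasn't there yet, so the ratio fell under one and the executive didn't budge.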
Scott: Yes. Yeah, exactly. Yep. Yeah, I'll never forget, actually, before I came to work at Echo full time, I was a consultant, and we were working on this one particular project in this IT system, and I was watching a business analyst from IT in a meeting with one of the senior-most executives. And I was observing this dynamic where he kept trying to land a message, and the executive was just not buying it. And I watched him approach it over and over, and it was hard to watch, because I felt like I was watching this young man just smack himself into a plate glass window over and over, and I didn't want that to be happening for him.
Scott: Painful. It was painful to watch. But what I observed in that case is that he had not established adequate credibility, and he was introducing something that the business leader perceived as risky. And so, I was watching and I asked myself, "Is this an intimacy thing or a credibility thing?" And in that particular case it was very clear, because I knew that the executive liked him individually. So it wasn't an intimacy thing. It was a credibility thing. He hadn't yet taken the time to either establish his credibility or figure out how to reduce the perception of risk in that situation. So, I have found it to be really helpful in terms of diagnosing some of those things. Then, as you translate that, so that's very lofty, like I said, liberal arts background.
So I liked the philosophy class that I took as well. But as you translate that down into a much more tactical arena, you find that this trust element translates in a lot of different tactical ways. So, one is a business leader asks for an answer to a question that may be deceptively simple. They ask why some number has changed or moved, and it seems on the surface like a very simple question, but it turns out there are a lot of moving parts that push on the numbers (that's true for most businesses), and trying to dissect that and pull apart what was actually doing it may involve some complex statistical modeling or other approaches. And so the ability to come back and land that information successfully, in a way that the executive believes and then can act on, again comes back to that trust element.
Scott: And then the other thing in terms of the last mile, and this is something I actually really enjoyed about that symposium we were at together: there were conversations about this notion of the last mile of analytics. There is the consultative aspect of what we do, which is what I just described, where a business person has a question and we do work to try to produce an answer or an insight to help them with that given situation. Or, and this is one of the things I'm loving about my current role at this particular company, there's the ability to operationalize or institutionalize, at a much broader scale, answers to particular things. In a lot of cases that's: what should this price be? Or what do we think this cost is going to be? Or what is this forecasted value? And that's where the last mile becomes very tactical and very technical.
I have to maintain a very close relationship with my partners in IT, and I feel like a lot of times we're probably like siblings, which is to say, we understand each other very well, we like each other a lot, and every once in a while we will fight the way siblings do. But there is a key interdependency between my function and IT, and that last mile of executing, of putting the results and answers of the models we've produced at scale into the hands of our hundreds of sales reps or operations people, is a big part of that. And so that's how I'd also translate that, to answer that question in terms of how we actually close the gap on the last mile.
Brian: Got it. And to make that tactical, so a couple of things here. I tend to think of this in terms of what you called the intimacy. I don't know if you equate that to empathy-
Scott: I tend to.
Brian: To talk a lot about that, which is really understanding the thing you are talking about from the perspective of the other person, the work they're doing, their job, all of that. And so would you agree that the way to get to that has a lot to do with how you ask questions and what questions you ask? It really can be as simple as coming in with really good questions to dig into their world; that itself is what helps establish the credibility. When we do "UX research," or this discovery process, or, there's a lot of different names for it, it's the questions themselves that form the intimacy that you're talking about.
I just, yesterday when I gave my talk at the Mini Boss Conference, I gave people, for data products and analytics solutions, here are a bunch of questions. And one of the main things with those is that they're open-ended questions. They're not, do you do this or do you not do this? They tend to be... they require open-ended answers to get that person to open up to you, and that intimacy can be formed simply through that questioning process. Do you agree or do you think there's other more important ingredients than just the question choices that go into forming that really tactically?
Scott: I think the question choices, I think you're exactly right that they're critical. I do actually believe there are two other elements. I don't know how I'd rank their importance, I would just say all three are important. The question quality and the nature of the questions you ask, you're absolutely right about. And it's actually one of the things that I like about your podcast and your lens on all of this: the intersection of design and analytics, because that's how I try to come at things as well, and to me, great design thinking begins with great questions and a great understanding of the subject of whatever you're seeking to solve for with your design. But there are two other pieces. One is, I've said at at least one conference that you need to care about other people's success, and you need to communicate that you care about other people's success.
And I say that almost as a prerequisite. It's like most folks will talk about the need to establish rapport. Whether you're a salesperson or a psychotherapist, before you dig in and get into all the questions, it's usually incredibly helpful to establish some basic rapport. And one of the things that I try to communicate in the earliest interactions with a business person is a certain degree of genuine interest and enthusiasm about what they do, and that I am interested in them being successful in that endeavor. Then when you get into the questions, there's less likelihood of them wondering why you're asking it-
Scott: Because now you've laid a base of, they understand that you want to know those things so that you can best help.
Scott: Not because you are trying to call into question their ability or something like that.
Brian: Sure, sure. I probably glossed over that in my head. That felt like: why do I want to spend an hour talking to you about your job? It's not so I can see if I can mechanize it and replace it; it's that I actually want to see if I can help you close more sales at a higher per-widget price.
Brian: It's talking about how I can help do that with some data. I'm not here to replace your job; I really actually think we can improve the time you spend on the phone. And, yeah.
Scott: Right, right. So the first piece to me is trying to establish or communicate in any way that you're concerned about things going well for them. Then the second is exactly what you said: well-formed questions, thoughtful questions, open-ended questions that are well designed. There's a third piece to me, which is trying to listen with minimal filtering. Which to me is also a great design mindset-
Scott: Principle. Try to remove your own biases, try to be aware of your own ladder of inference that might be producing filters inadvertently. Try to truly hear the person where they are and not be spending too much brain energy on over interpretation in the moment. And I've had over the years, some conversations with some folks that have left companies, and one of the most common things that I've seen when someone is talking to me about why they've chosen to leave the company, is that their manager, when they tried to talk to their manager about certain things they needed or wanted in their work experience, the manager wasn't able to really hear it.
And I think that that is, it's a very sympathetic circumstance because the manager has got all kinds of pressures for stuff that they're supposed to do. So they're listening to what the person is saying, but there's this whole other voice of the pressures that they're under to get certain things done, and therefore they're trying to make the person... what the person is saying fit that mold.
Scott: Which, when it goes badly, ends up with the person not actually feeling heard, or the steps not actually being taken to modify the circumstance, and then the end result is the employee leaves. And that is just one example. There are lots and lots of examples of people not really listening to people. Pretty much anywhere you can find a group of human beings, there will be any number of people not really listening to other people. But that to me is the full arc of the ability to develop that rapport. You used the word empathy. I think that's exactly right: really see where people are coming from.
Brian: Sure. One of the things we talk about when we do design research is separating observations from interpretations, and so when we do a usability study, oftentimes we'll open those up to people that... like a stakeholder who maybe thinks, "Nope, this is how it needs to be. I know the sales team, blah, blah, blah." It's like, "Well, let's actually see if the solution works for them." So we're going to actually test out whether or not they'll use this new pricing model and whatever, and you try to get them to separate, "On this sticky note, I want you to write down interesting things you saw, and on the pink sticky notes, you can write down your interpretations of what they mean." But they're not the same thing and I think that helps with that filtering that you're talking about. It's also about taking your solution hat off.
When I conduct design jams with clients, it's like: hats off. We're not here to build anything right now. We're here to just think really wide, pie in the sky, moonshot kind of stuff. If we could do anything, what would we do if time and money and technology weren't constraints? Without being ridiculous, but within some realm, like in the next five years, let's say. Even just start with something like that; we don't even know what might happen in three years with some technology. So it's taking off the implementation hat, the solution hat, which model I might use. "Oh my God, there's no way we have that kind of transaction data from five years ago. We'll never get that." Just forget it for a minute, because you don't know where that's going to spin and lead you to later on.
Just, it's not so much that like, maybe that data doesn't exist and everybody knows it, but when that gets thrown on the table, it may generate some other idea from somebody else, which creates another tangent, and that's kind of the divergent thinking. I'm kind of diverging a little bit here, but that's my point. It's you might hit on something that's really interesting that is somewhat feasible, and you wouldn't have gotten there if everyone is still, like in your case, it's Echo Logistics, I know the data warehouse, I know the systems we have, and you have this fence around everything you do. You don't bust out of... You don't come up with creative stuff that way typically. You're incrementally moving, and that's...
Scott: Well, in that case you're just playing the chart as written, right?
Brian: Yup. That's right.
Scott: There's no jamming going on in that case. Yeah, it's interesting you say that, especially about separating the observation from the interpretation. I really like that, because one of the things I talk about is, a lot of people have a lot of different definitions of analytics. For me, I categorize the analytic pursuit into three fundamental activities: the first is to observe, the second is to relate, and the third is to predict. And I can generally put everything that we do into one of those three buckets or a combination of those three buckets. And the observation to me in particular is very much informed by, well, it's to me a cross between how a scientist observes things and how an artist observes things. And if you had to say, "Well, what's the overlap there?" the answer would be: carefully.
They're going to observe it carefully, they're going to observe it both... they're going to try on interpretation and without interpretation. Almost like taking on and off the glasses that are filtering how they're looking, and great science and great art have both come from careful observations of things. And then potentially adding, applying an element of interpretation on it as necessary, but it begins with the purity of the observation.
Scott: And so, to me, that of course goes right into the relating. Once you can observe that thing, then you start asking yourself questions about, "Well, when this happens, does this happen over here? What's the relationship between A and B?" And then ultimately, we're usually after: can we predict that if this thing happens, then this other thing over here will happen? That's the ultimate. But you can't predict what you can't relate, and you can't relate what you can't observe.
Brian: Right, right. Is there a particular story or project or something that you've done at Echo where maybe you came in and the status quo was X, and, using some of these approaches, you ended up with Y? Maybe it was a redesign of some dashboard or some application, or maybe the data even stayed the same, but the way it was presented or used by the sales team changed. Something like that, where you applied some of these techniques, that you could tell us [crosstalk 00:29:27] about, the before and after?
Scott: Yeah. So, I have to be a little careful in terms of any sort of proprietary stuff, but what I can talk about, I think pretty easily, is when I first came to consult with the company, I was meeting one of the top executives, actually, my boss now, Dave Menzel, is our president and COO, and I was talking to him about analytic opportunities in the company, and I asked him what his thoughts were on various opportunities. He was talking about margin rate related to a particular area of our business, and I said, "Well, of course margin rate..." I came from this retail background so I thought, "More margin, that sounds great." I said, "What is your win rate right now in terms of when you have an opportunity to win and you make a pricing decision, what's your win rate?"
And he didn't know, because the company didn't know. And I said, "Well, that's going to be an issue with messing with margin. If you can't understand the relationship between margin and win rate... those two things push in opposite directions. So in order to maximize your total dollars, you're usually making trade-offs. Like economics fundamentals: you're going to sell more of the thing if you sell it cheaper, and you sell less of the thing if you sell it more expensively. So the question is: what is the right balance that allows you to sell as much as possible at the best price you can get?" And that's a case where there wasn't adequate observation. So the first thing we had to do was design a system that allowed us to capture information that had not been captured before. And so we did that, and we were able to actually observe wins and losses, win rates, and the conditions of the wins and losses, and all sorts of details about those.
From there we could then go on to start drawing all sorts of relationships. And we've done a tremendous amount of work since then on that foundation: understanding price elasticities, conditions of winning and losing, conditions of where there's higher pricing and lower pricing, and all sorts of really fantastic stuff. And then ultimately, it's led us to design some predictive capabilities that allow us to set the best price in the best situation, or to make a recommendation to a sales rep or to our customer. And all of that was built on a foundation of beginning with seeking to understand what was of concern to a leader, and then pulling on that thread and backing all the way up to the beginning to say, "In order to resolve this concern, what will we need to be able to observe that we can't observe right now?" and building that foundation from there.
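The margin-versus-win-rate trade-off Scott describes is, at heart, an expected-value calculation. The sketch below is ours, not Echo's model: the logistic win-rate curve, its `sensitivity` parameter, and the cost figure are all invented stand-ins for the relationships his team estimated from the win/loss data they began capturing.

```python
import math

def win_rate(price: float, cost: float = 100.0, sensitivity: float = 0.15) -> float:
    """Assumed probability of winning a deal at a given price: 50% at
    cost, falling off logistically as the price climbs above it."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price - cost)))

def expected_margin(price: float, cost: float = 100.0) -> float:
    """Expected margin dollars per quote = (price - cost) * P(win)."""
    return (price - cost) * win_rate(price, cost)

# Scan candidate prices: the best price balances margin per deal won
# against the deals you stop winning as the price climbs.
best = max(range(100, 151), key=expected_margin)
print(best, round(expected_margin(best), 2))  # 109 1.85
```

Under these made-up assumptions, the optimal quote is neither the highest price nor the one that wins every deal, which is exactly the balance Scott says the newly observed win/loss data let his team find.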
Brian: How did it...
Scott: It worked out great.
Brian: How did it... at some point, though, your executive is looking for the outcome from the effort. But there are human linchpins in the middle. In this case it's, I'm guessing, your sales team, or someone you would be dealing with if you're helping to set prices, for example. So how, and at what point, did you use some of these techniques to involve them such that they were on board with the big picture of the project? Because sometimes you hear this is where it fails. The executives set a strategy, but at some more middle layer of management, it's like, "Well, I don't really have time to work on this or get involved with it." And you're like, "No, we really need to ask you some questions about how you set prices when you do a sale." "Well, yeah, but I'm on the road this week..." Did you use some of these techniques to get them to believe in the outcomes, such that you could then go report back some good news?
Scott: Yeah, so everything you just said is all true, so let's just start with all of that complexity has been involved and there's been no small amount of campaigning on my part and the part of folks in change management, and internal communications and so forth, with the sales teams, with sales leaders on these topics. You hinted at the idea of proving that it's worthwhile. We did a pilot with one particular team on the West Coast that showed some fantastic success of using some of the tools we put together, and I've definitely shouted that from the rooftops as many times as I can, but envision-
Brian: But even right there, you just said you ran a pilot, right? So that suggests you went out and found you had some friendlies, I guess, as I call them.
Scott: Yeah, yup.
Brian: These kinds of partners that are like, "Hell yeah, we'll try that. We'll try anything right now," or whatever their incentive was. They were your friendlies.
Scott: Yeah, that's exactly right. What was wonderful about that group was that not only were they open to it, but they were actually an operations group technically, not a sales group. What that really meant in practical effect was that they were willing to follow an SOP. They were willing to actually follow a technique repeatedly instead of just shooting from the hip, which is both the blessing and the curse of your typical salesperson. On the one hand, shooting from the hip is exactly what you need salespeople to do, and on the other hand, it can make it really difficult when you're trying to run controlled experiments. But the team was very disciplined about it, and we were able to see absolutely outstanding results, and so we definitely shared that.
But what's also interesting, and I'm sure you know this, is that even after that, we did not have widespread adoption of these techniques, because the hurdle was still too high from a usability standpoint. Even though I could demonstrate that using this reporting and this technique would result in more money, it still wasn't getting the traction it should have. So we had to take another step in the solution, which was to make it much, much easier and much faster for them to actually engage in that approach and in the intelligence supporting that approach. So not long ago we launched a new technology set for salespeople that has dramatically lowered the effort hurdle they had before, and the adoption shift that's taken place in the last month has been pretty dramatic. It was like we finally got the last barrier knocked out.
So my point is, when you opened with "What about this element? What about this element? What about this?" It's all of this. It's true. You get the pilot team to prove that it works. You talk about how it's better for them, the "what's in it for me" for the sales reps. You talk to leadership about how it's good for the company and how it'll make the company more money. You do all that. And then on top of all of that, we still had to solve for making it simpler and faster and easier, which we did. And now, having piled all of that up, we've finally hit the tipping point and we're seeing mass adoption. It's taken a few years.
I would have loved for it to have gone faster, but this is a marathon you must win.
Brian: I have two follow-ups to this. One, and you can add some blurring to this to ensure you're not putting anything confidential out there, but could you help our audience understand what was an example of the thing that was blocking or creating usability problems, and what did you do to address it? That's one thing. Well, let me tell you the follow-up, because you may want to answer this first. The second one is: do you think you could've done something about that earlier, or did you have to wait until you were there to know that that was a barrier?
Scott: That's a good question. So I knew it was a barrier all along. It's just a matter of what we could get done ourselves. I view my team as an R&D function, and if you truly want to institutionalize at scale, the IT team has to do that, because they're the ones that own all of the core operational systems. And so to make it a little more precise, we had this report that we would email to our sales reps as a PDF attachment. That is not ideal. You could only send it once a week and it wasn't interactive, and so the use of it was limited. The upgrade fixed that in two fundamental ways. One, there's a digital version, a website version of the report itself, that's interactive. And then on top of that, the other problem we had was that we didn't actually allow them... we'd come up with a great way to analyze what was going on with sales; what we hadn't done was allow them an easy translation to set their profiles, their pricing, their account information up in a way that aligned with the reporting.
So what that meant was, they had to look at the report every single time they wanted to do a quote. Now, they can look at the report, they can set their settings, and then they don't have to go and pull up the report every single time they do a quote, because they've already used the reporting to set their defaults.
Brian: Got it.
Scott: So that's an example of what I mean. We made it interactive, we made it fast, and we aligned the nature of the decisions the report was guiding with how they can actually execute their settings. So from a design standpoint, now we have total alignment, whereas before there were multiple issues. One, sending this clunky attachment in an email; two, they had to pull up this PDF every single time they were going to do something; and three, it didn't really align well with how they actually set their pricing in the system. So those were all elements that helped to turn the-
Brian: Reduce friction and drive adoption.
Scott: Yeah, exactly. Yeah.
Brian: How did you know that that was the pain? How did you diagnose that this is blocking the wider adoption?
Scott: A lot of conversations with sales reps and sales leaders was really the number one thing, and it comes back to that listening. I also knew part of it from something else. Earlier I was talking about the trust equation, and change, and risk and all that sort of stuff, and there are times when you come with a new approach to things and folks try to shoot it full of holes. I never got that response when I was training folks on how to use this report. So no one was shooting it down in the moment, but then as we did follow-up phone calls or looked at the results of what was happening in the business, we could see clearly that the reports actually weren't being used. So there was something else going on.
It wasn't that they thought the reports were a bad idea, it wasn't that they felt the reports were threatening; on the contrary, they thought they were pretty cool. So then you've got to ask, "Why aren't they actually using it?" And it comes back to many of the design elements... well, it starts getting into the behavioral economics universe, Kahneman and Tversky and those guys: understanding the power of default settings, understanding the natural inertia that most people operate under all the time. As we started digging into that and figuring out how easy we could make it for them, we made it easier and easier, and now we're getting very enthusiastic engagement.
Brian: That's great. I'm glad to hear that story and to share it with our audience, because in this case, it sounds like the models and the analytics work were not the problem. So again, it's not always a technology problem. Even if you count the PDF and building a web-based interface as technology, yes, technically it is, but it's not data science technology. You can get all of that right, and if you don't deliver it in a way that someone's going to engage with, in this case a salesperson or ops or whoever, you fail. And then the whole initiative loses credibility, and it's harder to climb out of that hole the next time there's a big project. So, I'm glad you shared that.
Scott: Yeah. I think that's really true. And we also talked a little bit in our preparation about this last-mile element. It's not just about delivering your answer... one of the things I tell my team is that having the right answer doesn't mean you're done. It means you're just about halfway there, because if it doesn't actually get believed and utilized, then it hasn't really done anything. And I'm actually in the process of lining up a role that I haven't had before on my team, called an analytics translator.
And this is a role I hadn't really given much thought to until I read the recent Harvard Business Review article about building an AI-powered organization. They mentioned this role, and I got to thinking about it and realized that's actually how I spend a tremendous amount of my own time personally: explaining how models work, listening to what the business needs are, translating that into analytic approaches or predictive models that our data science team builds, and then encouraging adoption and use, explaining in a lot of cases, many, many times to many, many different audiences, how it works and how it's aligned with their worldview.
And what I realized, as we continue to grow our analytic competency as a company, was that my time doing that stuff was becoming a bottleneck and I needed help. I have certain expectations that our data science and analyst folks interact with the business, so I don't want them isolated and not talking to people. But the reality is that there are only so many hours in a week that I need to have them doing the communication stuff; they need to be spending a lot of their time continuing to develop the insights and the models and so forth. And so we're at a point now where, from a scale and pace-of-change standpoint, it made sense for me to add another role that's providing help in that respect. Honestly, that's the first time I've had that kind of role on my team, but I'm very excited about it and think it's going to be a great value for the business and for continuing to advance the analytic cause here.
Brian: Great. Yeah, I'm definitely hearing more about... I have my issues with the title. I feel like the title is not ideal, but we don't always control those things, but I think the nature of the work seems to make sense for some of the pains and problems that seem to be ongoing. So, I can understand how that would be valuable to you guys. Scott, this has been super great. I've really enjoyed talking to you and some of your stories here and how you're approaching this. It's been fantastic. So, I did want to ask you one last thing though. Do you have any closing advice for people in your shoes, people trying to make sure their data products and solutions are creating value, any closing thoughts?
Scott: Oh, I can give you my universal career advice.
Brian: What's that?
Scott: It's two steps. Number one, do good work. Number two, tell people about your work.
Brian: It's so simple.
Scott: It sounds so simple. How many people do you know that actually do both of those things?
Brian: Some people tell you about their successes and maybe they [crosstalk 00:47:29] though.
Scott: Yeah. Right. So that it doesn't sound too tongue-in-cheek, let me dissect that just a little bit. When I say do good work, I mean find a way to work on stuff that matters and do it well. So I don't just mean do the work you're assigned well; I mean seek out work that matters.
Brian: Meaningful work.
Scott: Try to avoid the work that doesn't matter. And then, on top of working on important and valuable work, perform that work well. So that's the first piece. The second piece is informed very much by my time, almost a decade, in the great state of Minnesota. I have a very soft spot in my heart for it; there are a lot of people there who are just wonderful, whom I care about. And I worked at Best Buy, which was a wonderful company that I loved, but it was a little bit hilariously true to the stereotype that they didn't ever want to make someone feel bad. So there was all sorts of weirdness around people putting their names on PowerPoint decks, because they'd say, "Well, I don't want to take credit for that. It was really a group effort."
And so one of the reasons I came up with the second statement, "Tell people about the work," is that if you are actually doing important and interesting work and you're doing it well, then first of all, I find that people are interested in hearing about it. But the second thing is, I'm not saying tell people about yourself. The fact that you were involved in the work is self-evident, based on the fact that you're the one presenting it. Talk about the work. Then it's not about you, it's about the work. And it helps sidestep that issue some people have where they feel uncomfortable tooting their own horn. If they've done something exciting, it can be pretty easy for a person who doesn't even like to promote themselves to be excited about the cool work that they did.
And so we do stuff here at our company where I put together events to give folks an opportunity to do that. A couple of years ago, we did something called Datapalooza, where we had all these folks who had learned SQL and built these incredibly cool projects, and we did it science fair-style. We had coffee and donuts in the cafeteria and had everybody set up their trifold posters where they explained what they did. Those sorts of things are not that hard to do, and yet they don't happen nearly as often as I think they could. So that's my closing advice. Do good work and tell people about the work.
Brian: Was there a volcano at that?
Scott: Someone did bring a volcano to our last one and it was totally awesome.
Brian: I love it. Where can people follow you and pictures of the volcano perhaps?
Scott: I don't know if I... well, I'll see if I can find a picture of the volcano. They can follow me on LinkedIn. I'm very easy to find on LinkedIn. Scott Friesen, last name is F-R-I-E-S-E-N. And, yeah, I think there are some YouTube videos around with some talks I've done. I keep telling myself I should write more things, but then I don't write them as often as I'd like to, but I'm-
Brian: Can I remind you of part two of the career advice?
Scott: I know. I do a better job inside of my companies than I do in the world at large. I should work on that version of it, that part two.
Brian: I for one would welcome reading more, so please do share more of your thoughts, and thanks for coming on Experiencing Data. This has been super fun.
Scott: I'm so glad. Thanks for having me, Brian. It was great.