014 – How Worthington Industries Makes Predictive Analytics Useful from the Steel Mill Floor to the Corner Office with Dr. Stephen Bartos

Experiencing Data with Brian T. O'Neill

Today we are joined by the analytics “man of steel,” Steve Bartos, the Manager of the Predictive Analytics team in the steel processing division at Worthington Industries. 😉 At Worthington, Steve is tasked with strategically driving impactful analytics wider and deeper across the division and, as part of this effort, helps ensure an educated, supported, and connected analytics community. Steve also serves as a co-leader of the Columbus Tableau User Group.

On today’s episode, Steve and I discuss how analytics are providing internal process improvements at Worthington. We also cover the challenges Steve faces designing effective data-rich products, the importance of the “last mile,” and how his PhD in science education shapes his work in predictive analytics.

We also talk about:

  • Internal tools that Steve has developed and how they help Worthington Industries.
  • Preplanning and its importance for creating a solution that works for the client.
  • Using analytics to inform daily decisions, aid in monthly meetings, and assist with kaizen (lean) focused decisions.
  • How Steve pulls out the meaningful pieces of information that can improve the division’s performance.
  • How Steve tries to avoid data-rich, insight-poor customer solutions.
  • The importance of engaging the customer/user throughout the process.
  • How Steve leverages his science education background to communicate with his peers and with executives at Worthington.

Resources and Links

Twitter: @OlDirtyBarGraph

Steve Bartos LinkedIn

Quotes from Today’s Episode

“Seeing the way analytics can help facilitate better decision making doesn't necessarily come with showing someone every single question they can possibly answer, waiting for them to applaud how much time and how much energy and effort you'd saved them.” - Steve Bartos

“It's hard to talk about the influence of different machine parameters on quality if every operator is setting it up based on their own tribal knowledge of how it runs best.” - Steve Bartos

“I think bringing the question back to the user much more frequently, much sooner, and at a much more focused scale has paid dividends.” - Steve Bartos

“It's getting the people that are actually going to sit and use these interfaces involved in the creation process… they should be helping you define the goals and the problems… by getting them involved, it makes the adoption process a lot easier.” - Brian O’Neill

“It's real easy to throw up some work that you've done in Tableau around a question that a manager or an executive had. It's real easy to do that. It's really difficult to do that well and have some control of the conversation, being able to say, here's what we did, here was the question, here's the data we used, here's how we analyzed it, and here's the suggestion we're making, and now let's talk about why, and do that in a way that doesn't lead to an in-the-weeds session and frustration.” - Steve Bartos

Transcript

Brian:  So excited to have Steve Bartos here. Steve, what's happening? You're the steel man.

Steve:  I am the steel man. Not a lot is happening. Actually, we're getting ready to move over to a corporate function. So, I may be the steel, pressure cylinders, and engineered cabs man, but for right now we'll keep it Mr. Steel Analytics.

Brian:  That's awesome. So, Steve, correct me if I'm wrong, you manage the predictive analytics group at Worthington Steel out in Ohio. Is that right?

Steve:  That is correct.

Brian:  Nice. So, tell us a little bit about your background. Like, what are you doing with steel? What are you helping your colleagues do with steel is probably a better question.

Steve:  From our team's perspective, we're a very small team right now. There are three of us in the steel processing division. One of our folks is sort of a wizard around forecasting. We have what we call our crown jewel of analytics in our automotive forecasting world, and she's been the keeper of that, the improver of that, the nurturer and the cultivator of all things. That's primarily for our automotive group, which is about 50% to 60% of our business. It's a model that was built five, six, seven, eight years ago, and she rebuilt it. That's one area. The second area is another member of our team has been working around pricing. We call it a price elasticity model, but our business is typically more of a bid response: folks send us quotes, and we try to fulfill those quotes.
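A bid-response setup like that maps naturally onto a simple win/loss classifier. Below is a minimal sketch in Python, not Worthington's actual model: it fits a logistic regression to historical quotes and scores a new quote at a few candidate prices. The file name and columns (historical_quotes.csv, quoted_price, market_index, volume_tons, won) are hypothetical stand-ins for whatever parameters the real quoting data carries.

# Minimal win/loss bid-response sketch; all names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

quotes = pd.read_csv("historical_quotes.csv")   # past bids, one row per quote
features = ["quoted_price", "market_index", "volume_tons"]
X = quotes[features]
y = quotes["won"]                               # 1 if the bid was won, 0 if lost

model = LogisticRegression().fit(X, y)

# Score the same prospective quote at several candidate prices to see how
# win probability trades off against price, the heart of a bid-response view.
candidates = pd.DataFrame({
    "quoted_price": [780.0, 800.0, 820.0],
    "market_index": [1.02, 1.02, 1.02],
    "volume_tons": [40, 40, 40],
})
print(candidates.assign(win_prob=model.predict_proba(candidates)[:, 1]))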

Steve:  We see what we win and lose and how we may be able to improve our pricing based on certain parameters. And I've been working primarily for the last year or year and a half with, initially, our folks in quality, so, sort of getting to our shop floor analytics. We have vision inspection systems on about five of our lines. So, how do we take those data and make them more actionable, first for our quality group? Now we're finding additional applications, and that led to what we like to call the birth of big data at Worthington Industries.

Steve:  So, how do we take some of these very old, very expensive operations, machines like a rolling mill, basically reducing the gauge of steel? Currently we have some software that's wrangling 500, 600 machine parameters, and how do we take that and surface it to our operations group, our quality group and our maintenance group to help them make better decisions? But again, that's in its infancy, and I think we'll probably touch on some of the challenges around that in some way, shape or form in this conversation. So, that was a quick overview, the quick tour.

Brian:  Cool. So, at some point when we were talking about having you on the show here, you had mentioned data-rich and insight-poor. This is obviously a common thing with people dealing with analytics. So, how do you make sure that you're not insight-poor? Obviously you're trying to give people what they need and help them figure out how to get more value out of their work for Worthington, but at the end of the day, I imagine that means helping someone set a value here, tell someone what a price range should be, or whatever it may be. How do you do that? What do you do to make sure that you're delivering value to them in that, what we call, the last mile, right?

Steve:  No, absolutely. That's where ... and there's probably a handful of examples I can give, recently with the operations folks. So, take this tandem mill: they're managing a team that, day to day, shift to shift, is trying to hit certain targets. Hey, we're running this material through this operation, we need to reduce it to X width. We then have operators running what they consider best practice, running their material, and they get some indicator of how well they did, or they don't get an indicator of how well they did beyond just a quick gauge chart. And the challenge is, first, it's impressive how well they're able to run such a complex operation. If you consider the 500 or 600 machine parameters, plus all the inputs of the material that we're running, where we sourced it from, how it's shaped, its chemistry, their ability to do that well is extremely impressive.

Steve:  But where we get to the insight-poor piece is we have operators, and it's surprising to me, I've been with the company for four years, but they'll say, 'Hey, we would like to be able to see how we're performing hour to hour, shift to shift, see how this operator does against this operator, see how our performance has changed now that we are sourcing our raw material from a different vendor.' What I would consider fairly fundamental questions, nothing predictive, nothing prescriptive, very descriptive, and they don't necessarily have that ability. Or that ability is a function of maybe a local electrical engineer or someone that's tasked with managing their level two systems, and maybe they know a little something about how to pull that data out of those systems. In one case, we have a gentleman whose goal is to learn more about R, and he's throwing it into some Shiny apps for some very specific questions. But broadly, it's that gap for folks managing their day to day business.
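Those "fairly fundamental" descriptive questions take very little code once the level-two data is accessible. As a minimal sketch, assuming a flat export of line readings with hypothetical columns (timestamp, shift, operator, gauge_actual, gauge_target), the hour-to-hour and operator-to-operator views might start like this:

# Descriptive first cut of the operators' questions; column names are
# hypothetical placeholders, not the plant's actual data model.
import pandas as pd

readings = pd.read_csv("mill_readings.csv", parse_dates=["timestamp"])

# How did we do hour to hour? Absolute deviation of actual gauge from target.
readings["gauge_error"] = (readings["gauge_actual"] - readings["gauge_target"]).abs()
hourly = readings.set_index("timestamp")["gauge_error"].resample("h").mean()

# How does this operator do against that operator, shift to shift?
by_operator = (
    readings.groupby(["shift", "operator"])["gauge_error"]
    .agg(["mean", "std", "count"])
    .sort_values("mean")
)

print(hourly.tail(8))
print(by_operator)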

Steve:  There are instances where it's better than others, but that's really where you look at the amount of data that we're generating: what percentage of that data are we acting on? And maybe it's simply at, you know, the HMI of an operator. What percentage of that data are we capturing, and then what percentage of that are we strategically connecting with other data, or connecting to the questions that the folks on the shop floor actually have? So, that's been eye opening, a tremendous opportunity. But the challenge is, how do you navigate that landscape? Like you said, that last mile, to really understand, what is it they're asking? What is it they need? And being able to deliver on that in a way that they can take some action, or that fits with their strategic goals or their initiatives or what they feel is important.

Brian:  So, I'm curious, how do you go about figuring that out? You've got the sales side of the business, and then you actually have the machinery and the people, right, that are operating the machinery to create the products that you then eventually sell. Do you use a different process because there are different end stakeholders? For example, the people that are on the shop floor, are they curious about their own work and their colleagues'? Or is it more like a manager of this group, who may not actually be on the shop floor day to day, is curious about team performance? You know, is the way Jane does step five on the mill actually better than everyone else's? Maybe we can use whatever Jane's doing to help everyone else on the team run step five better, because she's figured out some way to do it. Is it more of a managerial thing, or is it actually the end users themselves running this machinery that you're trying to help?

Steve:  I think it's pieces and parts of both. My initial response when you began posing the question was around the exposure at the manager level. Let's say it's an operations manager who would oversee all the operations, or maybe it's a manager of a specific operation at the facility. Their previous exposure to data sort of shapes the narrative, and it shapes the expectation of what they feel can be done with the data and how the data can better serve them. I think from my side, the challenge is different than what I may think ... I guess it's orienting yourself to the needs of that person, whoever that person may be, and not getting carried away with what I feel the best application is, what I feel the best, whether it's a dashboard, an analytics decision support tool, whatever that may be.

Steve:  But I think understanding that ... what we've learned is folks want to see the data. Shortening that iteration cycle and not being as aggressive right out of the box has served us well as of late, because we've learned the lessons of going away, maybe spending time with a single subject matter expert, catering to their wishes and what they feel is valuable, and then, whether it's up the hierarchy or down the hierarchy, you start serving it up. Really, what it comes down to is someone saying, I just want to see how we did by hour, by shift. It's great that you want to go ahead and correlate some of these machine parameters with our performance and start to show where, when we use standard deviations, we tend to drift from our target. That's all great. How did we do an hour ago? Can I just see that? Can I just get some fundamental understanding of the performance of the machine around some of the variables that I think are key? So, really walking that back.

Steve:  It's no shortcoming. The acceleration tends to be greater when, out of the box, we come in a little more fundamental and just meet them where they're at, delivering some simple insights, or even just, like I said, these descriptive visualizations of the performance. I think that's been something we've learned, and it's been reinforced here for folks who have never seen their data. Sometimes it's, let's take a step back, let's slow down and really meet the consumer where they're at to help them ask more questions, different questions, questions that we would see progressing from the rear-view mirror to sort of the current state, and from the current state to the world of the possible.

Brian:  Right. So, I'm curious, you mentioned something about going in too aggressive. Was there a learning moment, a teachable moment, a story where you found yourself maybe keeping your team in check from the way it used to design these solutions? Instead you're like, hold on a second, they don't need that entire world. They really just want to understand, what was the tolerance range for whatever value between 12:00 and 1:00 earlier today? Was there a moment where you changed a process, or learned something about designing that solution for the end user that's different than how you were doing it before?

Steve:  Yes, and I think that was driven in large part by, I guess, us becoming a little more mature, turning the microscope on ourselves, and looking at, of the tools that we've released, let's say we're using Tableau, and we actually built a performance monitoring tool: who's opening these dashboards? What pages, what views are they going to? How frequently are they going to those? And I refer back to working very closely with a single subject matter expert on a certain project, and he's fantastic. Super bright guy, very analytical. He's on the quality team. He oversees our vision inspection systems and the tools we were building. So, we had just set up a data lake, and he's 10 steps ahead of everyone else in terms of how we could use the data with that group. And so, we go from simple performance monitoring to, "Hey, let's see how our suppliers are doing. Let's see how we're ..." and just all these iterations, all these questions, different types of questions, and clustering different dashboards in different ways, way out in front of it.

Steve:  But then, when we stepped back and built the tool to actually monitor performance, what you see is a handful of folks migrating to a single view, just that single performance view. I guess what that reinforced, from my standpoint, is meeting the user where they're at, being a little less aggressive about answering every question out of the box that you may potentially have, and being more aggressive about a single view, a single question, just those tighter iteration cycles. I felt like we built some great tools, and slowly, month after month, we start seeing the need or the pull from the business, as opposed to us pushing out six different tools with multiple views. You start seeing, hey, so-and-so from a certain facility is starting to see a quality issue related to what they think is a change in sourcing. Oh wait, we already had this tool built, because we built it six months ago, but no one was ready for it. Seeing the way analytics can help facilitate better decision making doesn't necessarily come with showing someone every single question they can possibly answer, waiting for them to applaud how much time and how much energy and effort you'd saved them.
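That "turn the microscope on yourself" step is easy to illustrate. Tableau Server does keep records of view traffic, but the sketch below just assumes a flat export of view events with hypothetical columns (user, workbook, view_name, viewed_at); the point is how little code it takes to discover that a handful of views dominate:

# Dashboard usage audit over an assumed export of view events.
import pandas as pd

events = pd.read_csv("dashboard_views.csv", parse_dates=["viewed_at"])

# For every view in every workbook: how often it's opened, by how many
# distinct people, and when it was last touched.
usage = (
    events.groupby(["workbook", "view_name"])
    .agg(
        opens=("viewed_at", "count"),
        unique_users=("user", "nunique"),
        last_opened=("viewed_at", "max"),
    )
    .sort_values("opens", ascending=False)
)

print(usage.head(10))  # per Steve's experience, a single view often dominates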

Brian:  Right. Sure. I'm curious about the SME that you're ... I call them SME, the subject matter experts.

Steve:  It's actually his first name. It's very ironic.

Brian:  His name is SME? Wow! Kind of actually like, wow, that's pretty good, that's pretty good.

Steve:  That's very good. Very ironic.

Brian:  But I'm curious, is SME your customer, or is SME translating requirements on behalf of someone that's actually opening up Tableau or whatever software you're making? That's what we would call a proxy, right? From the design standpoint, it's not the horse's mouth, it's kind of a proxy for it. And especially with someone that might be able to see that 10x world of what could be possible with all the data that's there, versus someone that's actually going to pull the trigger on an insight that was gleaned and do something with it. Because ultimately, all this stuff doesn't matter if no trigger is pulled at the end of the day. So, how do you get access to that end user, or are you really building this for SME, or whatever his or her name is?

Steve:  Yeah, I think ... he's also been, I guess, in part the subject matter expert and almost like a quasi product owner as we interact on this recent project with the tandem mill team, the team that is in charge of the operations of this rolling mill. So, he's in part consumer, because as a member of the quality team, clearly the performance of that mill is something of interest to him. But he's helped, and I think we've had these conversations around, wow, we came out of the box really aggressive with that previous project. And we joked about, I guess the analogy I would make, and anyone that's spent any time building any visual analytic tool knows this, we just had a conversation around it with a group a little newer to Tableau, but it's that last mile, it's that last 1% that you have to get right if you want it to resonate. Even if you're just presenting a cursory first cut of an analysis, how you simplify that and make it more focused helps drive the conversation, helps drive the person that's never seen it, right?

Steve:  So, you spend hours and hours and hours building, and you're going to open it up to someone for a first-time view; you've got to get that right. And I think we joked about how this is what he thinks about all the time. He's looking at a view that maybe I built for him and thinking about the ways we can iterate on that and improve it. So, he's sort of serving that overlap between the end user and the subject matter expert, sort of a liaison for us. The end user has never seen these tools. They're thinking about the issue. The issue is, hey, we're having some performance issues and quality issues. They haven't thought 10 steps ahead about how we're going to build a tool that's going to directly impact that. So, I'm not sure if I'm answering your question, but I think he has helped guide us through his understanding of the process, of that world. Where I think we've both agreed is that still having those touch points with really the analytics novice, a novice consumer, is important if we're going to get it right out of the box and start creating that pull.

Brian:  Well, ultimately, again, the last mile: we have data lakes and we have predictive analytics and machine learning, we have all these things, right? And at the end of the day, you're probably going to spit out a PDF, a web page, maybe a mobile application, and it all comes down to that moment of, did that person at the last mile understand it, find it insightful, and then pull a trigger and do something about it? There are exploratory types of analytics too, but it sounds like you're kind of more in that declarative space, like people have specific questions in mind that they need to get answered. So, I'm curious, did you guys ever get involved with showing them low fidelity sketches, or bring them into a whiteboarding session and say, hey, would this type of analytic or this type of chart, or whatever it may be, help you answer that question that you have, prior to building anything? Or do you typically wait until you actually have some kind of prototype in Tableau? Or is it kind of like you're not really sure, so you build, then you show it to them, and then you wait for that feedback and iterate from there?

Steve:  No, that's a great question. I think when I talk about our change in approach, we have tried to pay much more attention to that critical pre-planning stage. Let's really drill into the question. When you say, can you help me with X, let me really understand what that is and let me try to break that down, let me go fully down to, like, a MECE single question, mutually exclusive, collectively exhaustive. If you get there, great. But really understanding the need and the question, and then what we've tried to do, whether we storyboard it or we have representative work that we've done before ... As they're saying, hey, you're going to surface these 90 machine parameters, can you help us tell which ones we have to pay attention to? And then, hey, I heard you say that. This is what we did with the maintenance group on this current project: we had built something that we felt could be applied in that way, and we pulled that up and said, let's talk about this. How could we shape this view to meet the need around that question?

Steve:  So, doing much more pre-planning. Again, that's something we just talked about today in a meeting: before you're in there dragging and dropping, really understand the question, really understand everything, the constellation of other ... whether we call them filters, but what else could be coming into play on this situation that you're going to need to filter out, as far as the signal from the noise, and then iterate more tightly. Even once you think you have something, come back and say, hey, here's what I'm getting ready to carve out. Does that seem to apply? What am I not thinking about? And that's where we'd use that subject matter expert at times as a proxy, but sometimes I think bringing it back to the user much more frequently, much sooner, and at a much more focused scale has paid dividends.

Brian:  Right, right. Are you guys physically located in the same place such that you can have ... when you've had these encounters, are they over the phone, or do you actually get together?

Steve:  So, our corporate headquarters, our IT headquarters, and this facility are all located here in Columbus. What we have worked with our BI group on developing, and they've done a fantastic job, is ... we're big in transformation and lean, and we have kaizen events, which are these week-long events where everyone gets locked in a room, wherever that may be, to start solving business problems. They've taken a similar approach with how they develop their BI dashboards, particularly the role-based dashboards. So, they'll start 30 days out. Let's say it's a group of supply chain managers: they get that group, maybe initially on the phone, 30 or so days out, and do some pre-work around understanding their challenges, their opportunities, their pain points, and then work through the specific questions they're trying to answer, trying to cluster those questions. Maybe, hey, these are your questions about our suppliers, these are questions about our WIP and our velocity through our plant, these are our questions about our end customer. And then they use that to start whiteboarding with some of those folks.

Steve:  So, you go into a week-long event and, for most of it, you're iterating on mockups, whatever you want to call them, stuff that's already been built, and maybe you have a touch point in the morning and you're basically building views in between. The end of the week should be, hey, we finalized some views, let's let folks go out and beat it up for a month. We'll funnel any requests through, like, the dashboard champion, or if there are multiple views, it might be, hey, here are our folks for the raw materials side, they're going to be a champion, and any requests are going to funnel through them. And then hopefully at the end of 30 days you have a finished product: 30 days before, a week's sprint, 30 days after.

Steve:  Now we start monitoring performance, having 60-, 90-, 120-day check-ins, and monitoring where folks are using it and where they're not. Is it a training issue or a design issue? One piece they've added to this, which I think we need to do a better job of, is what we call going to the gemba, understanding the need of the user, and doing that as a follow-up: how are they using it? Where can we improve upon it? Sort of like a day in the life. There's a day in the life before, to understand how the folks are making decisions, but then there's also a day in the life after, to see how the tools are being integrated into their roles and responsibilities, and where there may be opportunities for improvement.

Brian:  Got it. So, it sounds like you guys have a tight feedback loop there, involved both in the design process as well as the kind of iteration that happens after you have a first living thing, a tool that's actually, theoretically, working and providing its insight. So, that's cool. It sounds good that you guys are out there. For example, the mill operators: what is their environment like? I'm picturing molten steel and a laptop out there. Are they going back and forth between an office and a kind of industrial environment? Do they have access to screens and reporting near the tool? I'm wondering how tightly coupled it is when someone says, I need hourly analytics; is it practically real time, or just barely lagging real time? Are they wearing gloves? Are they using a tablet? What's that experience like?

Steve:  So, currently, what I've been talking about a bit is just that performance management piece. They are not making tactical, hour-by-hour, real-time decisions from the analytics team's tools. That's the desired end state. Or, I guess it's not the desired end state, but it's certainly an avenue where we see great opportunity. Currently it would be more around that performance monitoring piece. So, for example, we've been having a conversation around operator-to-operator performance. Being able to look at, hey, when we're running this thing ... we call it a finished good, it would be like a SKU, we know what the target is, the target is X, but it looks like when Bob sets the machine up, he sets it up a certain way, Bill sets it up another way, and Joan sets it up another way.

Steve:  Having the data, which we've recently done, to start to normalize or control for operator variability. Hey, why are we setting this up differently? And having the data, having a conversation with it, which is something that we just did with the group of all the operators. They were looking at the highest earning finished goods and how we set those up and why people set them up differently, and then pulling up the data and having that group, collectively, around the data say, based on the data, let's set it up this way. And then monitoring the performance, and if it doesn't perform well, making a change. What you've done then is stabilize the environment. So, it's hard to talk about the influence of different machine parameters on quality if every operator is setting it up based on their own tribal knowledge of how it runs best.

Steve:  One of the first challenges is, let's take all the finished goods, let's understand how the machine should be set up, and then we can look at when we start deviating: is that something about the incoming material, or is that something about the health of the machine? That's where our subject matter expert has been fantastic, starting to use the data to drive that conversation. And what we did was, you think of all these veteran operators that believe that their way is the best way. I mean, we thought for a while, it was a week-long event, are there going to be fist fights? Are people going to take this very offensively, like, I do it my way? And then what they said at the end was, well, you showed us the data, you showed us how we all set it up, we changed a lot more than we ever thought we'd change, and you showed us how we performed. So, it's kind of hard to argue.
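As a minimal sketch of that stabilization exercise, assuming a hypothetical export of setup records (mill_setups.csv) with placeholder parameter columns, you could quantify how differently operators set up each finished good and rank where a collectively agreed standard setup would pay off most:

# Operator setup-variability sketch; all file and column names are
# hypothetical placeholders for the mill's real setpoint data.
import pandas as pd

setups = pd.read_csv("mill_setups.csv")
params = ["roll_force", "entry_tension", "line_speed"]

# Per finished good, per operator: the median setpoints each person runs.
# This is the "Bob sets it up one way, Joan another" view.
by_operator = setups.groupby(["sku", "operator"])[params].median()

# Coefficient of variation across operators, per parameter, so parameters on
# different scales can be compared, then averaged into one spread score.
grp = by_operator.groupby("sku")[params]
spread = (grp.std() / grp.mean()).mean(axis=1).sort_values(ascending=False)

print(by_operator.loc[spread.index[0]])  # the most contested finished good
print(spread.head())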

Brian:  Sure. I think what you're doing, from a design standpoint, especially if you're developing internal tools, right, is getting the people that are actually going to sit and use these interfaces involved in the creation process, minimally from a research standpoint to understand the problem. But the more they understand the goal ... well, they should be helping you define the goals and the problems. By getting them involved, it makes the adoption process a lot easier. I hear a fair amount about low engagement with analytics services being a problem that product managers and data product managers, the people overseeing analytics groups, see. And a lot of it's like, well, did you get the end user involved, whoever that is, and how closely and how tightly were they involved in what you're creating? Because they're going to believe it more.

Brian:  If they understand how it's going to benefit them, or it's a collaborative process like in your example, where it's, I don't have an agenda to set it up my way, I just think my way is better, but if there's a better way, everyone's open to it when they understand that the goal, collectively, is to optimize the output of the mill. And when they can see how you're doing it, it's probably a lot easier to get them to engage with you, provide you good feedback, and feel like they're part of it. So, I think that's really cool that you guys are doing that.

Brian:  Do Larry, Moe and Jack, or whoever these operators were, move between the floor and, like, an office cubicle? Are they kind of back and forth? I'm curious about the actual experience of using these analytics that you're providing. When does that happen?

Steve:  Well, you could envision something like that: someone sitting in a small office at the facility who's in charge of managing the performance of that specific operation.

Brian:  I see.

Steve:  And then you'd have folks out on the line actually, you know, working their shift on the line day to day. But, two things, and this is another area that we'd like to ... so, they have a daily, they call it a gemba walk, where you'd go to a performance board and talk about issues and talk about your performance. We built a tool, like an electronic version of that, in Tableau, which provides some of the information. It doesn't provide all the information they typically cover; it's like a daily standup meeting for each shift. And then they also have, whether it's weekly, biweekly, or monthly, team meetings where they've begun to use data.

Steve:  So, when I talk about ... it was a setpoint, so, how we're setting up the mill where we finish it. There was a kaizen event, and then they wanted to use the sort of performance boards we've created: are we improving our performance by making this collective agreement on how we're going to set up? They did want to use that in those meetings to help assess the performance. So, they're not quite in front of all operators all the time, but there are various touchpoints, with various frequency and various depth of analytics, being presented to them.

Brian:  Got it. It sounds like you're actually working in this predictive analytics space, and you said you're changing, so I'm wondering: does the fact that you're going to be looking less rear-view and more future mean you're going to change something about how you design these solutions for the end customer, as a nature of being future-looking?

Steve:  The change for us is having a centralized corporate analytics function, working in tighter conjunction with our IT and our transformation teams. That's the big change. You know, I talk about that automotive forecasting model; I think when you look in other areas, I don't think there'll be a change. I think there's still that need to understand the need of your end user and iterate around them, orient them to analytics, whatever way, shape or form it's being presented, whatever the tool is, whatever the analysis is. I think there's still that same need, and I think there'll always be a challenge. There used to be a conversation at, you know, the managerial, C-suite level, like, we need more predictive analytics, we need more predictive analytics. Really, what you need is to be able to answer more questions with the insights you're gleaning from the data that you're collecting. Right?

Steve:  The value's in the data: what are all your valuable data assets that you're not answering questions with? How do we get people answering more questions that are valuable? And then, as you move yourself up the hierarchy from descriptive to predictive to prescriptive, yes, you'll be gleaning more value. But sometimes, for the user, a first step is just, can we show them the data and give them some insight into their day to day operations, into their world? Right? Otherwise it's like having blinders on them. Yes, eventually we'll get predictive and prescriptive, but I think for an analytics team, it's still, are you meeting the need? Are you answering the question?

Brian:  Right.

Steve:  Strategically, I mean, there's been a push, or there's been continual alignment, on making sure that we, and by we, I mean analytics, IT and transformation, are collectively working on the strategic objectives of the organization much better, multiplying the impact, sort of like one plus one plus one equals five, so we're not all working in our own silos, and we're not working on a problem that may be a problem for a single business unit when, strategically, there are higher value targets.

Brian:  Got it, got it. Are you getting, from senior management, this, we need to do predictive analytics, we need AI, we need machine learning, kind of the solution leading in front of the problem? Are you getting asked that, and then you're kind of doing a coaching about, well, what problems do we want to solve, and what might be possible with the data sets that we have? Is that an education that has to happen?

Steve:  Yeah, I think senior leadership may get ... you report out to the board, and the board's talking about, everyone's doing digital and AI, what are we doing? Let's get a strategy around digital. And what we did, and this is from the steel perspective, was start with, hey, let's look at our strategy deck or strategy slides for the fiscal year, strengths, opportunities, weaknesses, threats, all that good stuff, and let's pull out the themes we see emerging out of those, and let's look at what we're currently doing in the world of digital, right? It may not be, like I said, the Google glasses and the exoskeleton of the operator being controlled by an AI algorithm processing terabytes of data per nanosecond or whatever. But, where are we actually doing, quote unquote, digital right now, and where do we have an opportunity, short term, mid term, long term, to continue to leverage that or add it to our strategic goals? I think that conversation has helped, A, convince people that we're certainly not doing nothing, but also show, hey, there are some opportunities where, if we align strategically, we could be doing more. But, if you really want to move the needle, how do we align? We've been calling it the three-headed monster, IT, transformation and analytics: where do we better align, and then make an appropriate investment to capitalize on all that opportunity?

Brian:  Got it. Am I correct that you have a PhD in science education?

Steve:  You are correct. Thank you for that. I think one of the big changes is, I want to start making people call me doctor.

Brian:  I was curious, do you leverage anything from that background in that process of educating? Because something that you hear is, as you go up the senior ranks in a lot of businesses, especially non-technology businesses, for the board members and the senior leaders, AI is still the Terminator, the possibilities of general intelligence. They just oftentimes don't have a really good grasp of what's currently feasible, and how you might want to start with a really small problem. Like, hey, maybe we can predict when this part is going to break on this mill. It's not sexy, but we're going to start with that to show that, hey, it's an expensive part; if we can get in front of it, we can save X dollars, and it's a nice increment of technology to build that has a definitive value. Do you leverage any of your science education background to communicate upstairs some of these types of things?

Steve:  You sort of hit on the answer. I mean, outside of a background that included quantitative and qualitative research methods and techniques, statistics and whatnot, I think it's the communication piece. You know, having spent, whatever, 17 years teaching, I think of the importance that I've learned to place on communication, and it's similar to the conversation we were having earlier about that last mile, right? That last mile is being clear and concise in communication, being able to coherently map out a conversation through a presentation. You're going to walk management through, A, at the Columbus mill, what we started out with: how are we doing? How's our mill doing? And then into just some high-level associations that we're seeing between machine parameters and performance. And then next, talking about, well, here's how we think we can use some algorithms to understand those predictors of, let's say, quality issues, right?

Steve:  So, we see some associations here. Even if it's through our dashboard, we're starting to see, when this is high, we have a quality issue. But in communicating that, well, that's not a univariate relationship. You're not going to find one machine parameter related to a quality issue. We're going to need some advanced math. So, then what we're trying to do is understand the multivariate relationship between machine parameters and product parameters and operator behavior. In the end, what we're trying to do is, A, understand when we produce a quality issue, and ultimately, not produce it. So, for that communication piece, we've mapped out our roadmap: here's where we are now, here's where we're going, and here's how we hope to get there. To your point of what do I use, I think it's really that communication, being able to be clear and concise, whether it's through a strategy slide deck of where we're going with this project, and I think the same thing applies to a visualization.
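To make that multivariate step concrete, here is a minimal sketch, not the team's actual model, of fitting a single classifier across many machine parameters at once. The file and column names (coil_runs.csv, coil_id, quality_issue) are hypothetical:

# Multivariate quality-issue sketch over an assumed per-coil dataset.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

runs = pd.read_csv("coil_runs.csv")   # one row per coil: parameters plus outcome
X = runs.drop(columns=["coil_id", "quality_issue"])
y = runs["quality_issue"]             # 1 if the coil had a defect, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)

print("holdout accuracy:", model.score(X_test, y_test))

# No single parameter predicts a defect, so rank their joint influence instead.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(10))

Feature importances here are a conversation starter with operations, not a causal claim about the mill.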

Steve:  It's real easy to throw up some work that you've done in Tableau around a question that a manager or an executive had. It's real easy to do that. It's really difficult to do that well and have some control of the conversation, being able to say, here's what we did, here was the question, here's the data we used, here's how we analyzed it, and here's the suggestion we're making, and now let's talk about why, and do that in a way that doesn't lead to an in-the-weeds session and frustration. I think that is ... and being very cognizant of how much planning and attention that takes before the meeting. Which is similar to another point you made, around how do you get better and how do you understand the needs of the user: understanding how important that conversation is and asking more questions. I haven't done a good job of this in this interview, but asking more questions and doing more listening before you do anything is a similar lesson that I've learned. When you're trying to teach someone something, it's really easy, as a teacher, for me to tell someone something while having no understanding of what that person brings to the table as far as an understanding, or asking how they're making sense of it …

Brian:  Sure.

Steve:  It's a really long answer to that question. Did I get that...

Brian:  No, that's interesting, the communication thing. I think you're right about the listening part. When we talk about doing user experience research, sometimes we talk about listening. When you're doing qualitative interviews with people, it's heavily about listening, and when you're doing usability testing on a prototype or a design, it's actually a lot about watching what people actually do and seeing it. But the point is that you're taking in a lot more information than you're putting out, and you're really there to feed the questions and try to get them to expose what's going on in their head. Because the problems are not always explicitly stated; sometimes they're not using the same language for it. So, I feel like our job, as designers and people that create ... theoretically, we're creating solutions, right? If we're creating solutions, that means that we have fallen in love with those problems and we can articulate them very clearly, and that we have a model of what's in their head, more than our own model.

Brian:  You're not alone in that, if you've had that experience where maybe you've built first and asked questions second. I think a lot of software has been built that way in the past. While iteration cycles are shorter and it's possible to get working prototypes going a lot more quickly, you also have the temptation to not want to rebuild. No one wants to rebuild, right? So, no one wants to go back and work on it again. So, you tend to fall in love a little bit with your own stuff. There's a natural ... I mean, it happens even with me with design. I try to stay low fidelity early in order to be able to throw things away that aren't right, because the higher the fidelity you go early, the more you're not going to really want to redo it. So, you have to be aware of that tension. But yeah, I think it's cool that you guys are going out and trying to really get to the heart of the problems before you deliver answers and solutions and software and reports and things like that. So, that's great.

Steve:  No, I like your point about falling in love with their problem, with their issue, as opposed to falling in love with the first solution you provide them. And I think at the heart, as with most things, in fact, what do I use and what did I learn from my previous experience? It really comes back to trust, right? That they trust that you are there to solve their problem. You can see it when you present. I'm just thinking, we tend to interact with, hey, here's Steve, I'm going to show up and I'm going to show you what I've built for you so far, and there are maybe anywhere from one to six people in the room, and you just get that, eh, yeah, what's that again? You don't want them to be hesitant about saying, I don't get it, this isn't working for me, or this isn't right.

Steve:  You want to have that open dialogue and have them trust that you're there for them, and have them feel comfortable saying, no, what I really want to get at is this. And having that ... which I think can be challenging, particularly until you've established a relationship. Some of these folks, you start working with them and they start understanding, yes, I'm here to work with you, I'm absolutely here to do everything I can to make your life easier and make you a more valuable employee, help you deal with the stuff that your subject matter expertise should be used for. I think until you get there, and that's a great point, you need to fall in love with their problem, go low fidelity, and be able to show them you're willing to make changes or do whatever it takes to get to a solution that they feel is aligned with what they were envisioning, asking for, and that solves the problem.

Brian:  Right. Cool, man. This has been a lot of fun. Is there a takeaway from working in your industry, and specifically in your focus on analytics, something that you would feel is a key takeaway from thinking about this last mile? Any guidance or wisdom that's come from your experiences at Worthington?

Steve:  I'm not sure it's wisdom. Again, when I think of it, I think, well, it's obvious, right? It's always obvious once you've learned the lesson the hard way: it's really about the people. It's really about who that person is that you're trying to help and empower. It's about them, it's not about me; it's about me understanding what their need is, and in the end, if they don't use it, it's not worth anything. It's actually worth less than nothing, because I just spent a lot of my time, energy and resources putting it together. So, I think it really comes down to, and I'm sure that gets emphasized by some, it's really about the people. If you're going to be delivering the type of solutions we're talking about, or having that customer-facing view, no matter what you're building, you'd better be able to understand the people and their needs, and keep them in consideration with whatever you're doing.

Brian:  Right. I think that was a great closing point, and it's something I would reiterate: there's math, there's tech, there's IT, there are resources, there's a lot of stuff that goes into ingesting data from systems and, theoretically, spitting out insights on the other end. If that last mile is not there and no one is engaging with the service, it doesn't matter. And more specifically, as you said, it's not just that you batted zero, you actually kind of batted negative. You're not creating commercial software products, you're not a SaaS; these are internal tools, but what else could you have been doing with that time? It's actually a negative return. So, it's really important to get this right and to make sure that there's engagement, meaningful engagement, right? Actual decision support, and that you're measuring it, because otherwise you're actually negative, not zero. It's a cost; it's now a cost center instead of ... probably you're supposed to be saving money or helping make money, but definitely not losing it.

Steve:  Right. Absolutely.

Brian:  So, I think that's a really good closing thought. But yeah, this has been great. So, I'm curious, do you have a website? Do you have Twitter? Are you on LinkedIn? Is there a place people could learn more about your work?

Steve:  You can find me on LinkedIn. I do have a Twitter. I have a very minimal Twitter presence. I'm @OlDirtyBarGraph.

Brian:  That's great. Old Dirty Bar Graph.

Steve:  Old Dirty Bar Graph.

Brian:  Cool. Well, I'll throw your LinkedIn and Twitter handle into the show notes so people can access those. This is Steve Bartos, again, from Worthington Steel; he manages the predictive analytics group out in Ohio. This has been super fun. Thanks for coming on and telling us a little bit about the world of steel and analytics.

Steve:  It's been a pleasure. Hopefully people learned as much from me as I learned from you during this hour.

Brian:  Cool. Well, awesome. I look forward to chatting with you again soon.

Steve:  All right. Thank you.

Brian:  Okay.

