Ganes Kesari is the co-founder and head of analytics and AI labs at Gramener, a software company that helps organizations tell more effective stories with their data through robust visualizations. He’s also an advisor, public speaker, and author who talks about AI in plain English so that a general audience can understand it. Prior to founding Gramener, Ganes worked at companies like Cognizant, Birlasoft, and HCL Technologies serving in various management and analyst roles.
Join Ganes and me as we talk about how design, as a core competency, has enabled Gramener’s analytics and machine learning work to produce better value for clients. We also touched on:
- Why Ganes believes the gap between the business and data analytics organizations is getting smaller
- How AI (and some other buzzwords) are encouraging more and more investments in understanding data
- Ganes’ opinions about the “analytics translator” role
- How companies might think they are unique for not using “traditional agile”—when in fact that’s what everyone is doing
- Ganes’ thoughts on the similarities of use cases across verticals and the rise of verticalized deep data science solutions
- Why Ganes believes organizations are increasingly asking for repeatable data science solutions
- The pivotal role that empathy plays in convincing someone to use your software or data model
- How Ganes’ team approaches client requests for data science projects, the process they follow to identify use cases for AI, and how they use AI to identify the biggest business problem that can be solved
- What Ganes believes practitioners should consider when moving data projects forward at their organizations
Resources and Links
Ganes Kesari on Twitter: @Kesaritweets
Ganes Kesari on LinkedIn: https://www.linkedin.com/in/ganes-kesari/
Quotes from Today’s Episode
“People tend to have some in-house analytics capability. They're reaching out for design. Then it's more of where people feel that the adoption hasn't happened. They have that algorithm but no one understands its use. And then they buy licenses for some exploratory visualization tools and they try their hand at it and they've figured out that it probably needs a lot more than some cute charts or some dashboards. It can't be an afterthought. That’s when they reach out.” — Ganes
“Now a lot more enquiries, a lot more engagements are happening centrally at the enterprise level where they have realized the need for data science and they want to run it centrally so it's no longer isolated silos.” — Ganes
“I see that this is a slightly broader movement where people are understanding the value of data and they see that it is something that they can't avoid or they can't prioritize it lower anymore.” — Ganes
“While we have done a few hundred consulting engagements and helped with bespoke solutions, there is still an element of commonality. So that's where we abstracted some of those common technology requirements and common solutions into our platform.” — Ganes
“My general perception is that most data science and analytics firms don't think about design as a core competency or part of analytics and data science—at least not beyond perhaps data visualization.” —Brian
“I was in a LinkedIn conversation today about this and some comments that Tom Davenport had made on this show a couple of episodes ago. He was talking about how we need this type of role that goes out and understands how data is used and how systems and software are used such that we can better align the solutions with what people are doing. And I was like, 'amen.' That's actually not a new role though; it's what good designers do!” — Brian
Brian: Hey everyone, welcome back. Time for Experiencing Data again. And today I have Ganes Kesari on the line. Ganes is the co-founder and head of analytics at Gramener, which is a consulting firm based partly in India, but also in New Jersey. Is that right, Ganes?
Ganes: That's right, Brian.
Brian: Cool. How's it going?
Ganes: I'm doing good, how are you?
Brian: I'm doing great. And are you guys in other places besides New Jersey and India?
Ganes: We're in Singapore as well, so that's the third geography.
Brian: Okay, got it. So you guys are all over the world spreading the analytics and data science love around the planet. So you guys are a consulting firm, correct? You help companies with analytics and data science projects, products, solutions, correct?
Ganes: That's right, yeah. We primarily focus on insights and storytelling. So identifying insights using machine learning and analytics, and converting them into engaging data stories using information design and the concepts of storytelling.
Brian: Got it. So if I read your background correctly, you have an MBA, you're an engineer by training, but you're working at a company that blends design, storytelling, and then obviously the data side, the AI and analytics work. So how did that come to be?
Ganes: Okay, so that's a great question. I've been an engineer by training and by profession, and we started Gramener about 10 years back; by the way, it's our 10th anniversary next year. When we started Gramener, the focus was more on consumption of data, because that's where we saw the biggest gap in the market. People were already comfortable with looking at data and the early stages of analysis, but whatever analysis was done, they had trouble making decisions with it, or even understanding it. So that's what we wanted to focus on. And when you're talking about consumption of data, then design and understanding the user is part of it. So as part of the journey we trained ourselves on information design and user experience design. And at some point I set up the design team for Gramener and was heading it.
Ganes: So, an engineer running design: I was building that side for a couple of years before I handed it over to one of the other co-founders. That's how I got into design, and I've been very passionate about addressing this challenge of how to bring in data to provide solutions to the business. That's where I've seen these twin objectives, insights and communicating using design, as very powerful and empowering when we talk to users. We've trained others, and since we see that the reception is good, we take this approach.
Brian: Got it. So I'm going to make a broad generalization here. I mean, I have talked to a lot of people in your shoes that are running firms like this, but my general perception is that most data science and analytics firms don't think about design as a core competency or part of analytics and data science, at least not beyond perhaps data visualization. Which is, as I call these altitudes, the lowest altitude, kind of the surface level.
Brian: So is there something that prompted you guys to feel like, hey, this isn't an extra thing that you glom on at the end, it's actually part of the core of the service that we provide, and something customers actually really value. They actually get a lot more value out of the technical work we do because it's presented properly. Like, was there a light bulb moment here that happened? Your situation is abnormal to me, and maybe I'm wrong about that, but just in my experience it's an abnormality.
Ganes: Yeah, you're right. Not many organizations who focus on analytics combine visualization or that form of thinking. We didn't have a light bulb moment. We initially started off with energy analytics in the very, very early days, and we had come up with an algorithm to work on energy consumption and analytics in that space. What we found was that while the analytics was good, it didn't work as much. And when we looked at why people were not able to connect to it, that's when we stumbled upon the data consumption and visualization piece. And then eventually when we spoke to some people and asked, “Is this really a problem for you?” a lot of people immediately responded, and that was actually a pivoting moment for Gramener. We moved away from the energy analytics stack, and we moved fully into visualization with analytics very closely tied to that.
Brian: Got it. And so do you have clients that come to you that want to pull this apart and say, well, we don't really need that, we just need a predictive model for X. Can you build us an AI for that? But we don't really need that design stuff. Like do you ever get people trying to kind of cherry pick out your process and you know they don't want you to prescribe what's necessary, they want to tell you what they need. Do you have to go through that sometimes?
Ganes: Interestingly, there are some cases, but in the large majority, when clients come to us, they want to do something with data. They come with very broad, vague statements, saying, we want to get started on our data science journey, or, we're having a problem with our sales function and we want to use data to solve it. That's how a majority of clients come to us, with those kinds of problems. And with that we have the full flexibility of starting with a consulting exercise, identifying the problem, and then going the full arc. The other scenario, like you mentioned: there are clients who think they are already doing enough with Tableau or other visualization tools and have the in-house capabilities, and they are lacking only the machine learning, or they need to expedite their AI, the build phase. There are some clients who come to us and specifically ask for just analytics.
Ganes: But I would say they're a minority. Still, there are organizations who have those needs, and vice versa: there are clients who also come and say, “We have an in-house team of data scientists who are very good at statistics. Can you help us only with the design? Because that's one area where we are lacking, and we see a lot of good solutions on your website.” Both these cases are a minority, but they do happen.
Brian: Got it. So is the reason those people pick up the phone, because, you know, you said they have talented technical teams in-house, but are the symptoms like, no one's using our stuff, we've spent all this time building some product or output or tool or whatever, a model for something, and it's not getting used or it's not understood? Like, what's the symptom that makes them feel they need your help, especially on the design side?
Ganes: Mmhm. So people tend to have some in-house analytics capability, and they're reaching out for design. Then it's more of what you're saying, Brian, where people feel that the adoption hasn't happened. They have that algorithm, but no one understands the use of it. And then they buy licenses for some exploratory visualization tools, they try their hand at it, and they've figured out that it probably needs a lot more than some cute charts or some dashboards. It can't be an afterthought. And that's when they reach out and say, “Can you help us connect to the users? Can you help us bring the user to our solution? Is there some other way that we can present this and kind of build a workflow?” So that's one of the strong motives: people finally realize that analytics alone isn't sufficient.
Brian: Yeah. And typically do you interface with this business stakeholder or is it more the technical team that's not seeing, they're not getting the results, they're not delivering the results the business is expecting and so they're trying to you know raise the quality of their work? Or is it more of a business person saying, man, our data science, we have these PhDs and they're really smart but we don't understand. Like how does it come into you guys? Like and how do you pull that together? If the answer is both audiences then are you pulling these groups together and how does that work?
Ganes: Interestingly, a few years back, when you're talking about five, six years back, a lot of those enquiries used to come from the business teams rather than IT or the broader technology teams. At that time I don't think we had much of the CDO role; it was either IT or the business, and it was mostly the business side who used to reach out to us and say that they have a hard time getting what they want from their technology department.
Ganes: So: can you build it specifically for us, and it can be only for the business unit. That used to happen a lot. And of late, with the data wave pretty much impacting every organization and every department, organizations have set up the chief data officer role and they have several functions around data. And now a lot more enquiries, a lot more engagements are happening centrally at the enterprise level, where they have realized the need for data science and they want to run it centrally, so it's no longer isolated silos. We are seeing a combination, but increasingly more at the organizational level and from the technology organization's side.
Brian: Is the gap between the business and the data science analytics organizations getting smaller in your opinion then as well? Is that what you're saying?
Ganes: Yes. I think if you compare the early-stage conversations of five, six years back to now, there are definitely more conversations happening. The gap is smaller, at least in some of the major organizations who made an early start. But if you look at the industry as a whole, there's still a lot of ground to cover. For those who have started, though, I see a difference.
Brian: Do you think the organizations that have a lot of ground to cover, do you think everybody in that space has to go through like a maturation process? Like, you've got to feel the pain, you've got to spend a bunch of time building bad stuff, and then eventually you'll get tired of it, or your business colleagues will get tired of not getting what they want, and then the light bulb goes on. Or do you think, culturally, there's something changing at a broader level in terms of awareness about the importance of this? Or is it still a passage you have to go through?
Ganes: Yeah, I think it's broader now.
Ganes: Maybe part of the hype around AI where people suddenly have started talking about some of the cool stuff. How much value it actually delivers to the organization is still under question. There are a subset of organizations who have been able to manage and use it well, but thanks to some of those, the broader movement earlier it was big data and now AI and some of these buzzwords are prompting organizations to put in money. And also there are organizations who have genuinely seen some differentiation by adopting data. So there are some leaders who have been able to achieve it in the market. So naturally there are others who want to emulate and I see that this is a slightly broader movement where people are understanding the value of data and they see that it is something that they can't avoid or they can't prioritize it lower anymore. So I think it is starting to … that's where I see that when I mentioned that enterprise wide and technology organizations are taking it up which means that there's a mandate given to them and there is a budget being allocated there. So it's more of that kind of a case.
Brian: Got it. My perception: in fact, I was in a LinkedIn conversation today about this and some comments that Tom Davenport had made on this show a couple of episodes ago. He was talking very much about how we need this type of role, a person that goes out and understands how data is used and how systems and software are used, such that we can better align the solutions with what people are doing. And I was like, amen. And by the way, that role has existed for about 10 to 20 years in the digital native space. Usually this is the realm of product design, and then underneath that you might have planners and UX designers and design researchers and usability engineering. There are all these kinds of sub-domains there.
Brian: My general feeling is that it's almost like the data world isn't even aware of this. And another thing is this analytics translator role, which, in my understanding, I haven't had a lot of direct exposure to, but just in my research and study of it, that's also reinventing the wheel of what is known as a technical product manager in a software company.
Brian: And the irony here being that it feels like a lot of the non-digital native companies are worried about the digital native companies, yet they're not picking up the practices of the digital native companies, who have already solved these problems. There are already people trained in this. Like, you can't hire an analytics translator? Well, maybe you should change the job title to data product manager, because there are a lot of great product managers out there. Otherwise it's reinventing the wheel, and then you've got to go through that entire journey and skill set all over again.
Brian: Sometimes I feel like maybe I'm like this outsider with this weird perception, but do you see this as well at all? Like some of this exists already. We can learn from what you know other industries have done and our competitors and there's a way to fast forward a little bit here or I don't know. What's your opinion?
Ganes: Yeah. And firstly, I loved the podcast with Tom Davenport.
Brian: Oh thank you.
Ganes: I listened to it, and yeah, it was good stuff. And I've read a lot of Tom's articles and some of his books. So a lot of the stuff he mentioned that day, I completely relate to and would totally vouch for. And coming to your question: that's a great point, and that's something I've been telling people, that we don't have to redo all of this stuff. There's a lot of similarity with how technology adoption worked, say, 10 years back, like you're saying with digital natives. There are lots of similarities, whether it's the data product manager or technical product manager role, or in terms of change management: how do you make sure an innovation in technology, which earlier could have been ERP or any of these other systems, gets implemented organization-wide? How did they get people onto email? How do you get people onto some of those technology solutions?
Ganes: So there's a lot to learn. We don't have to redo all of this. So I completely agree. And if you look at some of those concepts, titles, and frameworks, they are being repackaged. There is some adaptation needed for the data work, without doubt, because when you're talking about the volumes and the speed of operation, there are certain things you'll have to account for and adapt to. But broadly, if you look at the concept and intent, there's a lot of similarity, and a lot to learn from. So you're not alone on that. That's my view too.
Brian: Yeah. I appreciate that, because sometimes I'm like, man, am I on Mars here? But it's good to know, and I understand every place has its own ... it's just like talking about agile in software development. Every client I've ever gone to, the first thing they tell me, I mean, they want to talk about process, and I'm not a process person: “Oh, we're using agile, but it's not like traditional agile, it's this other thing.” I'm like, guess what? That's normal, and everyone is using their own version of it. And really what's important there is, is there any agility happening regardless of the process you're using? The spirit and the intent of it is what's important, and it's natural, I think, to adjust these things to your culture, the speed at which you can move, and how much risk is involved in the work you're doing. There are a lot of different factors there. So I think that's natural.
Ganes: Yeah. So, since you do a lot of that consulting and helping organizations with use cases, those who are getting started, and you explain these processes and show them how to get started, have you heard the feedback, “Hey, by the way, this is very familiar to what we did for our digital initiative”? Have people come back with that feedback?
Brian: No. I mean most of the time the work that I'm doing with clients, for most of them, it tends to be places that I would say are not mature in their design capacities.
Brian: So it usually comes across like they've never done this before. They understand the value once they've participated in the journey; going through it, they start to understand why we do it. But it's not obvious and it's not normal to them. I would say it's typically new, and that's partly probably why they're calling, right? It's because the way they're doing it now is not working.
Brian: You know...so... But I'm one data point there as well.
Brian: Looking back like you know you've been at Gramener for, did you say 10 years now, is that right?
Ganes: Coming up on 10.

Brian: Is there like one main thing that you would change, about either the service offerings or your approach or anything, having 10 years of experience now? You've lived through all the buzzword changes for our market, from BI to big data; whenever the hype changes, you're quick to change the marketing.
Ganes: Yeah, we have to keep up, or else we need to run a new campaign.
Brian: Right. Is there something you would change?
Ganes: You mean at Gramener the way we've approached things?
Brian: Yeah, like either skill sets or I would have hired more of these people. I would have built a team to do this. I would have, I don't know, just something that you're like, well we're stuck here, it was a learning moment, wouldn't have done that that way if I had another chance.
Ganes: Yeah, we always have areas of improvement, don't we all? So one thing I would say is we've seen a lot of similarity across these problems. While we have done a few hundred consulting engagements and helped with bespoke solutions, there is still an element of commonality. So that's where we abstracted some of those common technology requirements and common solutions into our platform, which we use as an accelerator. When clients come to us, we don't have to start from scratch; we are able to use that, and we have certain libraries of modules and certain ways of doing analysis. All of those are abstracted to some extent into the platform. So that helps us deliver solutions quicker and reuse stuff.
Ganes: What we think would also be useful is, across industries, when we talk about domain solutions within an industry, let's say banking, there's a lot of commonality across organizations in certain use cases. Whether you're talking about retail banking or investment banking, there's good scope for coming up with deep data science solutions which are verticalized, which address certain repeatable workflows, and where we can take the usability of out-of-the-box solutions even higher. That, I think, we could have probably focused on a little bit more, because earlier, when we started, we looked at every problem as unique, and over time we've seen the commonality.
Ganes: So maybe if we had looked at a specific domain use case or a specific solution, that might have been useful for a lot more clients. There is usually a time-to-market question, even if it is one month as opposed to six months to take something to market; not everyone might have that one month. So if you have something out of the box which is customizable within, say, a week, that's probably more powerful for clients, a better offering, and something even more compelling to offer as an organization. So I think identifying that mix, the product roadmap, the product mix, would have been even better.
Brian: Yeah. Not to pull our podcast into consulting talk or whatever, but I would tend to agree with that. I've also tried to come up with more reusable recipes for the work that I do, especially when there's repeated work, so that, you know, by you and me each having our own processes, we can accelerate and anticipate some of the work that's needed to help our clients move more quickly.
Brian: As a designer, I used to think, oh, you can't productize that. You can't systematize that. And I've really changed my own perception about that; there's definitely a process I go through, I just hadn't really written it down and followed it, and it really helps accelerate certain things. So anyhow, I'm with you there.
Ganes: Yeah. You're also at the perfect intersection of design and engineering.
Ganes: So I think we tend to agree on a lot of things.
Brian: You would not want to see my code. I wrote some code the other day for my website. Oh man. At least I know it's bad, though; I'm aware enough that I can look at it and say, that is not how one should do that, and it's good that no one's paying me to do this anymore. I'll call you guys if I need code. This actually, though, kind of ties into my next question. So we talk about data storytelling here, and when I hear this I always think there's kind of a branch point, right? It's one thing to do storytelling in the format of, hey, we did this project, it was an analysis, there's going to be a report, and then we'll have some kind of readout about what we learned. It's generally static, maybe there's a PowerPoint or a write-up or something, and it's very controllable from a design and delivery perspective; you can really control the story.
Brian: On the other branch is software, right? When you're building a data product, the storytelling is more likely implicit, right? In the interface, in the experience that has been designed, the story is implicit. And to me that's a bigger challenge. And I was curious: do you see it that way, that you're trying to bury the storytelling into the experience of using an application? And do you approach that differently than when you're creating an ad hoc, one-time kind of story and report? Like, how do you think about that?
Ganes: If I understand the question right, you're talking about two scenarios. One is an ad hoc story you create one time, so you have more flexibility in terms of how the narrative flows and the kind of cues, visuals, and interactivity you use, versus something repeatable which refreshes with data and is probably used for a few months, if not a few years.
Brian: Yeah. Like a living... you know, a tool for the sales team or something, right? It's a living application, or it's integrated in the CRM or whatever. There's no literal storytelling. I mean, I suppose you probably could design that into the experience, but you probably wouldn't, for a number of reasons. But some of the themes are the same, right? There's a conclusion. Typically, at least in my work, I'm always trying to help clients surface conclusions and/or opinions from the data first, and then gradually expose more detail and evidence to back those up as needed, but not lead with evidence and then ask the customer to put together the conclusion: what is the action I'm supposed to take here? And I was just curious if you guys have a... how do you think about that? Because we obviously can't do a custom, handheld story in a dynamic system like that. So are they two different things? Maybe they're not both storytelling; maybe they're just two separate things. But I was curious if you even think about them that way.
Ganes: So that's the bulk of the work we do. When you talk about storytelling and information design, the visual part of it, that's closer to what we do, because organizations approach us when they want something repeatable, not some ad hoc report they can manage with some of these self-service data discovery tools. Let's take the example of the sales function, and they say, I want to enable my frontline sales team to identify the right plans and basically sell better. So how do I come up with a visual narrative which people can use to answer those common questions? Then we create a repeatable flow. The way we approach it is to look at the users and study the requirements, very similar to user-centered design principles: start with that, find out in what scenarios they would want to use the application and what kind of information they want.
Ganes: So once we do that, the persona and the identification of the requirements and all of it, the way we build a story, or you could say a workflow, into this is: where do they need to start? What are some of the starting-point scenarios, and how do you customize it to the role? With that, we define a sequence of views. For example, they start with a summary of how today looks. Then they get into answering specific questions, like which customers should I target today. That gets into a more customer-specific view, which maybe pulls in results from a recommendation engine and presents them. And then the next question is, assuming the salesperson has started their day and looked at which customers to target, they want to go into one or two customers before they make that final call.
Ganes: So then they get into a customer deep-dive view, which covers the complete relationship of the organization with this single client: what have they sold in the past, what is the status of the relationship, what did the customer buy, and what are they likely to buy? We bring in all of those analytics and present that view. What I'm trying to get to is that there is a natural workflow, a certain set of questions, and if we can identify how we want to help them and the process or workflow they should adopt, then we will be able to come up with views which are standard, which can be used repeatedly. And at the same time they are not disparate dashboards, where this is a customer view and then you go to this tab which has a different one, because people don't tend to use applications or technologies that way.
Ganes: So if you embed all of this together, it becomes a naturally customized application for that role, for that individual. And so that's the combination: storytelling for an individual, where you bring in all the aspects of data science. This is definitely different from an ad hoc, one-time story, where you have even greater control because you already know the conclusion; you just start with the conclusion, and maybe you go even more specific. When you're looking at a repeatable one, there are some elements of the storytelling you bring in, but given that the data is likely to change and the scenarios will be different tomorrow, you bring in certain broad elements, devise that workflow, and bring in only those elements and that interactivity. So that's the difference.
Brian: Yes. No, I understand what you're saying, and one thing I would reiterate here, since you used this sales decisions example to make the point, is that ultimately the core data science part there is an ingredient in this cake that you just built. You just spelled out how to bake a cake, right? But that might just be the flour. And it might be the most important part, arguably where the best IP is, which is, you know, the recommendation of which customers to call. But that's not an experience, and that may not be what actually gets the salesperson to pick up the phone. Like in your case, maybe they need to dive into the CRM, and they need a link to the CRM to go look at past behavior and other stuff. Maybe that's already there, in the CRM or in some other BI tool, but it's that linkage from the recommendation you gave to something that is part of their natural workflow, which is: I'm never going to just call a client.
Brian: Even if you tell me, "hey, we have an 82% predicted probability that this person will buy something, here's the phone number," they're like, "there's no freaking way I'm calling that guy without going and seeing when we talked to him last, what he bought last, what the sentiment is like." There are all these other things that are part of that salesperson's journey before they actually dial a phone number. So you can get all the data science right, but it's about having empathy for that person's job. This man or woman, whoever is doing the selling, has a workflow they want to go through, and that's what I liked about what you spelled out: you can nail the data science part, but there's this whole workflow that you want to support. Whether you do that work or not, you should at least acknowledge that it may be the linchpin in getting someone to use your software or your application or your model or whatever. There's more to it than just the data science piece. I don't know, did I summarize that?
Ganes: Yeah. One part, just to clarify: in addition to the data science and the algorithm piece, the flow I explained was a visualization-driven flow, where a person going through this workflow is imagining a visual application. That's how we usually work with our clients. We deliver all of this as an interactive user interface. It can be embedded onto the CRM they use, for instance Salesforce or any other CRM, with the workflow built into that. Instead of looking at a set of tables or separate screens, all of these are visual dashboards with that interactivity built in, and the algorithm outputs are automatically embedded as part of that.
Brian: Sure. And my point isn't so much about how many screens or artifacts there are or which system they live in. The point is that your approach designs a solution around the activities that a salesperson does. At a minimum you're meeting the requirement, but ideally you find a way to actually delight them with an experience they didn't even know was possible. Analytics and data can help us do that. It can also make things harder, and sometimes worse, when we shovel too much irrelevant information at the wrong time. So yeah, I get what you're saying there.
Brian: Do you involve design in the algorithm design itself? When you're working on advanced analytics and predictive modeling and things like this, do you have your designers involved in thinking about who these customers are, and then helping with the data collection: what is the right data, should we be factoring this data in, and why aren't we factoring in X, Y, and Z as well if we're going to build a model on this, because, hey, we've talked to these people and this is clearly part of their decision making? Do you ever get them involved on that side, working with the data science group really upstream?
Ganes: Yes. Always we do the data discovery phase before we get into the solution building.
Brian: Uh huh.
Ganes: So we usually start off with an understanding of the business problem, the challenges that the stakeholders are facing. Soon after, we dive into the data: if these are the problems and these are the questions we want answered, what data do you have at hand already? We do a data landscape study to find out what internal data exists. Sometimes we find data the client doesn't have yet but needs to collect in the future; those become future data engineering projects. They might only arrive a couple of months down the line, but they're useful for answering an important business question. So that is a useful by-product of the discovery.
Ganes: Apart from that, there could be external sources which organizations aren't aware of. We often come across cases where, for example, if you're running a digital marketing initiative, there are a lot of public data streams available from Google and other sources that you can connect to via an API and pull in, which most organizations are not aware of. Identifying internal and external data sources, understanding the landscape, and finding out what use we can put the data to is something we get involved in, because to really solve the problem, and to solve it in an effective way, there could be many routes to the final objective. So data discovery is very much a part of it. We have the team involved, and we have active discussions with the business stakeholders, because often, within organizations, business users have no clue about what data is available.
Ganes: That's probably the technology, the IT teams domain and when we have a common discretion we need both of them together and say that this is your problem, this is the data that you need to solve it and then have a mutual conversation. So that's when they also start asking questions in terms of how can we use this better. So that we find is very useful to come up with the corrective and a powerful solution.
Brian: Got it. One last question here as we get toward the hour; this has been great, by the way. Really glad to have you on the show to share some of these thoughts. It seems like today, companies, especially on the business and product side, need help identifying the right types of data science projects. They're coming in the door wanting to do machine learning or AI, not really knowing what that is or where it's applicable, and then you have to unpack that, right, get it back to a problem, and then work it forward. That's an assumption on my part: do you actually see this problem coming in the door, and do you have a process you use to help clients figure out whether there's a reason to use this tool or not, and why?
Ganes: Mmhm. Yeah. Well, that is usually the first conversation we have. What we see in the industry is that wherever people have internal data science teams or some capability, say a few data scientists hired, the problem is that they find something interesting, go off after it, and may not really solve a business challenge. For example, have you heard of OpenAI's GPT-2, the text-generating model? It can write almost as well as a journalist. It has been made public along with the model parameters, so a data scientist who comes across it can pick it up and use it to create newsletters for clients or write internal content for employees. But is that really important for the organization? Maybe not.
Ganes: So that's why, when clients approach us, we do not go with what is interesting or even urgent to some individuals; we try to find out what is the biggest business problem to be solved. Where can the organization get the maximum impact in terms of revenue or cost savings? We help the organization identify that, and that becomes a potential project. We actually identify a list of such initiatives and discuss them with the senior folks in the room; executives are needed for those discussions, to prioritize which projects they should take on right now, three months, or six months down the line. We help them build a data science roadmap by looking at factors such as business impact, urgency, and feasibility: do you have the data, and do you have the budget and technology for it? A combination of these factors picks the right projects and builds the roadmap. That's something we always do for our clients.
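The roadmap exercise Ganes describes could be sketched as a simple weighted scoring over candidate projects. This is purely illustrative: the project names, scores, and weights below are hypothetical, not Gramener's actual method.

```python
# Hypothetical sketch: rank candidate data science projects by a weighted
# combination of business impact, urgency, and feasibility (data, budget,
# technology readiness), then order them into a roadmap. All values illustrative.
from dataclasses import dataclass


@dataclass
class Project:
    name: str
    impact: float       # expected revenue / cost-savings benefit, 0-10
    urgency: float      # how soon the business needs it, 0-10
    feasibility: float  # data + budget + technology readiness, 0-10


def priority_score(p: Project,
                   w_impact: float = 0.5,
                   w_urgency: float = 0.2,
                   w_feasibility: float = 0.3) -> float:
    """Weighted score; a higher score means schedule sooner on the roadmap."""
    return (w_impact * p.impact
            + w_urgency * p.urgency
            + w_feasibility * p.feasibility)


def build_roadmap(projects: list) -> list:
    """Sort candidate projects from highest to lowest priority."""
    return sorted(projects, key=priority_score, reverse=True)


candidates = [
    Project("Churn prediction", impact=9, urgency=6, feasibility=7),
    Project("Newsletter text generation", impact=3, urgency=2, feasibility=9),
    Project("Demand forecasting", impact=8, urgency=8, feasibility=5),
]
roadmap = build_roadmap(candidates)
print([p.name for p in roadmap])
```

Note how the high-feasibility but low-impact newsletter project (the GPT-2 example above) lands last: feasibility alone doesn't justify a project if the business impact isn't there.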
Brian: Excellent. I love it. I like that you align that to the business objectives rather than just backing it out of "where do we have some data? Okay, throw a hammer at that," because again, you can create a solution to no one's problem. And at that point we're back to rehearsals again: another nice rehearsal, but no performance. I don't know, music analogy there. At some point you've got to go to the gig and play the concert, and people are paying money to see it, so it's got to produce something.
Ganes: Yeah. Enough of pilots, let's productionize something.
Brian: Yeah, exactly. Cool. Well thanks for coming on the show. Do you have any final takeaways that you'd like to share with the audience? It's been great having you and again, this is Ganes Kesari from Gramener, but any final words?
Ganes: Yeah, sure. Thanks for having me, Brian. I would emphasize that there is a lot of attention and investment going into data. Practitioners and people who are interested in starting projects should look for the right projects to get started with, the most impactful ones, and also focus on the user aspects and adoption, to make sure that not just the data science team or the stakeholders but the final users can benefit from it and adopt it seamlessly. That needs a concerted effort, and it needs to be planned upfront. That's been our learning from doing all these consulting engagements over the past decade. I hope there are some takeaways there, and thanks again for having me, Brian, it's a pleasure talking to you.
Brian: Yeah, it's been super fun. And just last thing here, where can people follow you? Is it at Gramener.com or what's the URL?
Ganes: Yeah, the URL is Gramener.com, G-R-A-M-E-N-E-R dot com, and I'm active on LinkedIn and Twitter as well. So do reach out to me and I'll answer any questions you have.
Brian: Great. I will definitely put your, what's your Twitter handle?
Ganes: It's @Kesaritweets.
Brian: Kesari tweets.
Ganes: Yeah, K-E-S-A-R-I tweets.
Brian: Awesome. Cool. I will put those in the show links. And again, thanks for coming on the show. It's been super fun.