Building a SaaS business that focuses on building a research tool, more than building a data product, is how Jonathan Kay, CEO and Co-Founder of Apptopia, frames his company’s work. Jonathan and I worked together when Apptopia pivoted from its prior business into a mobile intelligence platform for brands. Part of the reason I wanted Jonathan to talk to you all is that I knew he would strip away all the easy-to-see shine and varnish from their success and get really candid about what worked…and what hasn’t…during their journey to turn a data product into a successful SaaS business. So get ready: Jonathan is going to reveal the very curvy line that Apptopia has taken to get where it is today.
In this episode, Jonathan also describes one of the core product design frameworks that Apptopia is currently using to help deliver actionable insights to their customers. For Jonathan, Apptopia’s research-centric approach changes the ways in which their customers can interact with data and is helping eliminate the lull between “the why” and “the actioning” with data.
Here are some of the key parts of the interview:
- An introduction to Apptopia and how they serve brands in the world of mobile app data (00:36)
- The current UX gaps that Apptopia is working to fill (03:32)
- How Apptopia balances flexibility with ease-of-use (06:22)
- How Apptopia establishes the boundaries of its product when it’s just one part of a user’s overall workflow (10:06)
- The challenge of “low use, low trust” and getting “non-data” people to act (13:45)
- Developing strong conclusions and opinions and presenting them to customers (18:10)
- How Apptopia’s product design process has evolved when working with data, particularly at the UI level (21:30)
- The relationship between Apptopia’s buyers and the users of the product, and how they balance the two (24:45)
- Jonathan’s advice for hiring good data product design and management staff (29:45)
- How data fits into Jonathan’s own decision making as CEO of Apptopia (33:21)
- Jonathan’s advice for emerging data product leaders (36:30)
Quotes from Today’s Episode
- “I want to just give you some props on the work that you guys have done and seeing where it's gone from when we worked together. The word grit, I think, is the word that I most associate with you and Eli [former CEO, co-founder] from those times. It felt very genuine that you believed in your mission and you had a long-term vision for it.” - Brian T. O’Neill (@rhythmspice) (02:08)
- “A research tool gives you the ability to create an input, which might be, ‘I want to see how Netflix is performing.’ And then it gives you a bunch of data. And it gives you good user experience that allows you to look for the answer to the question that’s in your head, but you need to start with a question. You need to know how to manipulate the tool. It requires a huge amount of experience and understanding of the data consumer in order to actually get the answer to the question. For me, that feels like a miss because I think the amount of people who need and can benefit from data, and the amount of people who know how to instrument the tools to get the answers from the data—well, I think there’s a huge disconnect in those numbers. And just like when I take my car to get service, I expect the car mechanic to know exactly what the hell is going on in there, right? Like, our obligation as a data provider should be to help people get closer to the answer. And I think we still have some room to go in order to get there.” - Jonathan Kay (@JonathanCKay) (04:54)
- “You need to present someone the what, the why, etc.—then the research component [of your data product] is valuable. And so it’s not that having a research tool isn’t valuable. It’s just, you can’t have the whole thing be that. You need to give them the what and the why first.” - Jonathan Kay (@JonathanCKay) (08:45)
- “You can’t put equal resources into everything. Knowing the boundaries of your data product is important, but it’s a hard thing to know sometimes where to draw those. A leader has to ask, ‘am I getting outside of my sweet spot? Is this outside of the mission?’ Figuring out the right boundaries goes back to customer research.” - Brian T. O’Neill (@rhythmspice) (12:54)
- “What would I have done differently if I was starting Apptopia today? I would have invested into the quality of the data earlier. I let the product design move me into the clouds a little bit, because sometimes you're designing a product and you're designing visuals, but we were doing it without real data. One of the biggest things that I've learned over a lot of mistakes over a long period of time, is that we've got to incorporate real data in the design process.” - Jonathan Kay (@JonathanCKay) (20:09)
- “We work with one of the biggest food manufacturer distributors in the world, and they were choosing between us and our biggest competitor, and what they essentially did was [say], ‘I need to put this report together every two weeks. I used your competitor’s platform during a trial and your platform during the trial, and I was able to do it two hours faster in your platform, so I chose you—because all the other checkboxes were equal.’ At the end of the day, if we could get two hours a week back by using your tool, then saving time and saving money and making better decisions are all equal ROI contributors.” - Jonathan Kay on UX (@JonathanCKay) (27:23)
- “In terms of our product design and management hires, we’re typically looking for people who have not worked at one company for 10 years. We’ve actually found a couple phenomenal designers that went from running their own consulting company to wanting to join full time. That was kind of a big win because one of them had a huge breadth of experience working with a bunch of different products in a bunch of different spaces.” - Jonathan Kay (@JonathanCKay) (30:34)
- “In terms of how I use data when making decisions for Apptopia, here’s an example. If you break our business down into different personas, my understanding at one time was that one of our personas was more stagnant. The data, however, did not support that. And so we’re having a resource planning meeting, and I’m saying, ‘Let’s pull back resources a little bit,’ but [my team is] showing me data that says my assumption on that customer segment is actually incorrect. I think entrepreneurs and passionate people need data more because we have so much conviction in our decisions—and because of that, I’m more likely to make bad decisions. Theoretically, good entrepreneurs should have good instincts, and you need to trust those, but what I’m saying is, you also need to check those. It’s okay to make sure that your instinct is correct, right? And one of the ways that I’ve gotten more mature is by forcing people to show me data to back up my decision in either direction and being comfortable being wrong. And I am wrong at least half of the time with those things!” - Jonathan Kay (@JonathanCKay) (34:09)
- Apptopia: https://apptopia.com/
- Email: email@example.com
Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today, I have my long-term—you’re probably the person I’ve known the longest so far in the 80 or so episodes we’ve done. This is Jonathan Kay, the CEO of Apptopia. What’s going on, Jonathan?
Jonathan: Hey, Brian. Welcome. Welcome [crosstalk 00:00:50]—
Brian: Yeah, yeah.
Jonathan: —you’re making me feel old early. I like it.
Brian: I know. [laugh]. It’s been seven years or so since we worked together and you run a—I want you to talk about it, but briefly, my perception of it from when I worked on it with you is, you provide market insights for app developers—which are called publishers in the, kind of, app lingo—and you help them understand the competitive landscape, who’s getting downloads, rank history, stuff like this. So, if you’re publishing mobile applications in iTunes or Google Play, you can find opportunities there, you can see what the competition is doing, this kind of thing. So, that’s what Apptopia does. Did I get it right, or is it different now?
Jonathan: No, I think you got it pretty right. Only clarification I would make is that it’s pretty much brands. Most mobile publishers are brands. And so we provide competitive intelligence, mobile competitive intelligence, for brands. And, like, one of the cool iterations of things that have just evolved over the last couple years is, we have a pretty big footprint in public finance. So, like, hedge fund and banking segments as well.
Jonathan: So just, like, lots of different ways, you could apply the data to different use cases.
Brian: Got it, got it. So, when we were together, you were really running the product side of that business, and since I’ve gone on—and actually, before we even jump into all that stuff, just for my audience, I want to give you some props on the work that you guys have done and just kind of watching where it’s gone from when we worked together. And also, the word grit, I think, is the word that I most associate with you and Eli from those times, as I don’t think in all my clients I’d ever seen that level of grit and passion, almost to the stereotypical idea of what, like, a startup founder is, living and, kind of, breathing that without all the BS that kind of goes with that. At least it felt very genuine that you believed in that mission and you had a long-term vision for it, so I just wanted to give you guys some props on your passion for what you were doing.
Jonathan: Yeah, I appreciate that your translation of yelling at each other at 7:30 p.m. is grit.
Jonathan: I am very grateful that this is how your brain has transformed that memory. So, I’ll take it fully man, I take it fully.
Brian: Yeah, yeah. On our catch-up call when we, like, planned—and the reason I had you on the show here is I really wanted you to talk about what it means to productize data into an analytic service and make this quote, “Successful,” whatever that means. And I think you said, quote, “We haven’t succeeded with this yet,” and you kind of felt like you’d taken a lot of wrong turns. Which kind of surprised me because you guys have grown quite a bit with your funding and your revenue over the years. Why do you feel like you haven’t succeeded, and where are you supposed to be? And what’s that gap? Tell me what’s behind that.
Jonathan: Yeah. So, when we first kind of envisioned this product—and I don’t know if this was because of our vision, or because we were jaded by some of the players that were already in the space when we started, and that we looked for comps to—we built a research tool more than we built, like, a data product. And so today, we have a lot of good data. We’ve spent a lot of time making that data accurate, we’ve spent a lot of time increasing, like, the breadth of that data—so, like, different data points that help you get different vantage points of the same problem—and I think we should talk more about using data to get different vantage points. But before I get there, what we did is we packaged it together in a research tool. And so essentially, what I started realizing a year or two ago was that when I walked you through the tool, you had a very different experience than when you walked through the tool yourself, because you needed to understand how all the different data points kind of correlated together.
And oftentimes, a research tool doesn’t do that. A research tool gives you the ability to create an input, which might be like, “I want to see how Netflix is performing.” And then it gives you a bunch of data. And it gives you good user experience that allows you to look for the answer to the question that’s in your head, but you need to start with a question, you need to know how to manipulate the tool, it requires a huge amount of experience and understanding of the data consumer in order to actually get the answer to their question.
And for me, that feels like a miss because I think the amount of people who need and can benefit from data, and the amount of people who, like, know how to instrument the tools to get the answers from the data—I think there’s a huge disconnect in those numbers. And just like when I take my car to get service, I expect the car mechanic knows, like, exactly what the hell is going on in there, right? Like, our obligation as a data provider should be to, like, help people get closer to the answer. And I think we still have some room to go in order to get there.
Brian: Yeah. Jared Spool has the best framing for this—and we talked about this in the show—and it’s called Tool Time versus Goal Time. And the level of tool time that a lot of analytics products put on customers can be really high. And so—with the idea, “Well, it’s so flexible. And it can do—we can let you query any of these 3000 fields in the database and correlate it with any other thing you want.”
But with that flexibility comes the cost of friction, which is, “Well, I don’t—what should I be looking at?” Or, “What’s unusual right now?” Or, “Where should I pay attention?” Like, I don’t know what questions to ask. Like, tell me what’s interesting. It’s kind of that tennis match: “Well, what do you want to know?” “Well, tell me what’s interesting.”
And it’s like, the user’s kind of stuck in the middle. So, finding that balance between flexibility and providing value immediately, without requiring a lot of tooling effort, is the goal. I think it’s a good place to go aspirationally to move away from the tooling side.
Jonathan: Yeah. I think you and I—Brian, and I’ll speak on your behalf here, so you can correct me if I’m wrong—I think we’re probably both a little bit geeks in our fields. And so when you and I, almost together, were designing a product, we were almost missing a perspective, which was, like, the Digital Marketing Manager at McDonald’s, where using the data is, like, 4% of what they do, and how much effort does that person want to invest in order to get the answer? I think earlier in my journey, I let my passion for how I look at the data almost overcompensate for some of the gaps that exist in getting people closer to the answer.
Brian: Can you double-click on that, as Shane Parrish would say? Can you make that a real example? Is there something in the product that you changed? Like, we started out with this thing, and then we found out through people like this McDonald’s marketing manager, and so and so and so that’s not what they wanted, or they weren’t able to use it, and we made this change. Can you talk about any of those changes that you’ve made?
Jonathan: Yeah, so we’re kind of in the process of making a lot of those changes. So, I’ll speak just, like, generally about it, and what I’ll say is that the framework that we’re using is, like, what? Why? And, what now? So, it’s like, [laugh] somebody walked into Starbucks. That’s the what. Why did they walk into Starbucks? Well, because they saw a push message on their phone, for the fact that they just reached a goal, a certain number of free stars on their phone. That’s a potential why that they walked into Starbucks.

And the question is, like, what now? What do you want to do with that information? Or what’s the next question that you have? And so typically, the framework is, like, there’s an event, there’s a potential cause for the event—and that could be plural, like, potential causes for the event—and then you need the ability to, like, research, or act. Like, when somebody presents you with potential causes, Brian, you either have an action you want to take, which likely you need to take in a different place than the place you got the insight, or you might have an additional question before you can take action.

And so what now could either be, all right, boom, Apptopia gave you the information you needed; now you’re going to go run an ad campaign or change this or send out a message. Or the question is, okay, like, Starbucks acquired more users because they started paying for more keywords on Apple. Well, hold on. What keywords are they paying for? So, you might need to go one or two levels deeper until you get to the point where you have enough information to actually take an action. And that’s kind of the what now.

And if I even just take it full circle, man, that’s when the research component should be there. So, you need to present someone the what, the why, then the research component is valuable. And so it’s not that having a research tool isn’t valuable. It’s just, you can’t have the whole thing be that. You need to give them the what and the why first.
Brian: Yeah. And I think, Jonathan, that you’ve laid out—and people listening to the show will know I’ve talked a lot about designing for workflows, and you’ve heard of the jobs-to-be-done framework—but the point here is that someone didn’t just decide out of thin air to sit down at an analytics product and do something. There is something that came before and something that will come after, and being able to connect the before to the now to the after, and thinking about this workflow, is really important. So, you laid out a good example there of the questions they’re going to ask.
And then you get to that point—I’m assuming maybe you can talk about this, where’s the boundary of Apptopia? Where’s the boundary of our product? And at what point do we say, that’s a wall? You’re outside of the walled garden now. Now, you’re into your other tools, or whatever.
Is that something you think about? It’s kind of like that’s really getting outside of our product, but we know someone is going to click to Facebook ads business platform, or whatever, and run a campaign, but now they’re out of our ecosystem; it’s not our job to do that. Is that something you guys think about is kind of this moving between platforms, services, like this workflows, this kind of thing?
Jonathan: Yeah. Like, boundaries [laugh] are, like, an ongoing debate. And so it’s still up in the air, but here’s what I feel pretty confident in, at least for Apptopia: we’re, like, an intelligence platform. Like, our job is to give you signals, and we don’t want or need to be the platform where you act on those signals.
Some of the biggest companies in the world do that: Facebook, Google, Twitter, et cetera. Like, I don’t want to compete with them; I want to make their users smarter. And so I think there’s a clear kind of wall between insight and action. Actually, the interesting debate is something like news stories. Think about all the different data sources where, like, a news story could theoretically be one of the whys that an event occurred, like a spike occurred, right?
So like, we were—and I’m not a TikTok user, but one of our customers had a huge spike in downloads, and we were talking to them and they were saying it was as a result of, like, a TikTok influencer recording, like, a funny video using their app, and they got a bunch of downloads as a result of that, right? And so I’m probably not going to, like, scour TikTok; like, that’s probably not an angle that we’re going to take, but I do think there is a discussion to be had. Even though we don’t own that data, do we have some responsibility to aggregate and feed some version of that data into the product as a potential why, right, even if the why isn’t within our data set? If the what is—what’s the obligation to present whys that aren’t in our data? That’s actually… well, I don’t know if I could ask you questions, but I’d be curious to hear your opinion on that. Like, where do you think that line is? From the news story perspective?
Brian: Well, I think two things. I mean, obviously, doing customer research will expose some of the answers to those questions. How much do they already have a process in place to go do that work, whether it’s ad hoc or some other platform they use? There are news aggregation tools that do that, kind of, I know, like, some publicists will use those things, right, when they’re chasing top—they’re trying to chase all the activity around their client’s story, or whatever that may be. So, I would probably want to understand that and see, well, if we do that, are we really adding a lot more? And then what are we taking away from?
Because, you know, I’m sure as you know, being the CEO, you only have so many cards to play; you can’t put equal resources into everything. What are we going to not do if we’re going to do that? So, I think knowing those boundaries is important, but it’s a hard thing to know sometimes where to draw those. Like, am I getting outside of my sweet spot? Like, is this outside of the mission?
You know, [laugh] it’s fuzzy. When you know that you’re just part of the ecosystem, you are not the entire ecosystem, you’re the insights part, but some of the lines are a little blurry. Like, the insight could be better if we, like, tapped into all this stuff, but building all those API connections and maintaining all that infrastructure—like, is that really—what level of value are we adding? I think it goes back to customer research.
Jonathan: And you want to keep the customer in the platform, right? Like, until they’re ready to take an action. And so it’s hard, and that’s one we don’t know the answer to yet. But like, that’s where the gray line is for me. I think insight versus action is clear.
Brian: One of the biggest challenges that data product teams have, especially internal teams—so a lot of the—I don’t even know what the split is because, ironically, there’s no good analytics on podcasts, but when I started the show I kind of had two discrete audiences in my head. One of them is companies like yourself—so software industry companies that have SaaS products, intelligence tools, analytics in market spaces, things like this; and then you’d have internal enterprise data teams. So, these are teams of analysts, data scientists, statisticians, et cetera, often either working in Tableau, Power BI, these kinds of tools, or they’re developing new applications, prototypes for exercising predictive models, and seeing how different futures might play out based on large datasets.
The common thing that all of this community has is low use, low trust; the adoption’s not there. We gave them what they asked for and they didn’t use it. Do you have any thoughts, any insights on this? Because this is a general challenge: getting people, especially non-quote, “Data people,” that McDonald’s marketing manager you’re talking about, getting someone like that engaged to actually make some decisions and believe the information. I have my opinions about why that is, but I’m curious if you have any learnings from this—because you have the higher bar, and this is true for all the SaaS companies: someone has to even pay for it. Not only do they have to use it, but there’s a paywall here, whereas with the internal enterprise teams, you’re often building stuff where it’s like, well, if you don’t use it, we’re a cost center now; instead of providing value, we’re just spending money. But the bar is so much higher when you’re revenue-dependent. So, comments on, like, designing for engagement, adoption, trust, making people feel empowered, anything you can share with us about that?
Jonathan: Yeah, I have two thoughts. So, the first thought is—and me and you were just talking about, like, a theoretical enterprise organization—but oftentimes what happens in large enterprise organizations is, like, egos prevail. So, you have some product manager that’s making a decision, and like, they think they know it the best, and so the analysts surface some data to them. And they’re like, “Okay. Like, that’s interesting, but like, this is my product. I have a good instinct. I have conviction, I’m going to keep going.”
One of the benefits that Apptopia has is, we’re competitive intelligence. And so we’re kind of saying, “Yo, jackass. Your competitor just did something. And here’s an event that—here’s something that came from that. It’s not my opinion. One of your peers is doing something right now.”
And that’s actually a little bit of a harder signal to not look at because it’s your competitor; your competitor’s making a move. It’s like you’re on the battlefield, some troops start moving. You need to kind of know what’s going on. And it’s not, like, made up, right? It’s not an opinion. Something actually happened. And you need to understand what it is.
Brian: A decision was made.
Jonathan: Yeah. And by the way, you can be smarter by seeing how somebody else’s decision played out. And so there is some component of that, that I think is harder to look away from. The cool thing is that I think there’s just a lot of competitive intelligence out there, even outside of our industry. I think the amount of competitive intelligence is increasing dramatically, and so good analysts can, like, weave some of that into their own views, and use that as a benchmark to drive more wow-factor or shock-factor internally.
The second thing is that some of the times where we’ve seen—I think, what you’re saying is, like, paralysis, like, lack of action, is it’s just too much data. And so in our reporting to our customers, we try and distill it down to, like, three or four major events that we think have happened. And we give them the ability to research a bunch. But oftentimes, when people present me data, or when I’ve seen it in other companies, they’re presenting a ton of things because they almost want to show that; they want to show you all the work that they did to get to the answer. But the real strategy is, “Give me the one thing, give me, maybe the two things, and if you’re giving me three, they better be, like, earth-shattering.” But like, I don’t think people distill it down into the one thing you can’t not read.
Brian: Right. I have this framework that I created called a CED framework, and it just means ‘Conclusions, Evidence, Data.’ And it’s a framing for designing analytics products that get used. The point being there, most of the time, the customers and users need a conclusion, an opinion, something stated to them before they need evidence about how you came up with it. It doesn’t mean the evidence isn’t important, but in terms of the way it’s presented, and how it’s presented, there needs to be a strong conclusion first.
Most people don’t want to do the work to derive the conclusion from the evidence on their own. So, I totally agree with you on that. And at some point, they may just begin to trust the evidence all the time, and they don’t need to even see it anymore. It’s enough for Apptopia to just say, “This thing spiked. We think it’s because of this, we think—we don’t know for sure, but”—that’s enough.
Like, I’ve believed it enough. I’ve seen the evidence in the past, I don’t need to go look at how they came up with that every single time they came up with that. It’s enough to start believing it. So, I don’t know if that’s what you’re talking about. That’s kind of the framing I think about and so many tools, especially—even software products, I see that they get this wrong, which is mountains of evidence, giant data tables with huge amounts of grids of numbers, and it’s just, like, as one of my old clients called it, “The metrics toilet,” you know?
Jonathan: Yeah. Yeah, we share that view. Like, I love playing cards, and one of the tenets of good card-playing is you don’t play your best card all the time right away, right? And so, like, sometimes people are going to ask you, “Okay, like, how did you get there?” Well, then you have the evidence, but sometimes you got to keep one of your cards in your pocket here, and there’s a sequencing of how you communicate information that I think most people miss.
Brian: I’m curious, when you look back since we worked together, I’m sure you’ve grown a lot in your business and what you’re doing. Is there anything you look back and, like, “Man, I wish I’d known that then?” What would you have changed? What would you tell your 2015-self about Apptopia, and your work and in the product?
Jonathan: Yeah. This sounds—well, I guess it all sounds obvious when you have hindsight bias, but, like, I would have invested into the quality of the data earlier. I let the product design move me into the clouds a little bit because sometimes you’re designing a product, and you’re designing visuals, and you’re doing it without real data, right? Maybe you’re talking to real customers, but you’re doing it without real data. And one of the biggest things that I’ve learned, again, over a lot of mistakes over a long period of time, is that we’ve started to incorporate real data in the design process, like, very early in the design process. Because sometimes, by the way, you can design a perfect user experience that, like, highlights the bugs in your data, right, or, like, highlights things that you don’t want to bubble up. And so understanding of data and data dimensionality and data edge cases, and how a design highlights different components of the data, is something that I just didn’t understand the importance of earlier in my career. Looking back, I would have invested more into the quality of the data just earlier. It took me probably a couple years after you and I worked together to, like, really get hit in the face with that.
Brian: And tell me about how has that changed the product design process—if at all—how your product teams or user experience design teams work together with the data people, quote, “Engineers,” or whoever it is that is on the front lines of that? Has it changed the way you guys approach new product development, new future functionality, story development?
Jonathan: Yeah, so two things. One, it’s like changed how we view the staging server. So where, like, the staging server was previously mostly like a QA element, it’s now the sandbox. And so we kind of design, like, 80% of the user experience, drop the data into the 80%, play around with it, and then we finalize—we don’t overdesign it before—or we don’t like, say that we’re at one hundred percent final design until we’ve actually played with the data in some version of the container. And there’s ways to do that in Tableau and other BI tools, where you can play with the dimensionality of the data there as well.
We just don’t happen to have as much experience with those tools, but like, there’s a whole ‘nother step there now, which is—maybe prototype is too strong of a word, but it’s like a stage pre-QA, where you kind of pressure test your design with real data, or with seemingly real data prior to moving into, like, finally productionizing the design. Does that make sense?
Brian: Yeah, I call that design QA. You know, it’s the best word I found, which is, like, “We need to exercise this. This isn’t to figure out if there’s a coding error; this is to see if it’s actually useful and usable.” Because sometimes, like you said, you plug in that real fire hose of water coming through the pipes, and it’s like, oh, wow, the scaling’s off, and, like, most of the companies we’re looking at—like, nobody has this data. You just start to see all these things that you didn’t know to ask about, because you just can’t see it when you look at a model or look at giant rows of information in a structured database. You’re not going to see that stuff.
So, I’m fully with you there. And I always talk about how you need to design with either real data or realistic data, even when you’re doing prototyping of screens and things like this. And I’m sure you’ve probably seen this, but if you’re doing usability testing, or showing a customer something, if you show them, you know, Netflix’s, like, downloads, rank history, revenue, and some designer just popped in whatever numbers, like they had 1,000 downloads last week—
Like, what the hell happened? Like, “What is going on?” Like, the lights, like, the sirens are going off in the user’s head. It’s like, “Oh, well, it’s just a number. That’s not what we’re testing.” It’s, like, “But what happened? Why is it so low?” And they can’t let go because it’s so unrealistic.
You don’t want any of that stuff biasing things when you’re trying to figure out if the design is right. So, that realism is really important, because unrealistic data can really skew trying to understand whether the design is right; the data is totally foreign, it doesn’t make sense. I learned that the hard way in financial services, doing stock research tools and stuff. You put the wrong pricing in for Apple, “Oh, it’s trading at $10 a share,” and people are just, like, losing their minds. They want to know, is this real? Did this actually happen? They want to check the news during the usability study because they don’t know that it’s fake. [laugh]. Like, they just can’t let go, you know? [laugh].
Jonathan: But it ends up distracting, right, like—
Brian: It’s terribly distracting.
Jonathan: —and [crosstalk 00:24:37] from the goal of exercise. So I—yeah, we’re aligned.
Brian: Totally. Yeah. Question for you about the selling of Apptopia. And I don’t know how much—I know at the time when we were together, there was a, you know, self-service, sign-up-myself-individually option, and then there was an enterprise, pick-up-the-phone kind of deal. I’m curious if there’s something different about the enterprise sale, and specifically the relationship between Apptopia and the buyer, and Apptopia and the users of the platform.
And what I’m getting at here is a lot of times in enterprise sales—and this is changing, I think—the person that would be, quote, “Your customer,” is not necessarily the person that’s going to use the service. And I’m wondering how you guys deal with that? Have you seen a difference there between the end-users who are going to actually sit in front of Apptopia and use it for their job versus the buyer, and how do you think about that differently? Because value can have different definitions. Or maybe it doesn’t. Maybe it’s like, if it looks hard to use, then they’re not going to buy it. Like, that is a factor in the buying decision. Can you talk a little bit about that?
Jonathan: Yeah. So, what we see is—and I’ll just stick with my, like, made-up McDonald’s example—so what McDonald’s would do—theoretically, again—is they’d have different functions within McDonald’s, maybe McDonald’s International, McDonald’s US, marketing, product. Each one of those departments surfaces one or two data points that make up an executive report card that is then served to the management team every month. And so our data might feed, I don’t know, 15% of the metrics on that report card. And so in some ways, maybe the CMO or the executive team at McDonald’s is the stakeholder of that report card, but they don’t interact with our data other than in that report card.
And so in that case, like, I actually don’t know that the executive team necessarily even knows that people are buying Apptopia, as much as they’re just happy that the report card that they’re getting touches on different devices across different digital platforms. What I think is potentially a more helpful answer to your question is that we have found that, like, ease-of-use, ease-of-access to data has been a non-trivial selling point. And it’s because oftentimes the person who is our champion or our primary POC, they need to work with the data to put it in a form that either their boss or their boss’s boss wants to see. And so, while it doesn’t make the decision, there are other checkboxes that need to exist.
If all things are equal between us and our primary competitor—I’ll give you an example: we work with, like, one of the biggest food manufacturer-distributors in the world, and they were choosing between us and our biggest competitor, and what it essentially came down to was, he’s like, “I need to put this report together every two weeks, and I used your competitor’s platform during a trial and your platform during the trial, and I was able to do it two hours faster in your platform, so I chose you.” All the other checkboxes were equal, but at the end of the day, if you can take two hours a week back by using our tool, like, saving time and saving money and making better decisions, they’re all kind of equal ROI contributors. And so depending on the stakeholder, I do think ease of getting to the data in a format that they can work with—maybe it’s a Snowflake integration, maybe it’s an API, maybe it’s that we’ll dump the data into an S3 bucket—like, that flexibility of how we distribute the data to them, in addition to just the web tool, is a decision criterion in probably, like, 40% of our deals.
Brian: Yeah. And was that something that you guys had explicitly designed? You had thought about that experience, that scenario, this person was—
Jonathan: No, it started with something more ignorant and generic, which is we wanted to service the shit out of our customers, and we kind of went in with that mentality. So, I essentially created, like, a team of engineers whose job it was to be a resource for our customer success people. And after a year or two of doing that, we realized that, like, 85% of what that team was doing was formatting reports. Which could be, like, aggregating data, slicing data, even something like, “Hey, I want to see the top 100 companies that do X.” You actually have to touch, like, 10,000 companies to build the top 100, right, and most of these brands, they don’t have, like, Spark infrastructure, they don’t have tons of experience with Hadoop or other, like, major data aggregation platforms, and so we found that there is meaningful value in doing some of the data manipulation for them.
It’s almost, like, data admin work, but it’s expensive to do, time- and money-wise, if you don’t know how to do it. And so we just found that was a need, and we kind of triple-clicked on that as a result.
Brian: What advice or tips do you have for—this show is really targeted more at management and leadership, in terms of hiring people with a design and product mentality for this? Because when we talk about data, we usually jump to data engineers, data analysts, data scientists, and especially in the enterprise, the challenge there is there are very few people that have any type of product or design perspective on how do we actually get someone to use it, to trust it, to want it, to make it actionable? What do you look for in the people that you hire? What skill sets? How important is it? Is it not? You could say it’s not, too; I don’t know. What are your thoughts on that, on what do you look for?
Jonathan: Primarily curiosity. So, I think that environment is changing rapidly, and I end up learning a lot, like, trial by fire. But as our company is growing, like, I want more people that have more experiences. And so we’re typically looking for people that have worked at, you know, not one company for ten years, they’ve had, like, different companies. We’ve actually found a couple phenomenal designers that went from running their own consulting company to wanting to join full-time.
That was kind of a big win because that person had a huge breadth of experience working with a bunch of different products in a bunch of different spaces, and I think you can probably relate to this, Brian, but, like, you kind of take maybe, like, half a percent of something from here, maybe two percent from there. But, like, having breadth of experience, for me, that’s what I think about with curiosity, which is, if someone’s talking to me about a book they read or this really interesting change that they just saw on Amazon’s site, like, how observant are they? And so I kind of almost start, like, a very human dialog with people, learning about their interests, and from the things they’re sharing with you, you can start to figure out what’s the level of detail that they’re actually sharing those insights at, and you can start to figure out if this is somebody who, like, when they see something interesting, stops what they’re doing and learns more about it, or if they just take a cursory glance at it. It is a little bit qualitative, but it’s something that we’ve implemented into all of our, like, product recruiting processes.
Brian: Cool. Thanks for sharing that. And do you—what’s the effect on the product that you’re seeing as a result of that style of hiring? Can you observe something different about how you guys are making product for customers as a result of that focus?
Jonathan: Yeah. So, like, the kind of almost obvious answer is, I don’t have to be as involved. [laugh]. Which I think is, like, for any good leader or manager, like, the primary measure of success. You know, I’m, like, a product-focused entrepreneur, and so I have the disease of thinking nobody can do it as well as me. And when you start to hire more curious people, and you have those moments where you’re like, “Huh, shit, that’s pretty good,” and you start to have more of those than not, it’s, like, kind of this good indicator that you can step back and invest your time into other areas that might need more of it.
And so, again, it’s not, like, an analytics-report measure, but, like, for a good leader, those are memorable moments, you know what I mean, where they didn’t need to be involved in something, and the result was, like, B-plus or higher. That’s, like, what I live for these days. And we’ve seen more and more of that as we’ve increased the quality and curiosity of the people we’re hiring.
Brian: Tell me—I’m going to flip the perspective on this conversation a little bit, since we’re talking about data here, and I tend to think of most analytics as ultimately being about decision support. So, am I being enabled to make a more informed decision? So, I’m curious, the gut runs strong for us, especially with leaders. When was the last time that you used data in a decision about Apptopia’s business? Maybe you could tell me an anecdote about your gut wanting to go right while the data is telling you to go left? How does data fit into the way you run the business at Apptopia in your own leadership? Or does it? [laugh].
Jonathan: It happened this morning, honestly. It literally happened this morning where somebody told me something and I, like, more or less was like, that’s bull—like, I’m calling bullshit, right? And then they pulled up the data and they showed it to me. And what it did is it stopped me because I had an assumption based on, like, something simple, which is, like, the growth rate of certain sectors of our business. So, if you, like, break our business down into different personas, my understanding was that one of our personas was more stagnant.
And, like, the data did not dictate that, and so we’re having, like, a resource planning meeting, and I’m saying, like, “Let’s pull back resources a little bit.” And they’re showing me data that says my assumption on that customer segment is actually incorrect. And so, I think it’s kind of—I don’t remember the root of the question, but the point is that I think entrepreneurs and passionate people need data more. [laugh]. Because I’m so passionate as an entrepreneur and I have so much conviction in my decisions, I’m more likely to make a bad decision—that’s not fair.
Like, theoretically, good entrepreneurs should have good instincts and you need to trust those, but what I’m saying is, you also need to check those. Like, it’s okay to make sure that your instinct is correct, right? And one of the ways that I’ve gotten more mature is by forcing people to show me data to back up my decision in either direction, and, like, being comfortable being wrong, [laugh] I think, is pretty important there. And I am wrong at least half of the time with those things, so.
Brian: Well, and being able to be open about that you are subject to being wrong, and that you want to be informed, I think is half the battle because of ego. I think a lot of these places, it’s ego, and people aren’t ready to accept the fa—well, we’ve done it this way for ten years, and for the first time, we have data to tell us about how we’re doing. And it’s like, “Oh, my God.” I can understand from a human level, it might feel like, “Wow, I’ve been totally steering this ship the wrong way for ten years.” And it’s like, that’s a tough thing for anyone to swallow, but like I think a good leader makes a decision to say, “I was wrong. New information here. We’re going this way now. It’s time to turn the boat.”
Jonathan: And if you look at some of the leaders who you’d respect the most in the business world, they don’t have a track record of just consistently doing the same thing, right? I’m sure somebody in Apple was like, “Why the eff would you ever build a phone? Who are you?” Right? “Like, that doesn’t make any sense. Like, you could show me as much data as you want, but like, we’re not a phone company, right?” And like, yeah, I think you need to be comfortable making different decisions, otherwise, data or no data, your longevity is probably not there, you know what I mean? Unless you’re lucky.
Brian: Yeah. Yeah.
Jonathan: Very lucky.
Brian: Jonathan, this has been a really great conversation. I have just a couple final questions here. The first one is, any closing advice for data product leaders? You’ve had some wins even if you think you’ve had [laugh] some missteps along the way. I think you guys are doing some great work over there. Any advice you can pass on to aspiring data product leaders like yourself, or current ones who may be struggling a little bit on the adoption piece, the revenue piece, making the experience easy?
Jonathan: Yeah. My advice would be, think about how mad you would be if the car mechanic asked you how you wanted them to install your new brakes. And, like, just be more opinionated. My answer is be more opinionated, because your customers are probably coming to you for help or insight or expertise, and by building something more generic, you’re not increasing your TAM, you’re just diluting the value of your own intellectual property and expertise.
Brian: Yeah. I think that’s really sound advice. I totally agree with you on the opinion thing. And you’re probably going to also attract people that resonate with that opinion, and maybe you lose some people along the way, but if you’re, like, hitting home runs with that cohort of people that get why you doubled down and made this one thing really easy, useful, or valuable, you’re probably on a better track there. I always like to ask this question at the end of the show—and I want to get your contact information and how people can find you, but is there any question I didn’t ask you that I should have?
Jonathan: Um… [laugh] I guess I thought you were going to ask me what analytics tools we use. But, uh—
Jonathan: Yep, yep.
Brian: Wrong show. [laugh].
Jonathan: Yeah. No, yeah.
Brian: That reminds me of the drum clinics. You know, I’m a drummer, and it’s, like, Chad Smith of the Red Hot Chili Peppers. I’m going to [pace 00:38:15] at a conference. You know, there’s 4000 people, like, for this drum clinic, and, like, you can ask Chad Smith anything you want. And it’s like, some kid’s like, “What kind of sticks do you use?”
Brian: It’s like, we got him for 40 minutes. You know, this guy’s on stage for 40 minutes, and he’s going back on tour. And it’s like, “Really? That’s what you’re going to ask?” [laugh].
Jonathan: I’m happy you didn’t ask it to me, for what it’s worth. Like, I thoroughly enjoy this dialog way more. Yeah.
Brian: Good. No, I’m not the tools guy anyway, so. [laugh].
Jonathan: Good. Good.
Brian: Awesome. Well, where can people find you… on the internet? Anything coming up we should know about Apptopia? Anything you’d like to tell our audience about how to be in touch? What’s the best way?
Jonathan: Yeah, I’m firstname.lastname@example.org, and we are kind of competitive intelligence geeks, so we’re happy to talk about that stuff all the time. And even if it’s not your thing, we have a meaningful amount of free content that we distribute weekly, just on observations we’re seeing in this space. And so yeah, we love data, so happy to talk to any other geeks out there as well.
Brian: Excellent, Jonathan. Well, thank you so much for making the time to do this. It’s been great to catch up with you and see where you guys are at. So, congrats and good luck in 2022.
Jonathan: Thanks, man. Feeling is mutual.
Brian: Excellent. Take care.