Transcript | Bringing the customer closer to the decision with AI | Pierce Healy (Co-founder & CEO @ Zelta)
The transcript from my podcast with Pierce Healy
Pierce Healy 0:00
I think if we zoom out and think bigger picture, we're obviously building on top of LLMs, but we also embrace them ourselves from an engineering standpoint. We use the copilots; our engineering team uses GPT every day, and we're just rolling them out for coding. I would say we've easily increased our capacity 10x in the last six months from these tools. It's really, genuinely extreme. And we're obviously in a position to do that because we understand the models we're building on top of. Projected out three to five years, it's pretty obvious that every technology company is going to be at that level as well, right? We're going to see at least a 10x improvement in speed across the board. The thing about it is, if you're moving 10x faster in the wrong direction, you're not doing yourself any favors.
Max Matson 1:00
Hey, everybody, welcome to another episode of Future Product. Today my guest is Pierce Healy, the co-founder and CEO at Zelta AI, the AI product that helps you understand what your customers want without sitting through hundreds of calls. Pierce, would you mind telling us a little bit more about yourself and your journey in tech before this?
Pierce Healy 1:16
Yeah, absolutely. Thanks for having me on, Max. So as you mentioned, I'm CEO of Zelta. We started the company last September, so we're pretty early stage, but we're hitting a real need in the market, so it's an exciting point. Before starting the company, I was a strategy consultant at McKinsey, and before that at Deloitte. For the majority of my time in consulting, I served technology companies and really helped them use the voice of the customer to build better products, make more targeted marketing campaigns, and generally drive more growth and customer satisfaction. So a lot of that background I'm bringing into what we're building at Zelta. In my consulting pathway I was probably serving bigger companies, and with Zelta we're serving companies at around the 500-person scale, so there's a slightly different nuance to the problem. But really, the big picture is the same.
Max Matson 2:09
Very cool. You mentioned the term "voice of the customer". Would you mind explaining what that means to you?
Pierce Healy 2:15
Yeah, absolutely. So at different scales, companies are going to have different interaction channels with their end users or customers. Typically speaking, we'll analyze datasets like support tickets and survey feedback, and, probably starting with our more recent customers, the early-stage companies, there'll be recorded sales and success calls. Basically any touchpoint with the customer is something that we want to bring into Zelta and then use to inform product strategy.
Max Matson 2:43
Perfect. Yeah, I've heard a lot of people express a need for what you guys provide, so I'm sure people are going to want to hear a lot more about it. But just to take a step back real quick: you consulted for some pretty big names, right? McKinsey and Deloitte. What motivated you to go from giant consulting firms to building an AI product?
Pierce Healy 3:04
Yeah. So the specific problem that I saw on the ground was that in consulting, I frequently served the head of customer support at tech companies as my client. They would mostly be interested in finding out, from their support agents and support tickets, where the opportunities were to save costs. So we'd use machine learning to analyze tickets and look at the cost of resolution for different types of issues, and the efficiency across different agents. But they also play this role of basically being a heartbeat for the rest of the organization, serving up insights to product and marketing teams on what the customers' top issues were at any point.
And then I really saw friction on both sides of that scenario. Support basically not wanting to do this extra activity that they don't get rewarded for, sending data to product. And product and marketing being frustrated that they couldn't just self-serve on the data that they needed. The systems of record in support, the big names everyone knows about, are set up to basically give the head of support the data he needs, and aren't really interested in giving product and marketing what they'd be looking for. So that was the problem I had in mind when starting Zelta; it was the problem I saw in the market. I also had some experience basically building a startup within McKinsey, as we'd call it, which is very different to a real startup: building a software product within the firm. One of the things McKinsey especially has tried to do over the last five years is differentiate some of their offerings by building software products in place of the more templatized analysis. So I had some experience building a product within the firm, found that it was a re-energizing experience for me, and wanted to do it in a bigger way. I guess I'd probably always had ambitions to do a startup, and I hit the pressure point last September and decided to go for it.
Max Matson 5:13
Right on, right on. Would you mind talking a little bit more about that product that you built within the consulting firm?
Pierce Healy 5:20
Yeah. So consulting, especially strategy consulting, is literally taking data from customers and generating insights from it. One of the categories where we built a product was basically a tech strategy automation product. We would take survey data from customers, survey data from internal people, and combine that with some financial data to essentially, somewhat, automate a technology roadmap. From our perspective it would never be the end result, but it was a starting conversation point for our clients to work around. Building that out within the firm meant engaging with stakeholders across the globe to get input on it. We ended up going live with hundreds of clients, and it was a key piece of all the engagements that I was working on.
Max Matson 6:13
Perfect, very cool. I'm glad that we set the stage. So moving to Zelta: would you mind explaining, just in broad strokes, how your product works?
Pierce Healy 6:24
Yep. So the big picture is really to build a customer insights platform that can be used across the full organization. But today we specifically focus on selling to product teams in SaaS, the obvious, most engaged user that we go after. For that user, especially within a sales-led organization, sales and success calls are probably the key thing that we analyze, and we automatically pull product insights out of those channels. So for example, from sales calls we'll pull out product gaps that keep coming up, the reasons for people not to buy your product; feature requests from existing customers; and, automatically, some competitor analysis wherever it comes up on calls, when people are comparing your product's strengths and weaknesses to others'. At a high level, that's getting a PM three things. One is that in most cases they're attempting to do this manually today, so we're saving them that task. Second is basically alerting them to churn risks and growth opportunities that maybe wouldn't be obvious without being on all these calls themselves.
And third, just giving a source of truth that go-to-market and product can align on for the top customer priorities. Because a typical challenge that most software companies face is that you'll get a different opinion on the top customer needs from, say, sales and product, and incentives are not exactly aligned between those two groups. At a very simplistic level, sales thinks about winning customers and product thinks about winning markets, which causes this friction over what the top priorities might be. So the customers we're helping use Zelta to have a source of truth, for the first time, on what the actual top customer needs are. If sales and product are having a conversation, they can point to that as one quantitative way of saying what the top user needs are.
Max Matson 8:25
Got it, got it. That makes sense. So it sounds like there's a reactive component, which is what people are saying, the feedback that they're giving across these distributed conversations, and then the signals that you're picking up on and proactively surfacing, to say, hey, there's a churn risk here. Would you mind talking a little bit more about that second piece, the proactive signaling?
Pierce Healy 8:47
Yeah. For us, what that means is calling out specific customers as being churn risks, to start with, or segments or personas that maybe aren't responding as well to the product offering today. On the growth side, we can call out, first of all, repeated product gaps that keep coming up where you're maybe losing deals, or even scenarios where a sales rep has promised something that's not on the product roadmap: if that customer were to sign, they're probably not going to be happy. So it's a bunch of those product-specific signals that say, definitely keep an eye on these accounts. And then alongside that, the same data that we're tracking is also a repository of information, so that when the PM is building a new feature, they can go into Zelta and see all of the use-case data they already have from customers.
And then, if needed, they can follow up with the customer from there; they already have a list of people who are probably going to be interested in it. And I would say today, like I said, our buyer is the product team, but even at this early stage we're seeing some interesting use cases for the rest of the organization. Sales and marketing are the obvious ones. Say a feature that came from some customer feedback gets built and we want to close the loop: maybe it's a prospect that a sales rep is speaking to who brought it up on a past call, so the rep can mention to them that it's now being planned for some point in the future. And also personalizing marketing campaigns. Instead of sending a generic product update email that's probably deleted, you send an email that's personalized to the end customer: we looked at your feedback last quarter, you said these things, and now we're actually building that. Which is a lot more engaging and, from the customer's perspective, shows that you actually listened to them.
Max Matson 10:34
No, absolutely, absolutely. And that is definitely something that I've seen product teams struggle with in the past. When you guys were designing the product, did you have that in mind, that land-and-expand type of motion: having a core with product teams, but then also being able to plug into sales and marketing and provide real value?
Pierce Healy 10:53
Yeah, to be fair, we did, and that was always the big vision. I think we were surprised how quickly it came about, specifically because we really designed the product to have a lens on the data that a PM would be interested in. We've allowed our customers to basically add as many users as they want, with a fixed price for 12 months, and that's created some interesting scenarios where the product is already getting used in those different use cases before we expected. So it's something I still think of as more longer term; we have a lot left to do to just make sure that our current customers love the product, so we'll be focused on product teams for the foreseeable future. But yeah, it's obvious to us already that that's going to be where we end up eventually.
Max Matson 11:37
Right on, right on. I love that, building those feedback loops into the product. So you're hyper-focused on product teams, and I've seen that that tends to be the hallmark of successful products in this industry: you have one core customer, and as valuable as you may be to others, well, working in marketing, I can tell you how difficult it is to balance that messaging when you're going after multiple ICPs. So having that core of providing value is everything. Now, when you were working in consulting, you said that you'd come across this problem before, where product teams are doing a lot of this stuff manually. What does that process actually look like, tangibly?
Pierce Healy 12:19
Yeah, so there are probably three scenarios. One is that the individual PMs are doing it for themselves: basically getting data from support tickets, getting data from surveys, getting data from the salespeople they speak with, or joining customer calls themselves. And if they're equipped, they might be doing some sophisticated data analysis with that. In a lot of cases, the bigger companies will have a team dedicated to this; they might have a product ops team that's responsible for triaging all this data, or a research team that's doing some of this work. And then obviously, in some cases, they're using consultants at the large scale. So we see it across the board. I think from our standpoint, the ideal scenario is companies who want to do this but today don't have a process, and we get to build it for them.
Max Matson 13:11
No, right on. Yeah, I would imagine budget for consulting, or just pure labor hours, aren't available for a lot of teams who would like to capitalize on that feedback a little bit better, right? So with all of this data being in the form of language, LLMs are kind of the defining force here. Would you mind explaining a little bit, just under the hood, how you're working with LLMs to make this possible?
Pierce Healy 13:36
Yep. We're working with a fleet of LLMs, I would say. They all have their different benefits for different tasks. You know, GPT-4 is the most powerful, but there's a shorter amount of data that you can fit into a prompt. So for different tasks we use different models, and we're constantly iterating, not stuck on any one, I would say. But the language models are definitely at the point now where, for the first time, we can take this very unstructured data and actually quantitatively visualize on top of it. I would say in our case, one of the lucky things in our history was the latest models from OpenAI, so 3.5 and 4, coming out right around the time that we were actually going to build our product. We were able to take advantage of those, and specifically, call transcripts are probably the noisiest dataset you could possibly work with, especially sales calls. From a product or marketing standpoint, maybe 1% or even less of those calls is relevant, and that noisiness made it quite difficult to do anything with those transcripts before, even six months ago. When we first started Zelta it would have been quite challenging to do what we're doing today, even last September. I think as a result, some of our competitors actually ignored calls as a starting point; they went after the other datasets. If you think about support tickets, surveys, and online reviews as the other datasets you might consider, they're much more structured in comparison, and the noise on those is much lower, so calls are much harder. So basically we had the opportunity, call us a latecomer, though even our competitors are all pretty early as well, to go after the piece of the market they'd been ignoring. That's what caused us to focus on calls.
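The per-task "fleet of models" routing Pierce describes could be sketched along these lines. The model names, token limits, and the characters-per-token heuristic below are illustrative assumptions for the example, not Zelta's actual configuration:

```python
# Illustrative sketch of per-task model routing by context budget.
# Model names and token limits are assumptions, not Zelta's setup.

def estimate_tokens(text: str) -> int:
    """Rough token estimate: about 4 characters per token for English."""
    return len(text) // 4

# (context_limit_tokens, model_name), smallest/cheapest first.
MODELS = [
    (4_096, "gpt-3.5-turbo"),
    (8_192, "gpt-4"),
    (16_384, "gpt-3.5-turbo-16k"),
]

def pick_model(prompt: str, reserved_for_output: int = 512) -> str:
    """Pick the smallest model whose context fits prompt plus output."""
    needed = estimate_tokens(prompt) + reserved_for_output
    for limit, name in MODELS:
        if needed <= limit:
            return name
    raise ValueError("Prompt too large for any model; chunk it first.")
```

The point of the sketch is the trade-off Pierce names: the most capable model is not always the one with the largest usable context, so the router, not a single hard-coded model, decides per task.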
Max Matson 15:38
Got it, got it. So, having done a little bit of transcription through AI for the podcast, I can say it can be a bear; it can definitely be challenging to actually get it accurate. Were there any significant challenges around that, and if so, how have you guys dealt with them?
Pierce Healy 15:54
Yeah. So from the standpoint of where the calls come from, we're plugging into the existing systems that hold them. We'll pull data from the likes of Gong, Fireflies, etc., since the sales and success teams are already using those, and the product research team will typically have their own solution for recording calls. So we're not doing the speech-to-text ourselves; that's kind of a solved problem that we don't really need to solve. And we've found the language models are incredibly good at dealing with poor transcripts, much better than a human, actually. They can deal with spelling mistakes; they can deal with words that have been mis-transcribed, just by understanding the full context. If you ever do read transcripts, it's actually quite a taxing thing for your brain, because words are mis-transcribed and there's no punctuation, but the language models are very good at dealing with that and still retaining the context. I'd say the challenge is more with the volume of data, really. To take one example: an individual GPT-4 prompt can take maybe a quarter of a phone call, and that would be the maximum that you could push into the API. The beta version of GPT-4, the 32k-token version that's not out yet, that would be about one phone call. So at no point in the next two to three years do we expect a scenario where we can pass hundreds of calls, which is what we're analyzing, to the API and get a response. For that reason, we've had to be very creative in our solution, because we're not just doing individual analysis on calls; we're doing this aggregated view: what are your customers talking about, aggregated across all these calls? So it's a challenge that we're solving in steps. We have a solution that works for, say, 50 calls, then we need a solution for 100 calls, and that breaks at 200 calls.
And that'll be an ongoing problem as we try to scale this up. But I guess the nice part for us is that tough challenges mean that if you solve them, you're building something defensible. So it's a nice problem.
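The stepwise aggregation Pierce describes, where no single prompt can hold hundreds of calls, is essentially a hierarchical map-reduce over summaries: summarize each call, then summarize the summaries in batches until one digest fits in a prompt. A minimal sketch, where `summarize` is a placeholder standing in for an LLM call, not any real API:

```python
# Hierarchical summarization sketch: "map" over individual calls,
# then "reduce" batches of summaries until a single digest remains.
from typing import Callable, List

def aggregate_insights(
    calls: List[str],
    summarize: Callable[[str], str],
    batch_size: int = 10,
) -> str:
    # Map: one summary per call (a long call may itself need chunking).
    layer = [summarize(call) for call in calls]
    # Reduce: repeatedly merge summaries in batches until one remains.
    while len(layer) > 1:
        layer = [
            summarize("\n".join(layer[i:i + batch_size]))
            for i in range(0, len(layer), batch_size)
        ]
    return layer[0]
```

One design consequence worth noting: the number of LLM calls grows roughly linearly with the number of input calls, so the cost of "solving for 200 calls" is mostly in the map step, while the reduce layers stay small.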
Max Matson 18:04
Yeah, yeah. No, 100%. And it sounds like you guys have a pretty good head start in terms of even doing this in the first place with the call transcripts, which I can totally see how valuable that would be. So what are some of the outcomes and workflows that you see Zelta enabling for product teams? What's the grand, sweeping goal for this?
Pierce Healy 18:25
I mean, the most obvious one would be justifying the roadmap, which is essential for any product team. Today a big challenge is, if something is on there and it's pivotal, being able to explain why it's been prioritized. There's a bunch of different datasets that are used to do that today, but it can be a political battle, because everyone has a different opinion on what the top priority is. So probably the biggest immediate use case for Zelta is that our customers are copying the insights from our product and pasting them into the roadmap: we're doing this because 86 customers asked for it last month, and if you don't believe it, go look at the data yourself.
Max Matson 19:06
No, I love that. I mean, something that I've talked about with a couple of other founders who are in the data space specifically is pulling all the data together. We've had these distributed islands of data across different departments for a long time; that's kind of been the norm. But today, what I'm seeing is a lot of that data centralizing, and then those data islands reaching in and being able to pull from that data. So what you're doing is arming the non-technical people, the people who are advocating for the voice of the customer, to say, hey, here's some empirical data on what our customers are actually looking for out of our product.
Pierce Healy 19:43
Yep, absolutely. And I think what's critical to this is that you need to be pretty opinionated in the questions that you're answering and the way that you analyze that data. It's not necessarily a dataset that you can just combine together and give complete flexibility to the end user on anything they might ask. So for us, the insights we're generating are pretty specific to the goals we're solving for, and obviously we continue to iterate on that. I think in any given use case of a large language model within an enterprise, you really need to know what use case you're solving to do it well.
Max Matson 20:28
No, absolutely. Yeah, it's crucial, it's crucial. So, extending that point: are there any practical success stories that you maybe want to share about Zelta, things you've enabled thus far?
Pierce Healy 20:41
So I mean, our core customer set is sales-led SaaS, where a lot of the things that we're recommending are pretty mundane, like integrate with HubSpot, or do XYZ, right? Nothing too interesting to most people. But I'd say one interesting example we've had, actually from one of our earlier customers, was a media company, which isn't our target market now, but was early on. The AI learned from the customer interactions that their product was getting used by a lot of people essentially as a remedy for feeling lonely. The product was this animal-character-based media product, and a lot of the people interacting with it were basically lonely and finding that interacting with the characters was helping them feel better about themselves. Once the AI learned that this was a key need the product was solving for the end user, it suggested as a recommendation to create interactive versions of these characters using generative AI: basically, allow the end users to actually interact with the characters they're watching in these snippets, and use generative AI to do it. I think what was interesting there is that obviously it was a really cool insight, but also the fact that the AI was kind of suggesting it should reproduce itself, applying its own capabilities to what it learned, which is somewhat of a scary thought. But it was cool to see, and for the customer it was actually a really incredible idea that almost led to a full pivot of their company in that direction. So that's one cool example, I guess.
Max Matson 22:43
That's fantastic, man. I mean, there's a part of you that wants to question the model's motives there, right? But how do you go from insight to recommendation? That seems like quite a step, and it's pretty amazing that you're able to actually make these kinds of high-confidence recommendations.
Pierce Healy 23:02
Yeah, I'd say our starting point was: let's give you data, and you make the major decisions. We've transitioned in the last couple of months to being more recommendation-focused. It's something that we push less to customers, because ultimately we're selling to the people who should be coming up with the recommendations; it's a starting point, at least, and it gives you something off the shelf: here are some things that we're seeing in your data. The recommendation side is probably the part of the application where we're letting the AI run loose the most. We're giving it the most flexibility to just use its brainpower and come up with things, so it's more black-box than other parts of the application that are very specific. And what we've seen already is great responses, with the caveat that the recommendations it comes up with are often things the company has already thought of, though maybe not in the same order. It's more like, here's the top 10 based on expected customer interest, drawn from things they already know about, but that order of priority is an interesting input to the conversation.
Max Matson 24:11
Yeah, no, absolutely, that's super interesting. It's a recurring theme, definitely, that AI is seemingly best used with interesting or novel datasets. So you guys have found this very novel dataset that you're really the only competitor in the market that's looking through. And I think it nullifies some of the concerns around the black box when the output is purely a function of the data. So the first issue is getting that high-quality data.
Pierce Healy 24:42
Absolutely. I think the other thing is, at this stage of AI at least, the use cases that make sense are ones that have a low impact when bad decisions are made. If we make some product recommendations and they don't take off, that's just as expected for any software company, right? In scenarios where the decision needs to be 100% correct, I think AI today is probably less suited, the obvious ones being, say, legal or health scenarios. But where we are today with AI, for what we're doing at least, there's not too much of a risk in taking the AI's recommendations.
Max Matson 25:28
Right, right. And it sounds like a lot of the time that corresponds with what they're already thinking; it just helps rank-order it by what's actually heard from the customer.
Pierce Healy 25:36
Yeah, and our ranking also considers the revenue potential of the accounts. I think that's a nuance of selling to sales-led, go-to-market companies: they don't think about users as individuals, they think about them as belonging to an account. So when we're making recommendations, it's not at the user level; it's taking in context at the company level. If a company has really high revenue potential, we want to push that higher up in the ranking, and all of that's considered in the weighting of these recommendations.
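Account-weighted ranking of the kind Pierce alludes to can be sketched very simply. The field names and the plain linear weighting here are illustrative assumptions, not Zelta's actual scoring model:

```python
# Toy version of account-weighted feature ranking: the same feature
# request counts for more when it comes from accounts with higher
# revenue potential. Field names and weighting are illustrative.
from collections import defaultdict
from typing import Dict, List

def rank_requests(requests: List[Dict]) -> List[str]:
    """requests: dicts with 'feature' and 'account_value' keys.
    Returns feature names sorted by total account-weighted score."""
    scores: Dict[str, float] = defaultdict(float)
    for r in requests:
        # Each mention is weighted by the requesting account's value.
        scores[r["feature"]] += r["account_value"]
    return sorted(scores, key=scores.get, reverse=True)
```

With this kind of weighting, one request from a high-value account can legitimately outrank several requests from small accounts, which is exactly the "account, not individual user" framing Pierce describes.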
Max Matson 26:08
Right on. So this is a tool that's made for product teams, and it seems like you guys have a very tight product focus. How do you see tools like Zelta, and some of the other ones that are coming out right now, changing the role as we go forward two years, three years?
Pierce Healy 26:25
Yeah, I mean, our ambition, really, to be honest, is to bring the customer closer to the decisions in the product. The intermediary for now might be product teams, and we're giving them the data to do that, but the problem that we're trying to solve is really the multiple middlemen between what customers are saying and what's actually getting built. I think if we zoom out and think bigger picture, we're obviously building on top of LLMs, but we also embrace them ourselves from an engineering standpoint. We use the copilots; our engineering team uses GPT every day, and we're just rolling them out for coding. I would say we've easily increased our capacity 10x in the last six months from these tools. It's really, genuinely extreme. And we're obviously in a position to do that because we understand the models we're building on top of. In three to five years, it's pretty obvious that every technology company is going to be at that level as well, right, and we're going to see at least a 10x improvement in speed across the board. The thing about it is, if you're moving 10x faster in the wrong direction, you're not doing yourself any favors. In a scenario where every company can do tasks faster, it puts a lot more emphasis on making sure that you're actually going in the right direction. And so we see Zelta fitting into that world: we have theoretically infinite engineering capacity, so let's just make sure that we're building the right things.
Max Matson 28:08
Right on. So would you say that it's still, at least for the foreseeable future, going to be humans taking it from zero to one, even if AI is able to scale it from one to 100?
Pierce Healy 28:18
You mean in terms of actually taking the decisions on what we're implementing?
Max Matson 28:22
Yeah, I mean, I would say like what you said, right? Sprinting at high speed in the wrong direction, I think, is everybody's fear.
Pierce Healy 28:28
Yep, yeah. I mean, definitely for the foreseeable future there'll be humans taking the outputs from Zelta and still prioritizing those based on what they know. At the moment, at least, we're a subset of the overall decision. There are other factors involved as well, right: the company's overall strategy, and other datasets, like interaction data, that you want to consider too. So longer term we'll probably try to take more of that pie as well, but for now we're one input.
Max Matson 29:04
Right on. I've seen a lot of people question, especially with data products that utilize AI, how they can trust and have high confidence in the recommendations and the insights. How do you guys back up what you're saying there?
Pierce Healy 29:21
Yeah, so what we've had to do is make sure basically every single insight in Zelta gets linked to its source. Every feature request is backed up by specific quotes that you can click through to see the original call or the original support ticket. Every recommendation points to the explanatory sources for why it came up with that. And that's definitely essential to people being able to use the product and believe it. Also, for the use case of justifying the roadmap, the AI saying we should do it is not enough; we have to show the actual source data behind it. It's also probably the biggest challenge for us from an engineering standpoint, maintaining that link, because one of the things LLMs don't do out of the box is say, this is how I came up with this. So it's quite a lot of work for us to actually make sure that it's giving insights that come from source data.
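The "every insight links to its source" rule could be enforced with a structure along these lines. This is a hypothetical sketch; Zelta's actual schema isn't described in the conversation:

```python
# Sketch of insight-to-evidence linking: an insight is only surfaced
# if it carries at least one quote traceable to a source call or
# ticket. The structure and field names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Quote:
    text: str
    source_id: str   # id of the original call or support ticket

@dataclass
class Insight:
    summary: str
    quotes: List[Quote] = field(default_factory=list)

def keep_grounded(insights: List[Insight]) -> List[Insight]:
    """Drop any insight produced without supporting quotes."""
    return [i for i in insights if i.quotes]
```

The filter is deliberately blunt: an unsupported insight is discarded rather than shown without evidence, which matches the "show the actual source data" stance Pierce takes.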
Max Matson 30:13
Yeah, no, I can definitely see that. We also have an AI product that works with product data, and it's something that we're still figuring out, and I think a lot of people are: the concern that if you ever give a wrong answer, that trust could be gone, right? And I understand with you guys it's a little different, because there's a rank order and you're linking to the data directly. But how sensitive do you think users of AI products today, in these early days, should be to incorrect data? Is that a reason to abandon a product?
Pierce Healy 30:49
I think that the LLMs have a massive propensity to hallucinate and make things up. So if you have a product that can't show its work, I would have a very hard time believing anything it says, to be honest, because a very high percentage of them do that. So a critical role for us is basically putting guardrails around that. From a model standpoint, I'm sure it's something the model makers are focused on, how to solve that problem. It seems like today, at least, the LLM is rewarded so much for giving an answer that it will never say it's not prepared for this; it will make something up rather than say it doesn't know. So it's a problem, for sure.
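One concrete guardrail of the kind Pierce mentions is checking that a quoted piece of evidence actually appears in the source transcript before surfacing it, so a hallucinated quote never reaches the user. The normalization step and the 0.85 threshold below are illustrative assumptions:

```python
# Guardrail sketch: verify a model-extracted quote against its source
# transcript, with a fuzzy fallback for minor transcription noise.
from difflib import SequenceMatcher

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace for tolerant comparison."""
    return " ".join(text.lower().split())

def quote_is_grounded(quote: str, transcript: str,
                      threshold: float = 0.85) -> bool:
    """True if the quote matches some span of the transcript closely."""
    q, t = normalize(quote), normalize(transcript)
    if q in t:                      # exact match after normalization
        return True
    # Fuzzy fallback: longest matching block relative to quote length.
    match = SequenceMatcher(None, q, t).find_longest_match(
        0, len(q), 0, len(t))
    return match.size / max(len(q), 1) >= threshold
```

A check like this turns "show your work" into something machine-enforceable: insights whose quotes fail the check can be dropped or flagged for review instead of being presented as fact.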
Max Matson 31:39
Yeah, no, absolutely. So, pivoting slightly: this is your first venture, right? Like, your first startup founded? Wow. So as a co-founder, what has that journey been like for you? Are you enjoying founding a company?
Pierce Healy 31:55
Yeah, I mean, 100%. I think for me, especially as CEO, it's this opportunity to put all of your tools to work, right? You're not just doing one thing, you're doing 100 things, and that for me is the most energizing thing. And also, I think it's the cleanest competition out there, right? You're competing with everyone else, and at the end of the day, whoever builds the most customer-focused, revenue-generating company wins. You could say there are advantages that some people have, but at the end of the day it's a meritocracy, and for all of us, that's energizing. So yeah, I've loved it.
Max Matson 32:39
Good to hear, good to hear. And you have a co-founder, yeah? Would you mind talking about what your process was for bringing on a co-founder?
Pierce Healy 32:47
There's actually three of us, three members. And we've all known each other a long time. We're from Ireland originally, and we each came to the US through different paths over the years. I was the center point of the three: I recruited Connor and Michael, the two best guys I knew, for basically the two roles that I needed to fill. I get the most energy from working on product, and, left to my own devices, that's where I'd spend my time rather than on all the other CEO activities. And it was important for us from the early stages, I felt, to build a sales-led motion, right? One of our co-founders, Connor, was previously leading sales for a startup, a customer insights product, so similar to Zelta, and took it from zero to 20 million ARR, so the full spectrum. And then our CTO has a heavy data and ML background; a key thing for our product is that we have quite a lot of complexity with parallelization across these datasets, and he's worked at extremely large scale before. So we split up responsibilities pretty clearly in that regard. And one of the things I would say we've definitely learned, and would be my advice to someone now starting a company, is that roles and responsibilities in a founding team are very important. We were in a nice position in that we each kind of had our own lane, but it's particularly important, if there's overlap, that everyone knows which part they own, because that's essential to making decisions.
Max Matson 34:23
No, absolutely, I love that, right? It's kind of augmenting what you're best at with people that you trust, who you know can handle the other things when you need to just go into your product hole and really focus in, right? Yep. Yeah, so let's talk a little bit about GTM, right? You guys have a really interesting ICP. Would you consider yourself sales-led?
Pierce Healy 34:51
Yeah, I mean, for now at least. Obviously, early days, we're very founder-led on sales. Our go-to-market is pretty much all outbound at the moment. But we actually recently joined the Gong marketplace, which has been a surprising channel already. In just a couple of weeks, from the referrals we've gotten, we've actually had two customers come through, which is zero acquisition cost from our standpoint, right? And there are probably 20 other marketplaces like that where we think we'll get similar results. Basically, the synergy there is that we're extending those systems of record to other teams within the company. So Gong within sales; on support, Zendesk and Intercom, same scenario. We're basically taking the datasets that those products collect and using them for product and marketing teams.
Max Matson 35:47
Very cool, very cool. That's a great example, right? So if they're coming in through Gong, would they maybe be more on the sales and marketing side, then, in that case?
Pierce Healy 36:00
Typically speaking, it'll actually be product people within companies that have Gong, who are frustrated by the lack of product analytics in Gong for their needs. Gotcha. And then if you go into the Gong Help Center and search for product analytics, you get brought to the page where they come to us, so
Max Matson 36:19
Yeah, so that's kind of your ideal customer there, right?
Pierce Healy 36:21
Yeah, exactly. They're pre-qualified coming in, right? So yeah,
Max Matson 36:27
Yeah. It's amazing what partnerships can do, especially, I think, in those early days as you're transitioning from founder-led sales into a more sustainable sales motion, right? What have your biggest challenges been thus far? I know that you're pretty young still, but just to this point.
Pierce Healy 36:46
Yeah, I mean, technical challenges, I would say, are the biggest ones; we hit them every single day and we're continuing to hit them. I think there's a massive misinterpretation, really an overestimation, of what you can do out of the box with models. Within the app layer for LLMs, there's a spectrum of technical difficulty by use case, and it's not equivalent. On the one hand, you have pure wrapper companies who are passing an input to the API and getting a response. And on the other end, you have scenarios where you're taking data, doing some analysis on it, and using that to drive workflows. And even within that use-case bucket, there's a spectrum of difficulty. We're definitely on the outer end of that for what we're doing. So yeah, constant technical challenges, which are somewhat purely technical in nature and somewhat a question of how we should build this for the use case that we're trying to solve for. We spend, I'd say, all of our time as a team brainstorming on how to deal with those constraints and come up with something that works.
Max Matson 38:00
Gotcha, gotcha. If you were to give some advice, given your experiences with the technical challenges that you've faced and overcome in order to build Zelta, what would your advice be for new entrepreneurs who are hoping to enter the AI space?
Pierce Healy 38:16
Yeah, I mean, it's probably pretty standard advice, but it's to take the MVP route, because you can spend a lot of time building a really scalable system for a problem that doesn't exist. Certainly, in our case, we started with something that worked for a very small amount of data with a very constrained kind of output, and we've just iterated from there. I think the same will be true of any software category, but definitely here, because there's a large difference between the applications you can build at MVP capacity versus what you might build in production.
Max Matson 38:58
Absolutely, absolutely. And what would be some guidelines that you would set for when somebody should know that they've reached that MVP?
Pierce Healy 39:07
Yeah, I mean, we have, from the get-go, really pushed to charge our customers. I think there's a lot more pressure generally in the market today to have revenue-generating products. And, you know, there are so many products that I use that, if they ever charged me, I would stop using them. You can kick the can down the road too far on that point and end up in a scenario where you're not in a position to pivot. So for us, really up front, we were very proactive about making sure that people were actually going to pay for this thing that we're building, even if it was duct-taped together and not fully working. So that's definitely something I would say we put priority on, at least.
Max Matson 39:53
Right. Do you view your early customers as partners in that sense, then? Because I definitely think that in that model there needs to be buy-in on both sides, right? So A, this is a valuable product, you see the value, so you have to be willing to pay for it. And B, as the founding team working on this product you've entrusted us with, we're going to take your feedback into consideration as we build it. Is that kind of the way that you guys see some of those early relationships? Yeah, 100%. And,
Pierce Healy 40:23
And they all do have a very direct inbox to headquarters, but I think that's the benefit we can give them for paying the earliest: you're going to get a say in how this gets built, right? And what you're looking for as a startup is the early adopters, right? The people out there in the world who have this need so painfully that they're actually willing to buy something that maybe isn't fully fleshed out yet.
Max Matson 40:51
Yeah, absolutely. I think a lot of product teams probably have this need, even if they don't realize it, right? Yeah, 100%.
Pierce Healy 40:59
I'll say, actually, something interesting we've seen within the early adopter pool, the people it resonated the most with: it's probably actually vertical SaaS, and there's some interesting nuance to it. If you're building a product that gets sold to other software companies, it's pretty likely that your product managers and engineers are going to understand the need, or the product, like a CRM; it's pretty simple, right? You can get onboarded pretty quickly. If instead you're selling something like construction SaaS, something niche and nuanced in its area, it's very unlikely that the PMs you hire from other software companies are going to have any intuition for what the needs are. Within a company like that, you have a product that is probably not intuitively understood by the product and engineering department, so there's a lot more emphasis on customer feedback, because they need that data to make decisions; they can't just sit in a room and brainstorm about what they would do. So that's probably the area of the market where we've found the most burning need.
Max Matson 42:04
Makes sense, makes sense, right? It's: how close are the people building the product to the need? If you're an engineer and you're serving other engineers, that's a pretty easy answer, right? But if you're an engineer and you're serving construction managers, it's a very different one. Yeah, exactly. So looking ahead, where do you see AI, I'll say in general, moving, and specifically where do you see product management tools moving?
Pierce Healy 42:36
Yeah, so our ambition from the start, and I guess we still think it's a potential future for us: product managers and people building products are always told, you know, speak more to your customers, spend more time talking to customers, right? For a lot of reasons, that is a difficult, non-scalable task, and doing it properly is a luxury most companies just don't have. So our ambition was to basically give you the same insight you would get talking to all those customers without necessarily having to be on the calls. And the root of that was taking the data that already exists, all the touch points that already exist, and giving you an answer. Today, we're doing that with this insights dashboard. But a possible future for us is that we could almost synthetically replicate your personas in an AI format. So you're working on some new feature, and instead of lining up three interviews to talk to somebody about it, let's have the AI pretend to be that persona and see how they might respond, given all the context we have on them. That's something we still think is a future for us, and something that we will actually build. The starting point to get there is basically the data on your customers, the context. So once we have enough data from our customers, we think we can basically allow a PM to have a conversation in Slack with Zelta as their customer. So you add Zelta to your Slack channel, and you're basically adding whatever customer persona you're interested in into that channel.
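The synthetic-persona idea Pierce describes can be sketched as assembling collected customer context into a role-play system prompt for an LLM. This is a hypothetical illustration of the concept, not Zelta's actual implementation; the function and field names are invented.

```python
# Hypothetical sketch: turn real customer quotes into a system prompt
# so an LLM can role-play that customer persona in a chat channel.

def build_persona_prompt(persona: str, quotes: list[str]) -> str:
    """Assemble a role-play system prompt grounded in real customer quotes."""
    evidence = "\n".join(f"- {q}" for q in quotes)
    return (
        f"You are role-playing a customer persona: {persona}.\n"
        "Answer product questions the way this customer would, staying "
        "consistent with these real statements they have made:\n"
        f"{evidence}\n"
        "If the statements don't cover a topic, say you're unsure rather "
        "than inventing a preference."
    )


prompt = build_persona_prompt(
    "construction-site project manager",
    ["I only check dashboards on my phone, in the truck.",
     "Anything that needs a training session is dead on arrival."],
)
print(prompt)
```

The string returned here would be passed as the system message to whatever chat model backs the Slack bot; grounding the prompt in verbatim quotes is what keeps the simulated persona tied to real customer data.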
Max Matson 44:13
That's fantastic. I mean, I could think of 1,000 reasons that would be awesome, right? That's amazing. You guys really are leveraging LLMs to their strengths, right? It's something I've put into practice even just with ChatGPT; I almost think it's a prerequisite to say, hey, you are this, right? It helps the model differentiate and narrow its focus down quite a bit. Yep, for sure. Yeah. And, you know, feel free to answer this to whatever level you want, but do you have any takes on the state of AI and the tech industry today?
Pierce Healy 44:47
We see a lot of herd mentality, right? Like, one firm can come out with a thought piece and then all of a sudden everybody thinks that's the case, right? Then later, something new comes out. The last six months in AI, I've seen that taken to the extreme. If you went back five months ago, a lot of the emphasis was on the infrastructure layer, as they call it. So obviously the models themselves, and then the tooling around the models, the standard picks-and-shovels thesis, right? And we were looking at that and it seemed like just an obviously wrong position. Because, quite obviously, the models are going to get commoditized, right? Even if OpenAI was the leader, it was only a matter of time before we'd clearly see others coming up after them. Think about when ChatGPT came out; it was like the world changed. GPT-4, for me, is maybe 1,000 times better than GPT-3.5, and far fewer people cared when it came out. Will anyone care about GPT-5?
I'd say literally no, right? Because we're tailing off into marginal benefit. I was driving and had an interesting chat with somebody on this, and there was a good example of 4K televisions, right? If you have a 4K TV right now and an 8K one comes out, is anybody that interested in upgrading? What you have is already pretty good. I think we've actually already reached that point today, where for most business tasks these models are pretty much already good enough, pretty much at human level. So them getting smarter isn't going to actually make much more of a difference. For all the niche use cases it has a bigger impact, but for run-of-the-mill applications, what we have today basically gets the job done. So it feels like we're already at the point where these models are going to compete on price, not on value, and that will be where things go in the future. I think, additionally, with the dev tools that have come out, first of all, we were not using any of them. We would try them all and be like, I don't know why we're supposed to be the customer for these dev tools; we're not really seeing why we need them versus things we already have. Yet they were all getting funded, and we were scratching our heads about it. Only over the last month or so has that narrative started to change, with people saying, actually, the infrastructure bit looks like it's either being commoditized or just not that interesting, and a lot of the real value creation from this is at the application layer. In six months' time, I expect that to be the dominant narrative, and the interesting companies to all be at the application layer. I don't know whether that's a hot take or not, but it's what we see.
Max Matson 47:56
That's great. I would definitely agree with you on that, right? I think that's pretty funny; I wonder if some VC letters are going to come out reiterating what you just said pretty soon.
Pierce Healy 48:08
Yeah, I mean, it's definitely what we're seeing. From my perspective, the opportunity to create value with this is if you own the customer and you own that workflow. The stuff that's happening under the hood, the infrastructure, the person who's paying for your product does not care about, right? Right. So
Max Matson 48:31
It's: what value have you created for me, right? I don't care how flashy the tech is, how new it is, how fancy it is. I actually think that's one of the big differentiating factors between AI and crypto, right, which from just a media perspective have very similar trajectories. But I do think that where AI outpaces it is in its actual usefulness and applicability to all these different use cases.
Pierce Healy 48:55
I mean, it's so hilarious that they even draw the comparison, right? We're basically just saying, here's two trends, and just because they happen to be trends, we're going to say that they might have similar paths, right? Crypto was like, we're going to build a new universe. AI is like, we're just going to make things cheaper, faster, more simple. Real benefits.
Max Matson 49:24
Very, very different investments, I would say. Well, Pierce, this has been so much fun. I really appreciate you joining me. Where can people find you to follow you?
Pierce Healy 49:34
Yeah, I mean, zelta.ai is our website, and you can also get me on LinkedIn and Twitter. And yeah, we would definitely love to hear from any product teams who feel like they have this problem.
Max Matson 49:44
Perfect. That's zelta.ai, check them out. Thanks, Pierce.