May 10, 2026

#597 Dashboards Are Dead. AI Agents Replace the Forecast Call | Laura Fu, GTM Architect at DevRev


In this episode of The CTO Show with Mehmet, Mehmet sits down with Laura Fu, GTM Architect at DevRev. Laura brings a RevOps and sales enablement lens to a realization many GTM leaders are now facing: AI does not fix sales by sitting on top of old workflows.

The conversation reframes AI in go-to-market as an operating model problem, not a tooling problem. Laura argues that AI-native execution requires new feedback loops, better data capture, agent-readable systems, and a different view of enablement. The strongest claim is that dashboards and forecast calls become less central when agents can surface the signal directly.

If you are leading, building, or investing in enterprise sales organizations, this conversation gives you a sharper way to think about AI-native GTM, CRM architecture, RevOps, sales enablement, and pipeline execution.

About the Guest

Laura Fu is the GTM Architect at DevRev, focused on improving go-to-market efficiency and how revenue organizations operate with AI.

She is the author of Designing for Excellence: Sales Enablement in the AI Native World, a book about using AI to make sales enablement and GTM engines more fluid and operational.

Laura is the right person to frame this shift because she connects sales enablement, RevOps, CRM systems, data quality, and AI agents into one operating model.

LinkedIn: https://www.linkedin.com/in/laurazfu/

Key Takeaways

  • AI does not make broken sales processes better; it exposes where the process was weak.
  • Sales teams still move at human speed, but expectations now move at AI speed.
  • AI-native GTM requires workflow redesign, not summaries copied into old systems.
  • Traditional enablement fails when training is disconnected from the moment of need.
  • CRM becomes more valuable as memory and context, not as a manual reporting database.
  • Dashboards lose power when agents can detect revenue signals directly.
  • Poor data quality breaks trust in AI faster than poor user adoption.
  • RevOps teams will shift from analysts to GTM engineers who build and orchestrate systems.

What You Will Learn

  • The difference between AI adoption and AI-native sales execution.
  • How AI changes sales enablement from a training function into an operating system.
  • Why dashboards become less useful when agents can scan signals directly.
  • The CRM requirements that matter when agents need read and write access.
  • How real-time feedback loops can reshape sales messaging, pricing, and positioning.
  • Why data quality and change management decide whether AI tools get trusted.
  • What an AI-first revenue organization could look like from day one.

Episode Highlights

00:00 — Laura Fu frames AI-native sales enablement

02:30 — Sales teams face AI-speed expectations

06:00 — AI adoption does not change execution

09:30 — Traditional enablement was already broken

12:00 — Enablement becomes a system, not a function

15:30 — The AI enablement flywheel takes shape

20:30 — Change management breaks AI adoption first

25:00 — Feedback loops separate messaging from delivery

28:00 — Pipeline creation remains the strongest signal

30:00 — Dashboards are dead in agent-led RevOps

36:30 — AI finds pipeline signals faster

39:00 — GTM engineers replace analyst-heavy RevOps

42:30 — Laura shares the book and podcast


Listen Now

Available on all major podcast platforms and YouTube.

Connect with the Show

Follow The CTO Show with Mehmet for more conversations at the intersection of technology, startups, and venture capital.

 

Mehmet: [00:00:00] Hello, and welcome back to a new episode of The CTO Show with Mehmet. Today, I'm very pleased, joining me, Laura Fu. She's the GTM architect, uh, GTM architect at DevRev, and also she's the author of the Designing For Excellence book. As you can guess, we're gonna talk about GTM, go-to-market, and in the age of AI, a lot of things are changing.

So without further ado, Laura, this I think I do with all my guests, I leave it to them to introduce themselves. So tell us a bit more about you, your background, your journey, and then we can start the discussion from there. So the floor is yours.

Laura: Sure. Thank you so much, Mehmet, for having me on the show.

Um, I'm the go-to-market architect at DevRev, and so I am just extremely curious about, um, how we can improve the efficiency of the go-to-market organization and what are things we can deploy to make sure that we're operating at maximum efficiency. I've, um, been leading RevOps teams for a number of years now.

Um, and I'm also now the [00:01:00] author of Designing for Excellence: Sales Enablement in the AI Native World, which talks about how we can incorporate AI into helping us, um, make that go-to-market engine even more efficient and more fluid. Um, yeah, that's it. 

Mehmet: Great, and thank you again, Laura, for being here with me today.

Now, let me start where you finished. Now, I know you've led GTM and enablement across multiple high growth companies before, and things are kind of changing today. Mm-hmm. So what is fundamentally breaking right now in sales organizations, in your opinion, that most leaders still are not understanding or maybe they are underestimating?

Laura: Yeah. Um, I think the first thing is that I would say generally speaking, it is harder to become a seller today, um, because buyers themselves are way more informed, and they kind of know what they [00:02:00] want. Um, there is a trust recession. I think I heard that term in your last episode. Yes. Um, and so it's been, uh, difficult, I think, for sellers to, um, gain that trust from the consumers.

The, the consumers are a lot more skept- skeptical. Okay. Um, and then the second thing, um, just like very relevant right now today, is that we have been thinking that AI is gonna come and solve all our problems, and, you know, it's like a button that we press or a switch that we turn on. Um, and that once we deploy that into our go-to-market organization, it will miraculously, uh, increase the productivity of, of the organization.

However, one thing that we do get wrong is, um, you know, we kinda have to just change the way we're doing things, not just deploy it in the same way, and that's why it's not working. 

Mehmet: Right. Laura, now- In your book, you argue that sales teams are moving at the speed of AI. 

Laura: Mm-hmm. 

Mehmet: Um, so [00:03:00] how that looks in practice for if we're talking in pre-series A or B, probably they don't have l- yet a CRO, but probably maybe they have someone who's taking care of this.

Maybe in, like, growth stage they will have a CRO. So how, or, you know, s- so how things are changing, uh, actually for CROs or Head, Head of Sales when they are preparing for their, maybe on Sunday evening, maybe on Monday morning. What is the shift that's happening? 

Laura: Yeah. So, you know, even though I said, um, sales teams are operating at the speed of AI, I think what, what, what I meant was, um, the expectation of them is changing at the speed of AI.

But, you know, sales teams and humans can only move at the speed of humans, even though we deploy the AI. So I just wanted to clarify that, um, because going back to my point, just because we deploy something doesn't mean it, it automatically, like, [00:04:00] accelerates and moves faster. Um, as a go-to-market leader, I think couple of things that, that change is the ability to consume information about everything in an instant, and being able to ask those questions, um, right at your fingertips.

So for example, you know, where we used to wait for the forecast call and do deal reviews, we no longer have to do that because, um, in the past, we used to have to have everybody on the, on the phone, and we needed the, the person who was really into the deal to just tell us all of the context about the deal.

We don't have to do that anymore. If you have a question about it, all we just have to do is, is ask, ask the AI because it has all of the context, it has all of the notes, it has everything that has transpired, uh, between, you know, the first time we, we did an outreach to the current status. You can ask all those questions and identify the gaps yourself.

So being able to have all that access to the information I think is amazing. That's one. The second thing is also being able to have [00:05:00] access into the levers that are currently working, um, or also not working for you. Um, you can do that without needing a team, an army of RevOps analysts to, um, do all that data analysis for you.

Mehmet: Do you think, Laura, based on what you just mentioned, this is where actually the gap is between AI adoption and AI native execution, where most companies fail?

Laura: Ask me that question again. 

Mehmet: So based on what you just mentioned now, right? Okay. 

Laura: Mm-hmm. 

Mehmet: A- a- and I'm talking about the process, getting on the calls and, and all these things and having the analysts. Do you think, like- If, if I want to summarize, do you think that the gap between AI adoption and AI native execution falls here?

Is this where most companies are, are failing?

Laura: Um, no, I don't think so. [00:06:00] I think everybody right now has the ability to summarize, and they know how to use it in that way. Okay. Where I think most companies are failing in AI adoption is being able to actually deploy it in a way that, um, is fundamentally changing the, the process itself of doing sales and being able to tap into the salesperson and the human to be able to execute on that.

So let me give you an example of what- Sure ... that might look like. Okay. So for example, um, we all know that, um, it's very important to follow up with a customer after a meeting, right? Okay. 

Mehmet: Mm-hmm. 

Laura: So, um, if we just follow up straight away, um, that's what we were doing before. We'd send an email and thank you for the conversation.

Here are a few summary points. Okay. Using AI to do that and l- and just layering it on would just be saying, [00:07:00] "AI, just summarize it for me." I copy and paste that, and I put that in an email, or even the AI can send it for you. That's just- Mm-hmm ... um, that doesn't even require adoption. That just requires some kind of like orchestration and setup, right?

And the AI can do it for you. Okay. Where, um, in, in this process, where I think, um, we could think about an AI native process is in the way that you are suggesting you as the AI... Sorry, the AI is suggesting to the, the rep, "Here are some things that you might want to change about your use cases that you suggested based on what we heard."

Super highly customized, and it generates also the, not just the email summary, but the new decks, the new use cases, a demo flow. Um, all of those things related to how are you gonna change the conversation [00:08:00] afterwards. Um, and then having pre-presenting that to the, presenting that to the rep, but also asking the rep, "What do you think?"

And asking for their opinion around whether or not this sharpening is working. So I think, like, when we talk about AI native, it actually needs to make the humans better at their job, not do the human work for them. So in the first example I gave you where the AI is just summarizing and then sending it off, it's not making the human any better, right?

But in the second example, it is actually challenging the human to say, "Hey-" Uh, things could be different. Things could be better and different. I wanna ask for your opinion. Do you think this makes sense or should be changing- Mm-hmm ... in different ways? So I think that second example is more, um, more reflection of what AI native looks like, except that we have been thinking more about just automating tasks versus making humans better.

Mehmet: That makes a lot of sense. Now, mentioning this, we know, [00:09:00] because you mentioned the reps and, you know, I was one of them back in the days. So, um, when any new... So this is what used to happen. So a new use case, maybe a new, uh, collaboration with another company. So whenever something new comes up, we used to have, as you know, the traditional enablement.

Um, yeah, let's train people on the messaging, let them do this, and go. Now, you make a strong point that traditional enablement training sessions and onboarding banks and certification is no longer enough. 

Laura: Yeah. 

Mehmet: Like, where exactly that model break- Yeah ... in your opinion? 

Laura: Traditional enablement has, has actually never been enough.

Um, right? The way that we were thinking about those programs where we deliver it and then we don't follow up on, um, how, how the reps are actually using it, it has never been enough. And, and we've actually always been [00:10:00] trying to drive towards this model where it's just in time enablement, where, um, the reps are notified of something when it becomes relevant to them.

So for example, a new narrative. Giving it to them at the moment where they are ac- where the narrative becomes relevant, for example, in this, uh, in this, uh, uh, call example that I just gave you, right? After the call, maybe that narrative pops up and says, "Hey, did you know we have a new positioning about this?"

Mm-hmm. "Do you remember? You might want to watch this recording. Um, here are the key points about that." Um, so I'd say, I'd say that's where traditional enablement was not able to get to the point where the sales teams could actually consume it. We only absorb like 10% of what we hear, and so the best way to do it is, is when we're in the moment, and when we can find it useful, then we'll use it.

And the next time we get into that situation, we're gonna be able to use it again. So I'd say that's, that's probably the, the biggest break in, um, sales enablement. [00:11:00] 

Mehmet: Great. So things are changing, and we're talking about AI native enablement now, right? Mm-hmm. So how does AI change the role of enablement from just, you know, as you mentioned, because before we used to watch videos or maybe someone, you know, practicing a demo or, or, or something similar.

So how that, you know, becomes something more operational and on a high level, of course, what does that enablement stack actually looks like in the AI enablement world? 

Laura: Yeah. So yeah, two questions there. I think the first question was- Yeah ... around, um, uh, what does AI enable us to do, um, in this new sales enablement- 

Mehmet: Mm-hmm.

Right ... 

Laura: world. Um, the key difference that it, that it, I think unlocks is, um, that AI really enables enablement now to become a system versus just one function in the [00:12:00] organization. And even if we had organizations in the past that were, uh, many of them were working towards enabling the sales organization.

Like for example, we had product marketing doing something. We had, um, you know, maybe rev ops doing something. We had the sales leaders doing their own enablement. Even if in the past a lot of them were trying to do the same thing and, you know, help all the reps, today AI actually helps us group all of that context together and say, "Look, let's not do it in silos.

I've al- I have the context for what the leaders are trying to train this rep on specifically, and so let me grab what product marketing is doing and deliver it to the rep themselves." It can be tailored and it can be part of a system now that it has access to all of the content and is able to consume this data at, at just an incredible amount of scale.

Okay. So I think, um, it solves for the system itself. Um, and then the second is you asked, like what is the tech stack that is- 

Mehmet: Right ... [00:13:00] 

Laura: that is necessary for, for go-to-market ena- enablement? And here we're not talking, I think, about, um, sales. We're not talk- just talking about the enablement tech stack, but we're talking about the sales tech stack and what do go-to-market teams need.

Okay. So it's a complicated question for me because I think that, um, the main thing that you want is, uh, context and memory capture. Mm-hmm. If you have that capture, you can kind of do anything on top of it. You can deploy Cl- Claude on it. You can deploy, um, you know, all kinds of models and agents on top of it.

So I would say the capture of the information and the, the capture of the information on one side, and then the storage and the recall and the memory of that is really important. So n- the number one thing would be do you have a, do you have a call recorder that, that summarizes and is, uh, has the ability to extract the information for you?

Right? [00:14:00] Whether the call recorder itself has, has AI native, um, features itself is not, not as important as is it bringing the data in the right way that can then be used by the LLM. That's one. The second one is, um, around the CRM. Is your CRM able to get read and write updates from agents, or is it, uh, or is it closed box, right?

Do you, do, does the rep still have to go in and, you know, click those buttons? 

Mehmet: Mm-hmm. 

Laura: I- if, if that's the case, then, then it's, it's not going to be AI native. Okay, so CRM ca- uh, like call recording or, or sort of note takers, uh, type of thing, right? And then the third one- Yeah ... is the, um, is the engagement and the outreach.

Now, I think that that engagement and outreach can actually be built on top of your, on top of, you know, an existing sort of memory, um, and memory context. But is there, um, if you, if you don't wanna build your own and you wanna buy something, I would say is there something [00:15:00] that can, um, grab all of the signals, not just from your CRM, but also from intent, from enrichment, from, um, all of those things, and, uh, in- ingest your own salespeople's personas, their preferences, as well as the company narrative on all that, and help you craft those sequencing and messaging?

Now, that's a little bit harder to, I think, achieve, but I'd say those three things are probably the, the most important things for our sales stack today. 
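
Laura's three stack requirements, context and memory capture, a CRM that agents can read and write, and signal-driven outreach, can be sketched as a minimal interface. This is an illustrative sketch only; every class, method, and field name here is a hypothetical assumption, not any vendor's actual schema or API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three stack requirements discussed above.
# All names are illustrative, not a real product's data model.

@dataclass
class Deal:
    """A CRM record acting as shared memory an agent can read and write."""
    name: str
    stage: str = "prospecting"
    notes: list = field(default_factory=list)    # captured call context
    signals: list = field(default_factory=list)  # intent/enrichment signals

class CRM:
    def __init__(self):
        self.deals = {}

    def capture_call(self, deal_name, summary):
        # Requirement 1: call capture lands in shared memory, not a silo.
        self.deals.setdefault(deal_name, Deal(deal_name)).notes.append(summary)

    def agent_update(self, deal_name, stage=None, signal=None):
        # Requirement 2: agents get write access -- no manual clicking.
        deal = self.deals[deal_name]
        if stage:
            deal.stage = stage
        if signal:
            deal.signals.append(signal)

    def context(self, deal_name):
        # Requirement 3: the full deal context is queryable in one call,
        # ready to hand to an LLM or an outreach sequencer.
        deal = self.deals[deal_name]
        return {"stage": deal.stage, "notes": deal.notes, "signals": deal.signals}

crm = CRM()
crm.capture_call("Acme", "Discussed pricing; CFO wants usage-based terms.")
crm.agent_update("Acme", stage="negotiation", signal="pricing-page revisits up")
print(crm.context("Acme")["stage"])  # negotiation
```

The design point is the one Laura makes: the value is in the capture and the memory, and once that layer exists, any model or agent can be deployed on top of it.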

Mehmet: I would call that the ultimate, uh, goal that we want to reach to- Yeah, exactly ... build our, our... Yeah. 

Laura: Yeah. 

Mehmet: So you talk, Laura, also about, um, the AI enablement flywheel, and- Mm

you know, wa- walk me through the components of that. Like, and, um, how do they connect? 

Laura: Yes. So it is, um, content, programs, analytics are the three, uh, are the three, um, [00:16:00] are, are the three pillars of the flywheel, and at the center we've got analytics and basically the orchestration, right? Um, so content is about, um, what is going into, uh, delivering this to our, uh- customers.

Mm-hmm. And the, the, the basic problem of content before was that it's always outdated, and we're always on the back foot because it's not responding to the market quick enough. Today, in an AI native world, for example, at DevRev, we've changed our messaging three or four times just in the last, you know, 12 months.

Um, and if we were going off of a quarterly schedule, for example, where we're giving, you know, the reps, "Hey, here's new, n- a new narrative," we would be too late. Okay. So content itself has to be refreshed and updated based on what the market and the customer and the rep needs in the moment itself. So there needs to be a way to, [00:17:00] um, to gather, gather all those things and also generate that content.

That's one. Go ahead. Did you have a question? 

Mehmet: No, no, no. Go ahead, go ahead.

Laura: Yeah. Okay. And then the second part is the programs. The, so programs are around making sure that, um, the way that the content is delivered to the rep is not just single-threaded, and it's not just, "Hey, watch a recording." We kind of talked about this earlier, right?

It's not just a recording- Right ... just in time, but it's also that whatever you're doing, whether you're touching the rep in as a leadership conversation, um, in a company all hands, in a go-to-market all hands, in a sales enablement, uh, weekly meeting, that all of these things are connecting the dots on the content that you're delivering.

Okay. And again, AI can help us do that because we have access to all this information, and it can actually give us programmatic themes around here are the things that we need to be, that we need to be delivering, okay, and at what time. So that's the program. [00:18:00] And then the third one is, is analytics and how we measure that, and being able to change that really quickly.

So today, we don't even actually need, um, I mean, of course, we have lagging indicators and reports that come afterwards, but we don't have to do that at the end of the quarter and say, "Is the message working?" We can measure that straight away on a call. What did the customer say when you said this, when you delivered this- Mm-hmm

this new po- positioning? When you talked about the new pricing, were customers confused? Um, was it the way that you delivered it because, you know, you weren't, y- you didn't practice? Um, and those analytics being available and useful in the real time. And then the last part is, is really, you know, how does the...

how are you deploying it? And a lot of this is just agents. Agents and orchestrating, pulling all this data so that the rep doesn't have to do it, um, so that they don't have to go in. They don't have to update the CRM, right? If, if the, the AI should be asking the rep, "What do you want to do about this?" They should be [00:19:00] asking for their opinion rather than, um, the execution.

Mehmet: I think that's make a lots of sense because, you know, uh, I- I'm part from, uh, the, I'm part of a, um, WhatsApp group of founders and, you know- Mm ... someone was asking a question about CRM specifically. 

Laura: Uh-huh. 

Mehmet: And this, a- and you know, they were saying like, "Okay, what's the most important thing that would show me, um, if the mo- if the deal is moving forward," right?

And they said like, it depends on w- how do you define this, right? Like- 

Laura: Right ... 

Mehmet: how, you know, and based on what you're saying, like you need to have this multiple layers, uh, that you just put them, which we call them like the, the dots that, or the, the, the, you know, the component that you connect as dots between each others.

Now, if one of them fails, that might break the flywheel, I think, Laura, because, um, and in your opinion, which one is the weakest [00:20:00] one? Like, is it the people? Like, because you know, the, the, the reps are, are, are not getting it, or is it because of the data? You mentioned analytics. Or is it because the process itself?

And this is why I'm, I gave you the example of this WhatsApp group. Mm. Because if we don't know how to define how things are moving forward, we're gonna be lost. Like, whether we have like the best, uh, breed LLM and best in breed agentic framework for our, uh, uh, RevOps. Do you agree with me? Like, I mean, what's the most important thing here?

Or what, what breaks also this, um, this flywheel? 

Laura: Two things break. The first thing that breaks is we didn't have a process in place for how we were gonna manage the change in the first place. 

Mehmet: Mm-hmm. 

Laura: So for example, we didn't say, "Hey, what are the outcomes that we're trying to do, and how d- how are we gonna, like, actually insert this into the reps?"

Most of the time when, when we buy [00:21:00] technology, and not, not even AI technology, but most of the time when we buy technology, we don't have a good change management process in place. Mm-hmm. The implementation itself is failing. So this has actually nothing to do with AI. It's just more around, like, change management.

Do we have a good change management in place? If we didn't, if we didn't, and we still expect it, like we have this amazing AI that, you know, is agentic in nature, can do all the execution, but then we just ask it to, oh, summarize, and then make the rep copy the information and put it into the CRM. If we're asking them to do that, then the adoption is gonna fail, right?

Bec- because we didn't think about it end to end. So I think that's, that's one way where that adoption fails. The second one is where we did not take care of the data, and this is more specific to AI. Like we didn't take care of the data and how we're aggregating that, and how we are cleaning it, and how we are transforming it for the LLM to use.

And when we do that, [00:22:00] then the AI starts producing answers and information that are wrong, and people have a very low tolerance for trust these days, right?

Mehmet: True. 

Laura: So as soon as they say, "Well, it's not working, it's not working. It, it doesn't give me the right information," then they'll stop using it.

Mehmet: True.

Laura: Yeah.

Mehmet: Laura, you just mentioned something about the, the feedback, right? Mm. And now... A- and this is something, you know, really important because I, I, I've lived it also at, at, at some, uh, part of my career. Um, how we can, uh, like, kind of operationalize, if the word is right, you know, this feedback loop between sales, products, and, and enablement?

Because you... I've seen, like, time and time again, when we come up with a new messaging or, for example, with a new [00:23:00] pricing, or maybe we want to change probably the, the, the bus- not the business model, but I mean the way how actually we want to do business with these customers. I'm talking in the B2B space, right?

Um, so people have hypotheses. I can understand that. So they say, "Okay, let's say if we add this feature, this how we think, you know, customers will react. Or if we go with this pricing model instead of this pricing model," and of course you have seen a lot now, you know, especially SaaS companies going from seat-based to, um, you know, like, uh- Mm

actual performance-based, uh, licensing or outcome- Mm ... based licensing. Uh, sometimes it's the messaging itself. The company is known for X, now the company will say, "Hey, you know what? We need to position ourself actually as the AI enabled X," or something like this. But it doesn't work all the time, [00:24:00] right? And you said, you just mentioned about the feedback, so how, how this from operations perspective, especially for people who are in enablement, right, um, would look like?

Like, and, and what, what's the role of real time data in shaping, you know, all these, whether it's with, with training the reps or getting this feedback loop and changing if, uh, if needed with that messaging, pricing, whatever that is? 

Laura: Yeah. That's a really interesting question, and I think that, um, it depends on the kind of company that you're in and where your product, uh, where your product lives.

Like, is it- Do, are, are we a name brand company or are we, uh, introducing new things into the marketplace all the time? Like, for example, AI today is still relatively new, so some of these concepts we might actually be educating a lot of the, the, the buyers on. And in that case, there is a curve for adoption, right?

So, um, [00:25:00] uh, it may be that the message is not landing because the people that we're talking to are not yet there in their maturities. And, and, and so I think there is a, there is a spectrum first of where customers fall and how your message lands. Um, there are two components actually for f- for feedback loop that I tend to, to look at.

One of them is, um, around are we, are we in the process of educating? And is the customer... They may not agree, but are they, are they clarifying something? Are we... Sorry, are we clarifying something for the customer? If we are being able to clarify something and it is, um, it is not causing confusion, not, not in the way that it's being delivered.

I'm, I'm just talking about the, the messaging itself. If it makes sense but they just don't agree, okay, then it, then I think we are landing [00:26:00] on the education side of it, where we're saying, "Hey, this is a new thing in the market. You may not agree, but I am being clear on what the position of my company is and the product and the new feature, and I'm being clear."

And maybe we ne- we need to spend more time establishing the problem first before even talking about, you know, the feature itself. That's on one side of it. Sorry, did you, did you have a question?

Mehmet: Okay. Go ahead.

Laura: So then the next side of it was, the next side of it was, um, the rep level, uh, information. The rep level, uh, presentation of the messaging itself.

Mm-hmm. That is far more qualitative, but there actually are a lot of examples where the messaging might be absolutely clear. The rep is just not able to tell that story or they're not listening, which is a lot of what happens in the, in the conversation, okay? Um, so there are two things that I tend to [00:27:00] look at.

Are we clear, are we being clear? And is the rep delivering the message as I asked? Now, the second thing is actually easier because we can, we can look at, we can look at, um, things like, okay, did they pause? How much did they talk? Like, if the customer said this, did they respond with something totally different?

I think that AI can actually do a very good job of, um, in the moment telling the rep, "Hey, slow down. You didn't address this question. You didn't do this. Like, do this now." I think that's one. The, the first part of it is, is more difficult, and I think that's something that where, this is where human agency comes in, right?

And this is where we, we make an assessment, like, is it that the messaging is not landing, or are we not being clear, or is it that the problem itself is, is not established correctly?
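
The second, easier check Laura describes, how the rep delivered the message, lends itself to simple transcript heuristics. The sketch below is a hedged illustration under stated assumptions: a call is just a list of (speaker, utterance) pairs, the talk-share threshold is an invented number, and a question counts as addressed only if the rep speaks in the very next turn. A real system would work on diarized audio with far richer signals.

```python
# Illustrative sketch of rep-delivery checks: talk share and
# customer questions not followed by a rep turn. All thresholds
# and the transcript format are assumptions, not a real product.

def delivery_checks(transcript, talk_share_limit=0.65):
    rep_words = sum(len(u.split()) for s, u in transcript if s == "rep")
    total_words = sum(len(u.split()) for _, u in transcript)
    alerts = []
    if total_words and rep_words / total_words > talk_share_limit:
        alerts.append("Slow down: you are doing most of the talking.")
    # Flag customer questions whose very next turn is not the rep answering.
    for i, (speaker, utterance) in enumerate(transcript):
        if speaker == "customer" and utterance.rstrip().endswith("?"):
            next_speakers = [s for s, _ in transcript[i + 1:i + 2]]
            if "rep" not in next_speakers:
                alerts.append(f"Unaddressed question: {utterance!r}")
    return alerts

call = [
    ("rep", "Our new pricing is outcome based, so you pay on results."),
    ("customer", "How is an outcome actually measured?"),
    ("customer", "And who audits that?"),
    ("rep", "Great question, it is audited quarterly by a third party."),
]
for alert in delivery_checks(call):
    print(alert)
```

Here the rep talks more than 65% of the time and skips the first question, so both nudges fire, the kind of in-the-moment "slow down, you didn't address this" coaching Laura describes.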

Mehmet: Right. Like, it's, it's, you know, a, a very great way of doing this deep analysis on, on, you know, figuring out what's working, what's not.

Now, I know you talk a lot [00:28:00] about metrics that actually matter, right? Mm. So, and you highlight behavioral signals as leading indicators of performance. If you want to mention, like, couple of them, you know, that you trust the most, what that would be. 

Laura: Yeah. Um, so this is a not, not AI, uh, not AI, uh, specific or related really, but I would say that the, the activity metrics that matter most to me are the inputs around, uh, the inputs around activity and top of the funnel p- pipeline creation.

And this is something that I've said in, in sort of all of my, um, you know, go-to-market interviews, which is basically that, hey, if your rep is excellent at creating pipeline, which also means they're excellent at qualifying, they're excellent at hunting for a no, they're excellent at establishing whether the problem is worth solving, right?

If they are really good at that, and they're good at finding people, that makes a great and excellent start to whether or not they're gonna be successful. Because nobody likes [00:29:00] c- creating pipeline. Nobody loves helping the rep create pipeline. Everybody loves helping a rep close deals. 

Mehmet: Right. 

Laura: Now, the second part of that question would be: what other kinds of metrics would I use today?

At DevRev, we actually have an internal AI called Computer, and we can track how much reps are using it and what they use it for. That would be my metric, too: are you actually leveraging the tools you have? Are you asking those questions? Which skills are you deploying?

Which agents are you asking to do work for you? That would probably be the third thing I'd look at today. 

Mehmet: So Laura, that means the traditional dashboards we're all used to from the old world, as I'll call it- 

Laura: Yeah ... 

Mehmet: they need to be changed, right? Like, so, so I need, if I'm a CRO or, like, a head of sales in a, in a [00:30:00] scale-up or startup, I need to be looking to a completely different kind of, of dashboards in this AI native world.

Is that right? 

Laura: Correct. I'm actually of the opinion that dashboards 

Mehmet: are dead. 

Laura: And we don't need dashboards anymore. Because what do we use dashboards for? We use them to identify trends: are we on target to reach our goals, or is something going wrong? Do we need to do something?

So we use them to identify trends and get signals that prompt action. However, what if we crossed that out and just had the agent scan for those signals and intents and tell us, "Hey, here's something you might need to worry about. Churn is trending up over the last two weeks.

This customer looks like they're about to churn, and it's going to impact your number by X percent. Go look at that." [00:31:00] If we had agents that did that for us, what would be the point of a dashboard? Any information you want, you can ask the AI for. And if you want something created on the fly, for example for a board deck, the agent can do that for you and give you the metrics that matter most right now.

So for example, if this quarter the most important thing was how many new business meetings we had, and that's what you want to talk to your board about because you committed to driving more top-of-funnel activity, then let's create a dashboard for that. The agent can create it on the fly, and you can put it in your board deck.

Next quarter, if the priority is something different, we can do that too. So hopefully we get to that point. I realize it's a ways off, okay? But, yeah. 

Mehmet: I would call it a return to sanity, right? 

Laura: Yes. 

Mehmet: And the reason I'm saying this [00:32:00] is because, if you really think about it...

I agree with you about the dashboards. People say they're real time; they're not. You're just picking up whatever data was logged the last time someone updated it. And we all know the push from frontline managers: "Hey guys, go update the CRM.

Keep the CRM hygiene," and all of this. So I think this is heading toward obsolescence, because with everything we just discussed about getting real-time data and having the agent extract what's important, we don't really need it anymore.

Now, people might think we're pushing the idea that we don't need a CRO. I'm not saying that, of course. [00:33:00] What I am saying is that everyone who works in a sales organization should be focusing, to your point, Laura, on how to generate more pipeline and how to be in front of as many customers as possible, instead of wasting time in front of dashboards and spreadsheets. And I think a lot of people would agree: for years, people had to survive this, quote, unquote, "torture" of just discussing numbers.

And let me be very frank here: we know some people make things up in the CRM, and these dashboards are built from that made-up data. So I think this is a wake-up call for people to go and actually do the things that matter.

[00:34:00] And this is where I want to ask you: you talked about pipeline, right? So what's the full story here, and how do you think AI can help reps build the pipeline coverage they need to attain their targets? 

Laura: Yeah. You mentioned something earlier in the conversation about reps needing to spend more time in the field instead of behind a computer discussing numbers and all that. 

Mehmet: Okay, right. 

Laura: Um, h- you know, a lot of, a, a lot of, I think, the best deals are won, and I'm talking more about enterprise B2B sales. I'm not talking about- Sure ... like, the transactional ones, right? A lot of the best deals are [00:35:00] won when the sales rep is spending a lot of time with their leaders, their ecosystem, discussing, "Hey, what do you think the customer needs right now?

And let's execute on that. Let's do something." They spend time actually strategizing on the deal and less time talking about the number. And I think this is where AI can really help. Instead of waiting, "I need to talk to my manager, I need my sales leader to give me guidance on what I should be doing," they have that coaching right at their fingertips.

They can ask the AI to help them. So on one hand, deal strategy can be done with a lot of AI support. You can have your AI trained on your leaders' behavior, on how your deals are won or lost within the system, all that good information. And I think it's going to be very effective for reps to [00:36:00] accelerate those deals.

But that also means they may be spending more time conversing with the computer, right? Except that instead of spending time updating fields, they're spending time actually talking about what the key points might be, what the new use cases are, what angle to go in to the customer with.

That would be one. And the other thing would be in terms of actually building that pipeline. We became smarter, I think, in the last 10 to 15 years at finding intent signals from customers, gathering information about them, enriching context, and all those things.

We already did that before the rise of AI. But where AI makes it easier and faster is that now AI can actually scan for you. I have an agent that scans this for me; it's called a low-hanging fruit scanner. What it does is it [00:37:00] goes into all of the publicly available information sources.

It finds all the signals, even from Reddit threads: what are they talking about? What are the key problems? What job postings are on LinkedIn that they're hiring for? Has a particular function decreased or increased in the last three to six months, right?

It can consume all of that information without a human having to go to each individual source, click, click, click. It consumes everything, surfaces it to me straight away, and gives me the right people with all the information I need: phone numbers, email addresses, and the talking points for how to reach out to them.

"Do you want me to craft an email, or are you good with that?" Just that ability to do it faster and more effectively, so you're not cold outreaching, you're not going in blind. And we [00:38:00] used to do this quarterly. All the reps had to do a territory tiering, right?

They had to spend two days going through that before their QBR to say, "These are my top accounts." They don't have to do that anymore. They can do it on a daily or weekly basis. They get a little signal report: here are five accounts, go. I think that just makes them more effective.

Mehmet: Yeah. It's like working smarter, not harder, right? 

Laura: Yeah. Exactly. Exactly. 

Mehmet: Absolutely. Now, Laura, let's say we decide to build a revenue organization from scratch today for a completely new company, AI first. 

Laura: Yeah. 

Mehmet: How that organization would look completely different from how we build teams today?

Laura: Sure. So my RevOps org had four pillars. One was reporting and analytics. The second [00:39:00] was operational efficiency: deal desk, a business partner, an analyst for that region. The third was enablement, and the last was systems.

So what I would do is combine one, two, and four into one function, and this function would be a go-to-market engineer, and I hate to overuse this term. But really, who we're looking for are smart people who are technology-forward and have the skills to use AI, but also to build tools, right?

They're not analysts anymore. I don't need a team of analysts, because all that's needed is the ability to ask the right questions. These same people would make sure the data going in was absolutely great, right? And they would set up the systems, the [00:40:00] integrations, the MCPs for all of those different systems, and combine it in a way that's useful for the reps.

Where I think the enablement function still has to exist is more around the coaching element, identifying the big trends, and creating the content for them. That requires a bit more subject matter expertise around the go-to-market function, I think, so I keep that.

So I would say, out of those four functions, three are combined, so it's really a two-function organization. And the scale, I think, depends on how many sellers and systems you have and what your goals are. 

Mehmet: Absolutely. You know, I have to agree with you on what you just mentioned.

And these aren't my own words. One of my guests brought it up first, I think two weeks ago; he's also a friend of mine, Rabi. He said that in the age of AI, [00:41:00] the key skill is knowing how to speak plainly in any language. If you're dealing with it in English, just express yourself in plain language that's understandable to everyone.

So exactly: to know what you want to do, what you want to achieve. Now, to your point regarding the coaching part, I think we know why this is important. Actually, I recorded an episode yesterday, and my guest corrected me and said, "Mehmet, you're actually talking about coaching and management."

Because I said, in the age of AI, as you mentioned, we can do anything, right? We have plenty of tools; I think there's nothing we haven't tried to implement AI in. But what I figured out from my own experience working with these tools, whether agent frameworks or LLMs, is that we need a [00:42:00] great level of attention to detail.

Laura: Yes. 

Mehmet: And when I describe this, if we're talking about GTM, you need this manager view, coach view, whatever you want to call the title: someone who can go and say, "Hey, you know what? Great work, but you forgot to check on this and that; it's missing here." This is where you steer the outcome in a way that's optimal for everyone.

So I 100% agree with you on this, Laura. Now, as we come close to the end, where can people find the book and get in touch with you? 

Laura: Oh, thank you so much for asking. So here's the book: Designing for Excellence: Sales Enabled in the AI Native World. It's available on Amazon, Barnes & Noble, your favorite bookstore.

You can absolutely buy the digital version or the paperback version. And you can also listen to my [00:43:00] podcast, which is also on Spotify: The State of the AI Union. 

Mehmet: Great. I'll make sure the links to the book and to your podcast are in the show notes.

Laura, I can't thank you enough for your time. I know, like I said, it's a busy time for you, and you gave me this hour out of your busy schedule, so I really appreciate that. I'm glad you made it onto the show. And this is how I end my episodes; this is for the audience.

If you just discovered us by luck, thank you for passing by, and thank you for listening or watching. Just a small favor: subscribe and share the show with as many people as possible, because we're trying to make an impact. We're trying to share knowledge, and to get people to listen to awesome guests like Laura talking about how GTM and enablement look in the AI-native world.

And if you're one of the people who keeps coming back again and [00:44:00] again to listen to the show or watch it on YouTube, thank you for doing so. One thing couldn't happen without you: having The CTO Show in the Apple top 200 podcast charts in multiple countries, all this year and last year too.

The countries change every week, but we're always somewhere on the charts. This can't happen by itself; it means people are recommending the show to others, and I really appreciate it. So, as I always say, stay tuned for a new episode very soon. Thank you. Bye-bye. 

Laura: Thank you.