That objection? It’s a good thing – with Pavel Dmitriev

In this episode of RevOps Therapy, Pavel Dmitriev, VP of Data Science at Outreach, shares data showing that XDRs who reply quickly to objections are the most successful.
Listen on Apple Podcasts
Listen on Google Podcasts
Listen on Spotify

Show notes

What if we told you that it’s a good thing for your XDR to receive a negative response? Not an unsubscribe, but a negative response.

Most XDRs chase the positive response, quickly getting back to anyone interested in a meeting. But… the data shows that XDRs who quickly reply to objections are actually the most successful.

Don’t take our word for it; you can ask Pavel Dmitriev, VP of Data Science at Outreach, who spoke with RevOps Therapist Jordan Greaser, CEO of Greaser Consulting.

Jordan  00:00

Hi, all, this is Jordan, the owner and CEO of Greaser Consulting. On this call, we've got Pavel, the VP of Data Science from Outreach. And it was fascinating. So many things in sales feel like they're subjective. One person says it's this way, and another person says it's that way. And I had an opportunity to talk to somebody who has spent the last four years with a team of data scientists, crunching the numbers on things like call data, email data, Outreach plays, different tasks that move things to meetings, or through meetings. And he's able to shed some light on areas that maybe aren't as subjective as we thought they were. You're gonna get some fascinating, maybe not statistics specifically, but just outcomes of what the data has been saying. For example, one of his big takeaways is that any response is a good response unless it's an unsubscribe. Getting somebody to write back to you, even to send you an objection, actually shows interest. And once you get that in your frame of mind, you recognize that the best reps aren't the ones that just spend a lot of time writing an email, wait for that positive response, and then they're off to the races. They're the ones that act on any response they get in their inbox, whether positive, negative, neutral, whatever, so long as it's not an unsubscribe; they're the ones that are actually winning the day. So it's not reps that write really good first emails that tend to get the best results; it's the ones that handle objections really well. And anyway, this is just one of the key points that we're pulling out of this call today. So I really look forward to you diving in and listening to him. Fascinating guy with some fascinating findings. Enjoy!

Intro Jingle

Say you want some clarity in sales and marketing and SEP? Well, we have just the remedy: our podcast, RevOps Therapy. Yeah.

Jordan  02:11

Hi, everyone, this is Jordan. And we’ve got the VP of Data Science from Outreach with us today. Why don’t you go ahead and introduce yourself? Tell us what we need to know. And we’ll jump into it today.

Pavel  02:24

Hello, everybody. My name is Pavel Dmitriev. I'm the Vice President of Data Science at Outreach. I've been there for just over four years, so I've seen it grow from a small company to a pretty big company now, and I've seen all the different ways, a lot of trial and error, of how to use AI and machine learning to improve sales.

Jordan  02:56

Well, it's funny you say it's a very different place now than at the beginning. I think you joined Outreach just when I was leaving. And we were at an interesting spot where we weren't quite this established company; we didn't know if we were going to make it. So I've got to know: what makes a data scientist decide that it would be a really good idea to join a startup sales company? Because usually, you don't think of salespeople and data scientists as, like, oh, this makes a ton of sense, let's go work together. So what was the initial draw to this company?

Pavel  03:38

Yeah, to be honest, when Manny, our CEO, messaged me on LinkedIn, my first response to it was, "I'm not interested. I don't want anything to do with those spammers who send me all this unwanted email." But, you know, I don't just leave things to intuition, so I dug in and did a little bit of research on it. And what I found was actually very interesting. First, I found, okay, this is not about spammers; actually, sales is so critical to the economy, to propagating new ideas, and to helping entrepreneurs succeed. In fact, I found I myself do so much sales-type work, and it's actually very painful when I have to email people and message people, whenever I have to bring them together for various reasons. And then I also thought that this was a very unique opportunity, especially at that time, when AI and machine learning in sales was pretty much nonexistent. And the reason for this was that no one in the past was really able to put together all of the data throughout the whole lifecycle. Everyone had pieces of data: CRMs would have some metadata, and call tools would have call data, but no one had all of the data. And Outreach was actually doing it, bringing together all of the data. That's one very important thing for a data scientist: to have data that's as complete as possible. And the second thing that I liked is that success criteria in sales are so clear. At the end of the day, you either win a deal or you lose a deal; there is no gray area; it's black or white, which is really not true in most applications of data science. For example, I used to work on Bing, the Microsoft search engine, and people come to Bing and click around, and then good luck knowing whether they actually found what they wanted or not; that's a whole area of research; the success criteria are not clear. And the data is also incomplete: people click on a website, and then you don't know what they did after that. So that's the world of most data science problems, but in sales I thought, "Wow, this is perfect. We just need to really take that data, take our objective functions, and optimize things." So that's why I joined.

Jordan  06:35

Well, that's, I think, really interesting, that you got to the conclusion that with sales, you actually have an objective outcome: you closed the deal; you didn't close the deal. So then you can start to measure against that. And the reason I think that's interesting is usually when you think about sales in general, you've heard the phrase that it's half science and half art, right? And the top salespeople tend to think that it's their art that makes them so good, right? Like, it's my ability to work the room or to have the conversation. And so sometimes when you get into the field of sales, you have this impression that it's all subjective: well, you like this, but it's just a subjective thing; he likes that, and it's a subjective thing. But actually, it's not subjective: you win the deal, or you don't win the deal. So there are a lot of actions in between that you can measure in an objective way to figure out whether it moves to close or not. So maybe it's not as art-related as you think. We'll come back to that in a second. But you walk into Outreach on day one, and you start to bump into, you know, these living, breathing salespeople. And I have to think, culturally, as it relates to data scientists in a sales world, there's just a lot to figure out there. Did you have any thoughts like, these are crazy people? Or were you immediately like, this is gonna be a lot of fun?

Pavel  08:11

Yeah, I think it was the latter. It was sort of a culture shock, but in the positive direction. Because if you think of data scientists and engineers, they all sit the whole day in front of a computer; we don't even talk to each other that much. And especially coming from bigger companies, the level of energy is sort of low; you're very much in your head, self-motivated. And then you come to the sales area, and there is so much energy; everyone is bustling with enthusiasm. There is so much confidence that salespeople have. And when I went to a sales conference, I was also shocked, because all the keynotes are about mental toughness, overcoming obstacles. And I was like, "Man, this is what I need. I have my science knowledge; I think I really need to learn that to progress and grow in my career, and frankly, in my personal life, too." So I was super excited when I got exposed to the sales culture.

Jordan  09:18

I think that makes you a little bit unique, though: you're, you know, so steeped in the science and the data of things, but at the same time, there's this sort of qualitative nature that is meaningful and that you tap into. So much so, I was sort of joking with you about this, that you're what's considered the great guru of yoga around Outreach today. You've really leaned into thinking about things a little differently. Am I off on that?

Pavel  09:53

I don't know. When I talk to other data scientists on my team, they all find it very complementary. You know, there is actually a principle in yoga, since we are talking about that, that opposite values are complementary. Like high and low: if there were no low, everything would be flat; there would be no high; high would not have any meaning. So that combination of deeply thinking about science, on the one hand, and on the other hand, people who just go with their character and their feeling, that combination can complete the whole. And that's what I feel is the application of data science in sales: it's not to replace salespeople with robots; it's that if we find the way to bring these two pieces together, we can fully utilize the benefits of science and also the uniqueness of every salesperson and how they do their work.

Jordan  11:05

So what were maybe the first one or two big things, when you started to dive into the data, where when you would talk to salespeople, the expectation was that A was going to happen, but after you looked at the data, you were like, "No, B is what occurs, not A"? Were there any early aha moments that sort of got the ball rolling?

Pavel  11:31

Yes, there was a very interesting study that we did in the very beginning, where we looked at the productivity of different salespeople, and we also looked at the results of different salespeople, and then we looked at what they actually did and whether they followed the playbooks, the instructions that, you know, their enablement team and their managers tell them to follow. And we found the very surprising thing that top performers did not actually quite follow the instructions. The specific situation was how quickly you jump on objections. Everyone knows that if you get a positive reply from a prospect, you jump on it right away; the sooner you talk to them, the better; that's known. But what about the objections? What we found is that people who jump on objections with the same urgency as on positive replies, that's what all the top performers do. But specifically in the situation that we studied, that was not the instruction; the instruction was to prioritize the positive, and on objections, you can respond some time later, but there was no need to rush. And that turned out to not be true. And that was just one example; later, we found many others where the conventional wisdom of what's best to do in sales was actually not true.

Jordan  13:17

That was Glassdoor, wasn't it? I think that was the first study, right? Like, you went in, you were looking at objections.

Pavel  13:23

Yeah, that was a study we did with Glassdoor.

Jordan  13:26

So this is kind of interesting, right? I mean, you're coming from Outreach, which in a lot of ways systematizes the playbook. I know this is outside of that, though, because this is about responses; you're automatically using something like Outreach to get a response. But in some ways, you're also saying, well, the systematized playbook wasn't actually working, and we found the answer over here. So was there anything on your mind as you were thinking about how Outreach actually enforces a playbook, how we need to be able to draw data from that really quickly so the playbook actually works? Was that part of the thought process?

Pavel  14:09

Yeah, I think it is a process. I feel that sales as an industry is going through the same process that all the other industries that have undergone this AI and machine learning transformation went through, which I think consists of three steps. First, there is a standardization step, where we have to take all of these different practices that different people are doing and standardize on something. That's what Outreach does; it captures it in sequences. That really allows predictability, but it also allows us to quickly propagate best practices: one salesperson discovers a great way to write an email; you just change the template, and everyone is using it immediately. So that standardization is the first step before starting to apply the data science. Then I think the second step is automation, which usually follows. That's, again, what Outreach did in the early days: you get the sequences, and now half of it is automatically executed; you only need to do the manual half, and everything is being tracked, and it's very nice. And then the third step that goes on top of automation is optimization. So that's where we start, to your question: looking at these different processes, these different sequences, and trying to see, is it really the best way? And to do that, we first have to measure success. That's actually one of the challenges that I found at Outreach: there are certain things about Outreach and sales in general that are pretty hard to measure, and we have to figure out how to measure them if we want to improve them. For example, emails: everyone sends so many emails, but how would you evaluate whether my email template is actually working or not? The only metrics that we had in the beginning are open counts, reply counts, click counts, click rates, and open rates. But when we looked at them, we found that none of them are actually very good. Reply rate is probably the best. But we ran, and this was another surprising study that we did, we ran an experiment, and we found that when we send more aggressive emails, we get more replies. And the more aggressive they get, the more replies come in, with the only exception that those are not great replies; those are all "unsubscribe" and "don't contact me anymore." When you do science, you don't want to personally examine every message; you need an automated way, and when you look at the metric, the reply rate is awesome, going through the roof. So figuring out those good metrics is the first challenge. And once the metrics are there, we can apply many, many methods. For example, A/B testing is what we use specifically to improve sequences and improve email templates.
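To make the A/B-testing idea concrete, here is a minimal sketch of how two email templates might be compared on raw reply rate. The counts and names are made up for illustration; this is not Outreach's implementation:

```python
# Minimal sketch: two-proportion z-test on reply rates for templates A and B.
# All counts are hypothetical; nothing here reflects Outreach's actual tooling.
from statistics import NormalDist

def reply_rate_ab_test(sent_a, replies_a, sent_b, replies_b):
    """Return each template's reply rate and a two-sided p-value."""
    p_a, p_b = replies_a / sent_a, replies_b / sent_b
    pooled = (replies_a + replies_b) / (sent_a + sent_b)
    se = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, p_value

# A more "aggressive" template B can win on raw replies while losing on
# sentiment, which is exactly the failure mode Pavel describes.
print(reply_rate_ab_test(sent_a=1000, replies_a=42, sent_b=1000, replies_b=61))
```

As Pavel's aggressive-email experiment shows, a significant lift in a test like this says nothing about whether the extra replies are interest or unsubscribes, which is what motivates the reply labeling he describes next.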

Jordan  17:40

So you changed from reply rate. Is it based on sentiment now, or is it based on meetings booked? What's the true north? Just to close the loop on that topic.

Pavel  17:51

Yeah, yeah. So what we did was come up with a new metric for emails, because still, you know, 70% of sales communication is email, and if we don't know how to measure it, I don't know how we can improve it. What we came up with is a machine learning model which evaluates replies and labels them: is it a positive reply, or is it an objection, or maybe it's a referral, and even a lot more fine-grained: what kind of objection? What kind of referral? What kind of positive response? And based on that, we can have a lot better metrics. So our dashboards in Outreach have these positive replies, and they also have unsubscribes. We have not yet merged all of it into one number. This is the debate: do we want a black-box metric, which actually doesn't mean anything because it's a combination of seven different things that each mean something, or do you just want to see the components clearly? And we said we'll just show people the components; people are smart; they can make the decisions. But for our own machine learning algorithms, we actually use all of those different features at the same time.
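Outreach's reply-classification model isn't public, but the idea Pavel describes, labeling each reply as positive, objection, referral, or unsubscribe, can be sketched with any off-the-shelf text classifier. The tiny corpus below is invented so the example runs end to end:

```python
# Toy reply classifier: TF-IDF features plus logistic regression.
# A real model would train on thousands of labeled replies; these four
# examples exist only to make the sketch self-contained.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

replies = [
    "Sounds interesting, can we talk Tuesday?",          # positive
    "We already use a competitor, not a fit for us.",    # objection
    "You should reach out to our VP of Sales instead.",  # referral
    "Unsubscribe me from this list.",                    # unsubscribe
]
labels = ["positive", "objection", "referral", "unsubscribe"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(replies, labels)

print(model.predict(["Thanks, but the timing isn't right this quarter."]))
```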

Jordan  19:10

That's the argument against indices in general, right? I mean, you find all these indices that try to put different data points together; you add them up, and it creates a score, but then the question becomes, what does that score even mean? And what are the major components that actually drive that score? I do some social science research here and there, and that's what gets really hard: you get a lot of qualitative data that you try to convert to numbers, and then when you convert it to numbers, what does it even mean? And then you create an index, which, again, may or may not even have value. So I hear what you're saying.

Pavel  19:58

Yeah, yeah. And there are situations where it makes sense. But coming back to what we were just discussing: we are not trying to replace salespeople with machines and some automated metrics. We're really trying to make salespeople smarter by showing them this combination of different metrics that all make sense, so they understand better what is happening and can make better decisions. That's why we did not go the index route in this case.

Jordan  20:29

How do you square the idea that we're going to use machine learning to figure out what is a positive or a negative response? Because you sort of associate positive with good and negative with bad. However, in that first Glassdoor study that you did, you found that the best reps are the ones that respond to positive and negative responses at the same speed. So in some ways, just getting a response is actually half of the battle, and then how you handle the objection becomes really important. So as you're thinking about your models and putting some of this together, is a negative response, so long as it's not an unsubscribe, actually a bad thing, looking through the lens of that first study that you did?

Pavel  21:24

We found that it's actually a good thing. Essentially, almost any response, unless it's an unsubscribe, is a good thing. When we were debating that index metric, we actually came up with some proposals and discussed them, and we assigned certain numeric values: a positive response would be a three, and a negative response could be a one or something like this, still a positive score. Because that's what salespeople are good at: turning around the objections. And any response really indicates some kind of interest. I personally do not respond to most of the sales emails I get. If I respond to one, even if it's negative, it means that somehow it got me interested. So there is certainly value in objection responses.
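The three-for-positive, one-for-objection weighting is Pavel's own example; a sketch of that scoring idea might look like the following, where the other weights are assumptions added for illustration:

```python
# Reply scoring per Pavel's example: every reply scores positive except
# an unsubscribe. Only the 3 and 1 come from the episode; the referral
# and unsubscribe weights are assumed for illustration.
REPLY_SCORES = {
    "positive": 3,
    "referral": 2,      # assumed value
    "objection": 1,     # negative reply, but still worth something
    "unsubscribe": -3,  # assumed penalty for the one clearly bad outcome
}

def score_replies(labeled_replies):
    """Sum the scores for a list of classifier labels."""
    return sum(REPLY_SCORES.get(label, 0) for label in labeled_replies)

print(score_replies(["objection", "positive", "objection"]))  # prints 5
```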

Jordan  22:19

So how do you convince a salesperson that that's the case? And what I mean by that is, my company will write content for folks, will do some different things for folks, and obviously, the higher the positive sentiment, the better things are. But even getting the idea across that, hey, look, even if you get a response that's quote, unquote, negative, get in there; that's half the battle; we've actually got you further along than you thought. And the reason I say "how do you convince them" is, you know, salespeople are notoriously stubborn; I'm positive of that. I'm not saying anything about, you know, smarts or intelligence here; I'm just saying, working with these personalities, if you come to them and you say, well, the probability of a type one error is less than 5% on this data that I'm presenting to you, therefore negative responses are actually good responses, I can see a whole room of salespeople saying, "I don't buy it." Right? So once you get some of these findings, are you also involved in that process of, okay, how do we convince the salespeople this is true? Or do you find the finding, wash your hands, and say, alright, marketing, or alright, enablement, you just take care of this for me; I got you the science; you figure out the training?

Pavel  23:46

Yes, certainly, I'm very involved. And that's actually one of the most exciting parts of my job and one of the big perks: at Outreach, we have customers in-house; our own sales team at Outreach is our customer, and I can talk to them any time. We have a lot of back and forth on many of the features we develop. For example, to answer your question specifically: how do we convince people that negative responses are good? We actually have a report in Outreach which shows the performance of all the different salespeople on the team across different metrics, including how quickly they respond to positive replies and how quickly they respond to negative replies. And we found that we really do not need to do any convincing; salespeople look at the table, see who the high performers are and what they are doing, and they see that the high performers are actually replying very quickly to negative responses. That's the power of good metrics that really go to the core of what's important: if we come up with those metrics and just display them to people, everyone can derive their own conclusions. The data speaks for itself, rather than it being a persuasion job.
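The report Pavel describes boils down to response speed per rep, split by reply sentiment. A rough sketch of that computation, using a hypothetical schema since Outreach's actual one isn't public, could look like this:

```python
# Rough sketch of a response-speed report, assuming a pandas DataFrame with
# hypothetical columns: rep, reply_sentiment, received_at, responded_at.
import pandas as pd

def response_speed_report(df: pd.DataFrame) -> pd.DataFrame:
    """Median hours-to-respond per rep, broken out by reply sentiment."""
    df = df.copy()
    df["hours_to_respond"] = (
        df["responded_at"] - df["received_at"]
    ).dt.total_seconds() / 3600
    report = df.pivot_table(
        index="rep",
        columns="reply_sentiment",
        values="hours_to_respond",
        aggfunc="median",
    )
    # Sort so the reps who jump on objections fastest appear first
    # (assumes "objection" is one of the sentiment labels).
    return report.sort_values("objection")
```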

Jordan  25:24

But to your point, though, how you present the data does also craft a narrative. My wife's an accountant, and the big joke that she and I share is, she says, "As long as the spreadsheet says zero over here, in terms of expenses and bills and whatever, nobody cares what the rest of the data on the inside says." All anyone gets from it is the narrative: oh, okay, I've paid my bills, and I'm set; all this on the back end, I don't even worry about. But as you think about it, there are a lot of different ways that you can pull data; there are a lot of different things that you can sum, you can percent, you can average. And so, while you're saying on the one hand that the data speaks for itself, there is a little bit of crafting, right? Even to your point of whether we should use an index or not: how do we present the data? Because we're certainly going to lead people to a narrative, right?

Pavel  26:27

Correct, correct. Absolutely. Yeah, that's where all the back and forth with our sales team and the customers we talk with comes in before we release even a simple feature like a dashboard, because we very carefully choose what we put in the dashboard so that we don't introduce noise. We really put in the things that salespeople say are important to them. And we also internally validated that they actually matter, that there is a positive correlation between replying faster to negative responses and booking more meetings and closing more deals. So we do a sort of double validation, validation on the data side and then validation on the people side, to make sure it's meaningful, it's interesting, and they're willing to act on it. And when they act on it, they actually see positive changes happening for them.

Jordan  27:33

So I think we have time for one more good question here. We talked about when you first got there, and some of the behavioral changes that came from your data. Where do you see machine learning and AI going? What's the next frontier of things that are about to be conquered in the whole sales world, related to digesting a bunch of this data?

Pavel  28:03

Yeah, I feel like so far, within the sales domain, we have done a pretty good job of automating some of the mundane tasks, so that frees up people's time. We have also done a pretty good job of uncovering the black box, displaying what's going on in there, getting good metrics out, and showing them to people in a way that they can understand. I think where we are going next, and what we have not yet quite succeeded in, is providing good proactive recommendations. A lot of the data science in the sales area right now is a little bit passive: we'll tell you about the topics and the sentiments and what's going well and what's not, but we are not telling you what to do to improve. We're relying on people to make that decision. And that's a much harder problem. That's also where a lot of the individuality of different companies and different sales processes comes in, and that's why it's hard; it's not a one-size-fits-all type of approach. But that's certainly where we are going: being able to really provide proactive recommendations. If you have a sequence, we don't just let you run an A/B test and try ideas; we'll tell you which A/B test you should run, or maybe even automatically run some A/B tests on that sequence, if you allow us, and keep improving it for you. So that sort of proactive recommendation and optimization is, I feel, where we are going in the near future.

Jordan  29:56

So when you talk about A/B tests, you're talking about the potential, and I just want to make sure I didn't hear you wrong, that somebody can flip a switch and say, "Hey, A/B test step two of this template," and the machine will change a couple of the words on its own, say this one wins, start another thread, change some of the template on its own, and just keep testing and iterating over and over again and picking winners. Is that what you're saying is potentially possible? Or am I hearing that the wrong way?

Pavel  30:28

Yes, that and even more, I would say. Suppose you have a sequence; you just click a checkbox on the sequence and say, you know, automatically improve it for me. And then what we could do on the back end is not just A/B test one certain step, but think, "Okay, first, do we have the right steps in the first place? Are there too many steps or too few? Should a step be manual or automated? Are we sending your emails at the right time of day, and are we waiting the right amount of time between this email and the next action that needs to be taken?" All of these attributes we could optimize automatically, because, thanks again to the standardization that Outreach provides, there is so much data across so many different industries on what different approaches have been used, what's effective and what's not. And all of that could be utilized, of course, in a privacy-compliant manner, to find the real best practices, which are based on hard data, instead of relying on word-of-mouth, anecdotal best practices. And then we try all of this, because you never know: even if something is a best practice, your company may be very unique, and that best practice may not apply. So we can use those best practices to generate hypotheses, then test those hypotheses within every sequence for every specific company, and automatically keep improving the processes over time.
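One way to picture the "automatically keep improving it" loop is as a bandit over template variants: propose hypotheses, route a little traffic to each, and shift volume toward whatever earns the best reply scores. The epsilon-greedy sketch below is an assumption about how such a loop could work, not a description of Outreach's system:

```python
# Epsilon-greedy sketch of continuous sequence optimization. All names and
# the policy itself are illustrative assumptions, not Outreach's design.
import random

def optimize_variants(variants, send_and_score, rounds=1000, epsilon=0.1):
    """variants: candidate template ids; send_and_score(v): observed reward
    for one send of variant v (e.g., a scored reply sentiment)."""
    totals = {v: 0.0 for v in variants}
    counts = {v: 0 for v in variants}

    def mean(v):
        return totals[v] / counts[v] if counts[v] else 0.0

    for _ in range(rounds):
        if random.random() < epsilon:
            v = random.choice(variants)   # explore a random variant
        else:
            v = max(variants, key=mean)   # exploit the current best
        totals[v] += send_and_score(v)
        counts[v] += 1
    return max(variants, key=mean)        # winning variant so far
```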

Jordan  32:18

I feel like I could sit here and talk with you for another three or four hours about this sales data, what we can measure, and where it's going. But unfortunately, we've hit the time for today. So with that said, hopefully we'll have you back someday. And Pavel, thank you for coming and joining us today.

Pavel  32:45

You're welcome. It was great talking with you.

Jordan  32:48

All right. Thank you to the listeners for tuning in. We hope you enjoyed this episode, and we’ll see you next time.

32:55
Hot dog! That was a great episode. Thanks for listening. If you want to learn more about Greaser Consulting or any information you heard on today's episode, visit us online at www.greaserconsulting.com. Be sure to click the Follow button and the bell icon to be notified of the latest here at RevOps Therapy. Thanks, and see you real soon.
