Mistakes B2B marketers make with data, with Neil Hoyne, author of Converted: The Data-Driven Way to Win Customers’ Hearts

Neil Hoyne is the Chief Measurement Strategist at Google, a popular LinkedIn thought leader and author of Converted: The Data-Driven Way to Win Customers’ Hearts. 

He wrote his book after realising that marketers can learn more from others’ mistakes than from their successes, and to show what can change within B2B tech organisations when data isn’t being used to its full potential.

On this FINITE Podcast, Neil shares his tactical tips for getting more from data, and for using it to fuel the marketing and business decisions that grow an organisation.

This episode covers:

- Neil’s background in B2B marketing
- Why Neil wrote Converted: The Data-Driven Way to Win Customers’ Hearts
- Why starting small with data is the best way forward
- What happens when you spend too much time planning with data
- How politics can mess with data
- Why analytics shouldn’t be left to the data scientists
- How to get the best from your data
- What kind of person makes the best data-driven marketer

Listen to the full episode here:

 

And check out more of the FINITE B2B marketing podcast here

Full Transcript:

Alex: (00:06)

Hey everyone and welcome back to the FINITE Podcast. I’m excited for today’s episode because we’re talking with Neil Hoyne. You might have seen Neil’s posts on LinkedIn, you might know him as the Chief Measurement Strategist at Google, but most likely you’ll have seen or read his book, Converted: The Data-Driven Way to Win Customers’ Hearts. And today we’re talking about some of Neil’s key insights from the last section of his book, to learn about the mistakes that B2B marketers make with data. Let’s dive in.

 

Before we continue with the episode, I’d like to give a quick shout out to our partner Terminus, the only account-based engagement platform built to deliver more pipeline and revenue through multichannel account-based marketing. As the only native multichannel marketing platform, Terminus helps you convert target accounts by orchestrating campaigns across personalised advertising, email signatures, and chatbots. Visit terminus.com to learn why doing effective ABM at scale means better marketing.

 

Alex: (00:56)

Hello Neil, and welcome to the FINITE podcast. Thank you for joining me.

 

Neil: (01:02)

Hey, thanks for having me.

 

Alex: (01:03)

Looking forward to talking. We’re gonna be talking all about the mistakes, and hopefully some of the good things that B2B marketers do as well, but mainly the mistakes that B2B marketers make when it comes to data. It’s gonna be a good one. I think we’ve got some great questions and you’ve got a lot of experience in the area. Before we do that, why don’t you tell us a bit about that experience and your background, career-wise, up until this point.

 

Neil’s background in B2B marketing

 

Neil: (01:28)

I’ll give you the unofficial biography. I’ve probably spent enough time in digital marketing where it starts becoming a liability. Companies start asking, have you done anything else besides digital marketing? 

 

And that’s really been my home. I’ve spent the past almost 12 years at Google working entirely in that space, where we have several hundred measurement products, but just as importantly, understanding how people actually use them. And so it’s not so much an issue of can we get more data and more reports and more dashboards, but what are those conversations that companies are having, especially in the B2B space, where they see the same data and they come to completely different conclusions, or just don’t act on the data at all. So we’ll talk a little bit about that.

 

Outside of that, I’m also a senior fellow at the Wharton School of the University of Pennsylvania, in their area of customer analytics, which is a lot of fun, intersecting the practical research side along with the teaching. And then I’ve got the book thing that came out too.

 

Why Neil wrote Converted: The Data-Driven Way to Win Customers’ Hearts 

 

Alex: (02:22)

Let’s talk about that. Why did you write the book? The book is called Converted: The Data-Driven Way to Win Customers’ Hearts. Tell us a bit more about it.

 

Neil: (02:29)

There were a lot of motivations, but I think you touched on it in the introduction. One of the reasons behind it was that I was meeting with a professor over at Harvard Business School.

 

And one of his criticisms of business in general was that there are a lot of case studies that talk about where companies get something right. And he almost felt that there was a bit of a survivorship bias there: all we see at conferences and out of companies is, look at all these companies that embraced technology and data and all the money they made, but we know it’s not that easy. We just don’t talk about the other cases.

 

And I think because of that separation, not knowing exactly where companies make mistakes and what could be learned from those lessons, a lot of companies go in and think data should be easy, that it should be straightforward.

 

But when they hit those inevitable roadblocks, they almost feel at that point that they’re an outlier, when really it’s those endlessly successful case studies that are the outliers. They feel like, hey, it worked great for everybody else, why not me?

 

And so that’s partly how the book was structured. It started with just writing a whole bunch of these stories about where companies fail, but you can only take so much of that. Afterwards, unless it’s a Friday, you’re like, I need a drink. And it changed when people started asking me: out of all those stories of failure, what did you learn? What themes emerged? What opportunities? How would you do things differently? And what you realise is that not only is it a great book around data and marketing, at least I think so, but it also approaches it from that pragmatic side.

 

Instead of saying, this is what every company should do, it says, let’s start with the mistakes that companies are making today and what we can learn from them, and then what we would do differently because of it. And a lot of people who have read the book come back and tell me, this is exactly what happens in my company.

 

Even though the stories in the book were anonymised carefully, I have received more than a couple of phone calls. Most recently a CMO called me up and he’s like, Neil, I know you anonymised all the companies, but chapters 11 and 12, where you’re talking about this one story, that was our company, wasn’t it? And I’m thinking, I can’t tell you. And he’s like, I knew it. I knew it. I told the board this is what we were doing wrong and they refused to listen to me, and he’s going off.

 

And I’m sure it was a moment of therapy for him, and in the back of my head I’m thinking, this story wasn’t about your company, but I’m glad you can find the connection to it. And I’m glad you can see the path forward, and the validation that it does in fact happen. And that’s really the whole context of the book: to provide that other-side perspective you don’t hear about, but then also give you some practical guidance going forward.

 

Why starting small with data is the best way forward 

 

Alex: (04:57)

Very nice. I think it’s a great reference point for anyone, wherever they are in the journey into the world of data. I know that one of your beliefs is that starting small with data is very much the way forward. Maybe you can talk a little bit from a B2B marketing perspective, about why you think that is and how you’ve seen people get caught out by that.

 

Neil: (05:19)

I just think starting small is a necessary reaction, given the direction that the market has gone for far too long. And the joke I make, and the reference I use, is that it’s very similar to people in January, especially here in the States, who say, I need to make a new year’s resolution, I need to get in better shape. And they make a large commitment to that. They join a gym, they buy some expensive home gym equipment.

 

And in their case, they have some sign of tangible progress: I committed to this. But then you really see that they don’t necessarily get to their end goals. If you sign up for a gym membership in January, the most recent data I’ve seen is that over the course of the next 365 days, you will go to that gym approximately six times. Four of those times will be in January. And then you’ll never be seen again, despite all the money you invested in it.

 

And you see parallels there with enterprise software, where when you talk to executives about that new CRM implementation they put in, 90 to 95% of them say that they never actually reached the objectives they had.

 

But one of the lessons that we try to start with is that even though the enterprise software is tempting, and it shows that obvious and expensive sign of progress, most companies would be better off if they just did something today that created value, that proved the capabilities to the rest of the organisation, that gave them confidence that they can do something. It’s very similar to saying, look, don’t buy the expensive gym membership, just put on some running shoes and go outside for a jog. It’s not sexy, it’s not glamorous, but it gets you closer to your goal than trying to make a big investment.

 

And most companies see that progress and they appreciate that progress. They just need to be given permission to say, we’re a large enterprise and we can do this without the enterprise software package. That’s the beautiful sale that vendors try to make. Just do something, go out for that jog.

 

Alex: (07:27)

I think that’s great advice. And I think that applies to the whole world of MarTech in lots of ways, not just data and analytics, but I don’t know, technology.

 

Neil: (07:35)

And it’s a great sell and I see why they do it. You have a problem. We have a solution. You’re a large company. You need enterprise software. I get that. But I also look at the data and say most companies aren’t successful with it. 

 

And they’re not successful with it just because anything at an enterprise level is difficult. And sometimes, especially given the current economic circumstances, it’s just easier to say we need to demonstrate growth than to say we need to demonstrate our ability to onboard and integrate new software.

 

What happens when you spend too much time planning with data

 

Alex: (08:03)

I was gonna ask, actually, do you think that being a larger enterprise makes it harder when it comes to data, or I guess any big technology project? It must be the number of stakeholders and internal politics and teams, which we’ll talk about in a moment, but complexity just grows the bigger you are.

 

If you compare yourself to a five-person startup that needs to build some analytics into the way they operate from day one, that’s pretty straightforward. Whereas if you are a global…

 

Neil: (08:32)

You’re being polite, it’s an absolute disaster. And here’s a perspective that I like to bring here. 

 

A lot of companies, when you see the evolution from a startup environment to a large corporation, move from a very growth-driven, entrepreneurial side, but also one that’s very chaotic, to one that is driven primarily by process, which is necessary to keep order in a company of several thousand people, or in the largest cases hundreds of thousands of people, where processes have to be in place.

 

The most successful companies are the ones that are aware that there are benefits to that chaotic environment. And it doesn’t necessarily come from spouting off platitudes about how we have to be more like a startup, or we have to move faster. It’s simply from leaders recognising that and managing that tension to say, not everything has to follow a process and that we can break those processes. 

 

And sometimes we shouldn’t be discouraged from doing so. Now, oftentimes the process people, and they exist in every organisation and I love them, they need to be there, will come down hard and say, this isn’t how we buy software. This isn’t how we make decisions. This isn’t how we do budgeting.

 

And there need to be leaders who say, yeah, we’re still going to do it anyway, because the process will take too long, and sometimes it’s worth knocking down some of those barriers. The other thing that I bring up is that oftentimes people get discouraged and say, I’m just gonna go and work for a startup.

 

There are advantages to having the people and the capital in place. Startups right now are desperate for profitability and have to make hard decisions about where they’re going to grow, unlike well-funded, established companies that don’t have to make those trade-offs.

 

But what companies need to realise is that these conversations around how to build new processes in their organisation are likely not unique to them. In fact, every company that they’re competing with is having the same struggles. These are endemic to the company’s size, not necessarily the industry or the product. 

 

And once they recognise that, you really say, look at who you’re competing against here: you’re trying to be better than your competitors, who are also trying to figure out how they use data to make some money. And so those problems you’re having, where you’re sitting there asking, how do we get through this process? How do we implement the software? How do we use machine learning? Realise that your competitors are facing them too.

 

But the mistake most companies make is that they spend so much time planning the solution, not executing. And I compare it very similarly. This is the analogy that I use. And it’s not mine. 

 

So you may have heard it before, but it’s the one about a bear chasing you through a forest. That’s a daunting proposition, because human beings can’t outrun bears. And so what a lot of enterprise companies do is they sit there and say, a bear is chasing us, it’s going to hurt us.

 

We can’t run faster than a bear, so let’s figure out the strategy by which we can outrun that bear. And the most successful companies are the ones that just frame the problem differently. They say, look, we can’t outrun the bear at all. We’re outrunning the other campers, and the easiest way to do that, while they’re all sitting there thinking we can’t outrun the bear, is to just start running. It’s impractical, it’s messy, but considering the alternatives you have, knowing you can’t hit that speed, it’s the best option you have.

 

And so for a lot of companies, maybe that’s the lesson: we wanna reduce risk, we wanna be successful, but there’s a point where you have to say that you cannot plan your way through this. Sometimes the easiest path to success is just to start doing something, knowing that that tension and those risks will be well worth it.

 

Alex: (12:10)

Great advice. The FINITE community is supported by Clarity, the fast-growing global marketing communications agency working with leading technology brands. We are living through an unprecedented era of change, driven by advancements in technology. Technology that has the power to be an impetus for good, and that will drive us towards a healthier, more prosperous, sustainable and equitable future. Clarity exists to tell the stories of these companies, blending the science of data with the art of storytelling to enact measurable marketing and communications campaigns and deliver results to the bottom line. Visit clarity.global to find out more.

 

How politics can mess with data 

 

Alex: (12:43)

Tell us a bit about how you see politics come into play between different teams and stakeholders who are beginning to build a strategy or starting to embrace a more data-driven culture.

 

Neil: (12:54)

All the time. We’re human. I mean, it’s nice to say we’re rational people. We’re not. And what I see time and time again is that everyone is interested in finding the right answer until, as we have right now, difficult economic times, where let’s just say you’re looking at readjusting the number of employees, the headcount at your company.

 

And all of a sudden, somebody puts forward a proposal saying your team should be smaller, your bonus should be less. Now all of a sudden emotions come into play. You want to defend your team, your objectives, your KPIs, how you feel. You understand the gaps in measurement, what’s not being reflected, the potential going forward.

 

And you want to win that argument more than you want to provide a dispassionate view of the data, especially if your job and your resources are on the line. There’s no incentive for you to be rational. Your incentive is to keep the status quo. And this is largely the issue I have whenever companies come out and say, we’re going to let the data decide.

 

Because the problem there is that it implies, we have the data and now let’s talk about the implications and the changes, which is exactly when those emotions get involved. And so the best advice that I have for anybody listening, and this is a step that’s missed by most companies I work with, is this: before you run an experiment, before you implement software, before you engage with a new partner, if there is any degree of uncertainty that data will help resolve, you can certainly wait for that data to come in, but plot out the different courses of action based on the possible results of that data, whether it’s market research, a new data set, an experiment, whatever that is.

 

Because if you can’t get people aligned at that point about what action you’re going to take on that data, it’s certainly not going to happen after the fact. Everyone has those rose-coloured glasses on, saying, it’s going to be my team that benefits from this data, this experiment, but everyone has that point of view.

 

And the most transformative changes you can make to your business generally involve large reallocations of resources. If you’re talking about giving everybody 1 or 2% less in resources, that’s not a large-scale transformation. That’s just an iterative change. I’ll move a little money into a bucket.

 

When you start moving 10 or 20% of resources around, say from offline to digital or from one product to another, you need to have that course set in advance: when that data comes in, we know what we’re going to do, and we’ve agreed on what we’re going to do, so it doesn’t require additional debate.

 

And if people can’t get aligned where they don’t feel comfortable with the methodology or the course of action, that should be a red flag for you to say, you want to put that particular change on hold because it’s unlikely that you’ll see any benefit when it goes through.

 

Alex: (15:33)

Something I see quite a lot in the world of data, in any form, is being overloaded with the volumes of data being collected, and this tendency to just go and try to measure everything. And I’m the first to admit that I get lost in a Google Analytics account for far too long.

 

Neil: (15:47)

You just want to browse through…

 

Alex: (15:50)

Yeah, it’s interesting, right?

 

Why analytics shouldn’t be left to the data scientists 

 

Neil: (15:53)

That’s the Hollywood view of data, right? You put a really smart person in front of three monitors, and then data just flashes, and sales went up in this neighbourhood in Brazil, and then they track down these customers and their needs have changed. I mean, that’s the process of data. Here’s the extrapolation of it, I think.

 

I think that companies view data as an important commodity that’s part of their business, something they will eventually figure out how to monetise, but oftentimes the people making that decision and those investments are not necessarily the ones that have to monetise it. Somebody else will do something with the data, right? And you never see a worse group of data scientists.

 

And when you sit down with them, and this actually happened, a CMO told me this one time, I asked her: what’s gonna happen with all this data you’re collecting and integrating, building all these cloud systems for?

 

And she’s like, we’re going to hand it over to the data scientists. And I said, and what are they gonna do? And she’s like, they’re going to do data science on it. To what outcome, to what end? Well, that’s what they do, that’s what smart data scientists do. And you see this in job postings too, right? With data scientists.

 

You need to know all the technical elements of data, and you also need to be able to understand our problems and craft solutions. The reality is that most companies need to start with a hypothesis. What do you think is happening? What data do you need to prove or disprove that hypothesis? And how are you going to act, as we were just talking about, based on what you find? Because if you’re going in blindly and saying, we just wanna look at disparities in the numbers, where something goes up or something goes down, that’s interesting.

 

But again, what you’re doing at that point is forming hypotheses. Oh, sales went up in Brazil, what’s a hypothesis as to why it went up? A competitor dropped out of the market and isn’t selling in that area? Okay, great. But that’s also a terribly inefficient way to build hypotheses.

 

Blindly looking through data is inefficient. And this is where the importance comes in: when you structure things around hypotheses, not just digging into data, you can solicit more people in your organisation who can offer their perspectives on what they think is happening.

 

And then to go through and to say to the frontline salespeople, to the marketing people, what do you notice? What do you think we should test? And then you get to buy into their point of view, what they’re seeing in the field. 

 

And you can use your data scientists at that point to say: your goal is, can we prove or disprove this hypothesis? Do we have the data to validate this? And have a course of action, so that if you’re able to validate it, here’s how we’re going to change our organisation, and that’s how we can prioritise those requests.

 

Because now you’re not expecting the data scientist to understand what’s happening in the field or the broader industry. You’re expecting them to do what they do best. That’s the data science stuff, but you’re kind of giving them that guidance and that clarity to say, this is where you get something out of it. 

 

Otherwise you end up with a whole bunch of data, and you’re just sitting there hoping that people who are removed from the actual work of selling your products or marketing your products can figure it out, and then hoping that they don’t get overwhelmed by what they have. And so moving to more of that test-driven culture is my solution.

 

How to get the best from your data 

 

Alex: (19:07)

And how do you get to that culture? In theory it sounds great, but in practice, as we’ve said, we’re not rational, emotions get in the way, people are incentivised in different ways. People are looking out for themselves. There are all kinds of different drivers and forces acting on all of us as people and as teams. As we talked about, it may be easier in a small organisation than a big one, but any tips for making it happen in reality?

 

Neil: (19:32)

I would say it starts with leaders at the very top. So we don’t outsource this. You don’t say there’s an experimentation team, I don’t wanna see that. What I wanna see is people at the top of the organisation, the presidents, the VPs, however the organisation is structured, with a single goal.

 

If you wanna extract the most value out of your employees, out of your team, what you need to do is collect those hypotheses, which effectively means going to your team, as much as you might go to your executive team around a table, going to those analysts who may have a very difficult time surfacing their ideas, and saying, part of your job is to get me your hypotheses about what we should be doing.

 

Now you’re gonna get a laundry list, and certainly you want to filter that. So I like to ask three questions on top of it. Not only do you want the hypotheses from people in your organisation about what you should be doing differently, you want to be able to rank them, right? You’re gonna get a phone book of a thousand ideas.

 

There’s no executive who can go through all of it. So you want to ask for a little bit of clarification. One is: do you have data, and if so, what is the data that supports your hypothesis? Is it field research you’ve seen, is it a data set, is it a case study from an employer? What do you have that supports it, or is it intuition? Because I want to be able to rank it. The second is: how do we test this idea?

 

So if you have an idea, how do we test it? And then you get filters to say, I’d love to do this, but if your test requires us to develop and build a warehouse in a new country, that may be less attractive to me at this point than something I could run on my website. And the third is, as we were talking about: if this does in fact work, what do we do differently? Do we have a clear course of action and a clear consensus?

 

Now, those three questions are really just there to give you a sense of the attractiveness of the hypotheses, but really think about your role as a leader here. If you have several hundred people in your organisation who are finding new opportunities for growth and giving you a roadmap for how you validate and implement that growth, what else do you need as a leader?

 

Why should leadership be about sitting in my office and thinking about what we should be doing here? You’re really encouraging, in a structured way, the participation of your data scientists, your frontline sellers, your marketers, your product managers, to say, what do you see from where you’re sitting?

 

And what it all hinges on is the fact that in organisations it’s very difficult to surface ideas, even inside Google. If I surface an idea to my leadership, I assume that the other people on my team will also be surfacing their ideas. And then just one level up, they may have 50 ideas, and they’re sitting there thinking, God, I can’t give 50 ideas to my VP, I’ll narrow it down to the top three.

 

Then the VP gets the top three ideas from 10 people on her team, and now she’s got 30 ideas. I can’t go to the president with 30 ideas, I’ll pick the top three out of that. And now the message you send is, if this team has a hundred ideas, we’re going to think about three of them. 97 of those data-driven, testable ideas that could change our business won’t even be surfaced or considered.

 

And think about the message that sends, even to the analysts in your organisation using that data. Why bother? Just tell me what you want me to look at. You want a dashboard on sales share, I’ll build a dashboard. It doesn’t make sense for me to find new opportunities because you won’t listen. And so that, for me, is the purpose of making a change there.

 

What kind of person makes the best data-driven marketer? 

 

Alex: (22:52)

I wanna wrap up by talking a bit about people and the people that we think make the best data driven marketers. 

 

You’ll be familiar with the fact that, in the marketing world, the B2B marketing world, there’s a constant debate or discussion around whether marketing is becoming a purely data-driven discipline, sometimes potentially at the cost of creativity and ideas.

 

But I think we’re all familiar with data sitting at the heart of everything. Is there a certain type of person that you think makes the best data driven marketer?

 

Neil: (23:25)

I am a strong advocate for diversity across teams. And I look at it this way: if you have two teams competing, using the same data and the same type of talent with the same experience, the question you’re gonna ask is, how do we get something different, something better, than the market? And that comes from the diversity of experience you need to add to that team.

 

I think companies, in the pursuit of efficiency, love to think of themselves as a finely tuned machine, but the fact is that marketing isn’t necessarily something that falls into that bucket. I worked with a gentleman who had his doctorate in physics. He was a literal rocket scientist, and I kind of joked with him, as I do: I get people to click on pictures, that’s my job. You can build rockets.

 

Like, I feel there’s a certain hierarchy where it’s: click on pictures, build rockets. And he actually had an observation that our job as marketers was more difficult, because there’s no constant in our work. Consumer patterns and behaviours and interests change all the time. And when we only look at historical data sets, we see what happened yesterday. We need new ideas to inform what we’re going to do, and the data we collect, tomorrow.

 

And where it makes people uncomfortable is that they have this idea that we can precisely measure and model everything. And I think when you do that, you close yourself off to all the possibilities, all the experiments and explorations you could have beyond it. And so you look at your team and say, the more diverse your team, the more perspectives they’re going to have, which is going to be beneficial.

 

But you have to trust that that payoff is there. Because if you look at it and say, this person is different from this uniform team I’ve been working so hard to build, it’s not that person’s problem. It’s a leadership problem. You’re trying to build a uniform team in a market where one shouldn’t exist.

 

Alex: (25:18)

That is great advice. And a thoughtful perspective. I think I came into this episode thinking it would be fairly tactical data driven stuff. And I think it’s actually much more than that in terms of a look at organisational structures and people and processes and change. 

 

And there’s a ton of great advice you shared there, not just for embracing data as an organisation, but I guess for tackling any kind of complexity or problem or big project. So, yeah, fantastic advice. Thank you for joining, Neil. Thanks again for sharing your time. Do you wanna wrap up by pointing people to where to find your book?

 

Neil: (25:50)

Oh, the book, Converted: The Data-Driven Way to Win Customers’ Hearts, can certainly be found on Amazon, or anywhere. You can take a look at the book’s website and download an example chapter at convertedbook.com. You can also find me, and level your thoughts and your criticisms, on LinkedIn, where I’m especially active.

 

Alex: (26:06)

Awesome. Thank you, Neil.

 

Alex: (26:09)

Thanks for listening. Before we go, just one final shout out to our FINITE partner, 93x, the digital marketing agency working exclusively with ambitious, fast-growth B2B tech and SaaS companies. Visit 93x.agency to find out how they partner with marketing teams to drive growth.

 

We’re super busy at FINITE building the best community possible for marketers working in the B2B tech and SaaS sector to connect, share, learn, and grow. Along with our podcast, we host online events, share content and have an active Slack community with members from around the world, including cities like London, New York, Singapore, Tel Aviv, Stockholm, Melbourne, and many more. Head to finite.community and apply for a free membership to strengthen your marketing knowledge, build your network and connect with ambitious B2B tech marketers across the globe.
