Marketing Analytics: Methodologies Trump Metrics!

If a conversation is occurring about Data and Analytics, chances are high that it is about Metrics.

About how abhorrent Vanity Metrics are.

About the marginal value of Activity Metrics.

About how crucial a focus on Outcome Metrics is.

About Metrics for Dashboards – NO! Only KPIs for Dashboards.

About the difference between KPIs and Metrics. :)

About the need for a balance between Acquisition Metrics, Behavior Metrics, and Outcome Metrics in our Scorecards to paint the full customer journey – and kill silos.

And… that’s just me driving all those discussions!!

You are surely having many, many more that focus on Metrics.

Metrics are important.

My goal is to persuade you that Methodologies – how you measure a metric/KPI – are exponentially more important.


This blog post was originally published as edition #435 of my newsletter TMAI Premium. Each week, my newsletter shares strategic frameworks and practical how-tos to stay at the very bleeding edge of CFO-proof Marketing and Analytics. Sign up for TMAI Premium to accelerate your career trajectory.


The Importance of Metrics.

Conversion Rate is a standard metric in many analytics tools.

It can be measured as: Orders/Sessions.

I don’t like that. It presumes that every open of your App, every visit to your Site, is an opportunity to convert. It breeds short-termism. It insults the customer journey.

Hence, in my first book, I advocated for: Orders/Users.

If a human has four or five sessions between your App and Site and then converts, that is ok. Let’s measure the success of that journey.

Among many benefits, it encourages your Marketing, Site, and Service teams to create an experience that works at the same pace as your customers – vs. screaming BUY NOW, RIGHT NOW! every time they show up.
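
To make the difference concrete, here is a tiny sketch (in Python, with made-up session data and hypothetical field names, not tied to any specific analytics tool) of the two Methodologies computed side by side:

```python
# Hypothetical event-level data: one row per session, with the user who
# owned the session and how many orders happened in it.
sessions = [
    {"user": "A", "orders": 0},
    {"user": "A", "orders": 0},
    {"user": "A", "orders": 1},  # converts on the third session
    {"user": "B", "orders": 0},
    {"user": "B", "orders": 0},
]

total_orders = sum(s["orders"] for s in sessions)
total_sessions = len(sessions)
unique_users = len({s["user"] for s in sessions})

# Methodology 1: Orders / Sessions (the tool default).
cr_sessions = total_orders / total_sessions   # 1 / 5 = 20%

# Methodology 2: Orders / Users (the journey-friendly definition).
cr_users = total_orders / unique_users        # 1 / 2 = 50%

print(f"Orders/Sessions: {cr_sessions:.0%}  |  Orders/Users: {cr_users:.0%}")
```

Same underlying behavior. Two very different stories about how well the journey is working.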

How you define metrics matters. For measurement, and for culture.

When someone says ROAS, ask: Can you please share the formula you are using to compute that metric?

Then, cry after you realize everything it is missing.

Wipe your tears.

Notice, ROAS is expressed as Average ROAS.

Resist the urge to start bawling again.

Take a deep breath.

Ask for Marginal ROAS. It’ll change your life.
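
To see why Marginal ROAS is life-changing, here is a minimal sketch with made-up spend and revenue numbers (purely illustrative, not from any real account):

```python
# Hypothetical weekly spend/revenue pairs as the budget was scaled up.
spend   = [100_000, 150_000, 200_000]
revenue = [500_000, 650_000, 720_000]

# Average ROAS: total revenue over total spend at the current level.
average_roas = revenue[-1] / spend[-1]                     # 720k / 200k = 3.6

# Marginal ROAS: revenue added by the LAST increment of spend.
marginal_roas = (revenue[-1] - revenue[-2]) / (spend[-1] - spend[-2])
# (720k - 650k) / (200k - 150k) = 1.4

print(f"Average ROAS: {average_roas:.1f} | Marginal ROAS: {marginal_roas:.1f}")
```

Average ROAS says the channel is printing money. Marginal ROAS says the last $50,000 returned only $1.40 per dollar. That is the number Finance needs.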

[Note: TMAI Premium members, please refer to the invaluable #433: Compute Marginal Returns! If you can't find it, please email me.]

How you define metrics matters.

No question.

The Incredible Importance of Methodologies.

You can measure how many Conversions were driven via your digital presence using Last-Click – the standard approach in Digital Analytics tools.

Last-Click is the Methodology. Conversions is the Metric.

Last-Click gives all the credit for the Conversion to the last referrer of the Session in which the Conversion happened. I might have come to the site five times, as mentioned above, but my last click was on an email from Kate Spade. The Last-Click Methodology will give the entire credit for my conversion to email marketing.

Sad. And, a lie.

To fix this, 15 years ago (!), Google Analytics released Attribution Modeling as an additional Methodology.

This was great. Now, all five of my visits to katespade.com are tracked by GA, and credit for my conversion is distributed across SEO, Email, Bing Paid Search, Affiliate, and Email.

A better Methodology!

Conversions Driven by Email are different under Attribution Modeling vs. Last-Click – one is better for the business, the other will fuel unwise decisions.

It gets more complicated now. :)

Attribution Modeling, when launched, had a plethora of Methodologies that could make the Conversions report better or worse!

1. First-Click.
2. Linear.
3. Time Decay.
4. Position Based.
5. Custom.
6. DDA.

As humans tend to, we used all of them!

The metric did not change, it was still Conversions, but choosing the Methodology caused good or bad decisions.
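
To see just how different the answers can be, here is a rough sketch of three of those Methodologies applied to the same five-touch journey from the Kate Spade example above (a simplified, rule-based illustration; real DDA is a machine-learned model, not a fixed rule):

```python
from collections import defaultdict

# The same five-touch journey from the Kate Spade example above.
journey = ["SEO", "Email", "Bing Paid Search", "Affiliate", "Email"]

def last_click(path):
    # All credit to the final touchpoint.
    return {path[-1]: 1.0}

def first_click(path):
    # All credit to the very first touchpoint.
    return {path[0]: 1.0}

def linear(path):
    # Credit split evenly across every touchpoint.
    credit = defaultdict(float)
    for channel in path:
        credit[channel] += 1.0 / len(path)
    return dict(credit)

for name, model in [("Last-Click", last_click),
                    ("First-Click", first_click),
                    ("Linear", linear)]:
    print(name, model(journey))

# Output (roughly):
# Last-Click  -> {'Email': 1.0}
# First-Click -> {'SEO': 1.0}
# Linear      -> {'SEO': 0.2, 'Email': 0.4, 'Bing Paid Search': 0.2, 'Affiliate': 0.2}
```

Same Metric – Conversions. Three completely different answers about which channel deserves the budget.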

Ex: I used to say:

First-Click attribution is like giving my first girlfriend 100% of the credit for me marrying my wife!

That is how ill-advised using that Methodology was.

Luckily, the team at Google realized the awfulness, killed all the silly distractions, and standardized on Data-Driven Attribution Modeling – a machine-learned model built across millions, or tens of millions, of consumer journeys from your actual customers.

The Methodology can get better still!

An Attribution Modeling solution that includes Online and Offline Conversions is a better Methodology than what exists in GA.

Then… You started to read what I preach, and you realized Attribution was not a great methodology.

Attribution is not Incrementality! [TMAI Premium Subscribers, please See #232.]

If you want to know what happens if you cut the entire Marketing budget to Zero, the only methodology that will answer that question is Portfolio Incrementality.

So… How should you measure Conversions (or Revenue or, better still, Profit)?

Incrementality.

It is the only Methodology Finance will accept. Because it is the only one that gets to true causality.
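
Conceptually, here is what Incrementality boils down to (a deliberately tiny sketch with made-up holdout numbers, not a full geo-experiment or MMM design):

```python
# Hypothetical results from a holdout experiment: markets where Marketing
# ran vs. comparable markets where it was switched off.
conversions_with_marketing    = 48_000   # test markets
conversions_without_marketing = 41_500   # holdout (control) markets, scaled to match

incremental_conversions = conversions_with_marketing - conversions_without_marketing
incrementality_share = incremental_conversions / conversions_with_marketing

print(f"Incremental conversions: {incremental_conversions:,} "
      f"({incrementality_share:.0%} of the total)")
# Attribution would have happily claimed credit for all 48,000.
```

The gap between the two numbers is what Marketing actually caused. Everything else would have happened anyway.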

See what I mean?

The Methodology is far, far, far more important than the Metric.

A mistake with a Metric’s definition or adding a poor one to your Dashboard will hurt.

A mistake in choosing the Methodology for the metric will (metaphorically) kill you.

Here’s an image from last week's Premium discussion on Path to MMMs…

[Image: Attribution - Incrementality, Implications]

Over the last decade, I’ve obsessed about Brand Marketing. Here’s an explanation as to why.

If you are doing Brand Marketing right, you are measuring one of only three KPIs:

1. Unaided Brand Awareness.
2. Consideration.
3. Purchase Intent.

There are 7 million other metrics, 6.99 million of which are expensive distractions. Just three rise to the stature of a KPI.

Let’s say you lead Brand Marketing at Chase bank, and your CMO has strategically decided to focus on Purchase Intent.

Great.

Spend a little bit of time dealing with the frustration that when you measure Purchase Intent across three ad platforms, they have four entirely different definitions for Intent!

[Image: Brand Lift Studies]

Rationalize this. Sure.

But.

Inside your company, when you see a Campaign report being presented to your CMO… Before she reacts to it… Ask your Agency the question:

What Methodology did you use to compute Purchase Intent?

They will say: The Brand Tracker.

Shut the meeting down!

The Brand Tracker is possibly the worst way to measure the Lift in Purchase Intent delivered by your campaign.

Brand Trackers collect a relatively small sample, only periodically (say once or twice a year). Both of these bits are 100% ok.

The problem is that the Brand Tracker’s scope is to measure company-level Purchase Intent (and other KPIs). This flavor of Purchase Intent is influenced by many, many, many factors…

[Image: Components of Brand Tracking Studies]

Your Competitors drive how many people buy from you. If your CEO was accused of harassment and it was a big news cycle, that impacts Purchase Intent. So do the other bits above. And… Marketing.

In a Brand Tracker, it is impossible to identify the causal impact of a campaign.

Wrong methodology for that.

The causal impact of a Marketing campaign on your chosen Brand KPI of Purchase Intent is measured using true Test < > Control Brand Lift Surveys.

Still… So many experienced (!!) Measurement Experts continue to use the wrong methodology.

[Note: Premium members, please activate the detailed advice in TMAI #356: Brand Measurement's Ladder of Awesomeness. Ping me if you can't find it.]

Last quarter, I saw a CMO dashboard where a client was reporting the same Brand Metric for each country, so it looked like you could compare apples-to-apples-to-apples. But, when I asked which Methodology was used, it all fell apart: each country used its own sweet approach.

Last month, it was an Expert in a large Asian country trying to persuade me that his measurement of Brand Lift with the Brand Tracker was ok because he was using “Pre-Post to isolate the campaign impact.” There is so much wrong with this, it is not even funny.

Last week, when someone vehemently tried to justify the use of Brand Trackers to measure Campaign-level impact because they were basing it on prompted ad recall… I burst into tears, that is how bad that idea is.

Here’s how I explained the poor choice of methodology:

A. I live in Camden. I can walk down to Kings Cross station, take the Eurostar to Paris, and be there in 2.5 hours.

B. I live in Camden. I can walk down to Camden Town station, take the Northern Line train to Tottenham Court Road station, switch to the Elizabeth Line, and disembark at Heathrow Airport station. From there, I can take a flight to Dakar, Senegal. Get off, buy a motorcycle, and cross the Sahara desert to Algiers. Then I jump into the Mediterranean and swim to Marseille. Finally, I walk to Paris. Even if I don't die along the way, imagine the shape I'm in.

Choose A.

Choose the simpler Methodology.

If you do, chances are significantly higher that you’ll identify causal performance. Which will lead to better decisions.

In both cases, the Metric would have the same name, Purchase Intent. That is what you have to watch out for.

Always, always, always ask for the Methodology.

And, torture the Methodology.

In my example above, you would use true Test < > Control BLS to measure my fav brand Metrics: Points of Lift, Number of Individuals Lifted, and Cost Per Individual Lifted (CPIL).
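
As a rough illustration of the arithmetic (the survey numbers below are made up, and real BLS providers weight and calibrate responses before this step), here is how those three Metrics fall out of a Test < > Control design:

```python
# Hypothetical Brand Lift Study results for Purchase Intent.
exposed_positive_rate = 0.34   # share of exposed (test) respondents expressing intent
control_positive_rate = 0.29   # share of control respondents expressing intent
target_audience_size  = 12_000_000
campaign_cost         = 1_500_000  # dollars

# 1. Points of Lift: the difference in intent between test and control.
points_of_lift = (exposed_positive_rate - control_positive_rate) * 100   # 5 points

# 2. Number of Individuals Lifted: the lift projected onto the reached audience.
individuals_lifted = (exposed_positive_rate - control_positive_rate) * target_audience_size

# 3. Cost Per Individual Lifted (CPIL).
cpil = campaign_cost / individuals_lifted

print(f"Lift: {points_of_lift:.0f} pts | "
      f"Individuals Lifted: {individuals_lifted:,.0f} | CPIL: ${cpil:.2f}")
```

Clean test vs. control math. Which is exactly why the quality of the test and control groups is worth torturing, as below.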

But, ask the BLS provider:

1. How many respondents were in the sample?

2. What was the response rate?

3. When did you offer the survey: after the first ad impression, after three, or at the end of the campaign?

4. How do you calibrate the audience members to ensure a representative sample?

5. What method do you use to extrapolate from the 225 responses to our 12 million target audience?

6. How did you ensure that you collected and established the baseline on your channel before the campaign started?

I have like a dozen more of these.

I ask them to ensure that, even when the platform is creating true test and control groups, the methodology is operating at super-high quality.

Analysis Ninjas spend time on this.

Methodologies matter the most.

The Super Awesome Methodologies Matrix.

Not everyone in an organization is at the same level of sophistication in relation to strategic analytics – that is A-OK.

In an earlier role, our CFO encouraged us to create something simple that the entire organization could use to check the quality of the results BEFORE they sent her dashboards claiming earth-shattering results.

It was a great challenge. The company had major operations across 20+ countries, and 14 distinct B2B, B2C, A2Z businesses.

What the CFO received were Metrics. But, I’d quickly decided that standardizing the Metric and the definition would not influence the behavior our CFO wanted.

Instead of standardizing Metrics, I chose to focus on standardizing Methodologies.

The result was a “matrix” that looked like…

[Image: ROI Methodologies Matrix]

On the Y-axis, the quality of the measurement being used.

On the X-axis, the quality of the inputs (a proxy for the scope of analysis).

The cell colors represented an explicit ask from the CFO:

I want to know who needs improvement, where we are industry average, where we are following industry standards, and where we are leading the entire industry by innovating.

Tough ask, no?

A lot of research went into creating it, a lot of debates about “wait, is that yellow or green?”, and a lot of inviting external experts to challenge our thinking.

By plotting each methodology in these cells, we were able to create something simple that solved both problems:

A. Identified clearly the quality of what was being reported today.

B. Showed a path for improving the current methodology (in case it fell in red or yellow).

Here’s a partial view to give you a sense for what it looked like…

[Image: ROI Methodologies Matrix - Partially Filled In]
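
If it helps to picture the underlying structure, here is a toy sketch of how each Methodology gets plotted and colored (the methodologies, axis values, and ratings below are placeholders, not the actual client matrix):

```python
# Each methodology is plotted by measurement quality (Y) and input quality (X),
# and assigned one of the CFO's ratings. All values here are illustrative.
methodologies_matrix = {
    # name                        (measurement_quality, input_quality,      rating)
    "Last-Click Attribution":     ("low",    "online only",        "red"),
    "Data-Driven Attribution":    ("medium", "online only",        "yellow"),
    "Brand Lift (Test/Control)":  ("high",   "online + offline",   "green"),
    "Portfolio Incrementality":   ("high",   "full business data", "leading"),
}

for name, (measurement, inputs, rating) in methodologies_matrix.items():
    print(f"{name:28s} | measurement: {measurement:6s} | inputs: {inputs:18s} | {rating}")
```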

Simple.

Super effective.

Everything except red is ok.

But, why would you want to rest in yellow? :)

We use it actively with our clients now, in the spirit of kaizen.

[Note: If you are a Subscriber to TMAI Premium, and would like to see the version with 26 additional Methodologies in the matrix above, just email me and I’ll be happy to share it.]

Bottom line.

Obsess about the Methodology.

It is the pinnacle of knowing what you are doing.

Carpe diem.
