There are lots of different tools, many of which measure the same thing, the numbers don't tie, there is confusion and a lack of trust and… well, I am getting ahead of myself.
The story starts with an email asking for thoughts on an issue a lot of us face on a daily basis.
Why don't I let the original author tell the story…
Can I ask you a question? I'm fighting mightily to make HBX able to measure marketing channels—pay per click, organic, email, banner ads, and onsite merchandising. But individual channel managers simply do not trust the data.
I think that the data overall is damn accurate—total conversions by product is within 10% of in-house measurement. I have a sneaking suspicion that web analytics tools are NOT the best measurers of true conversion by channel.
The data never matches what they get from their pay per click tool, their email tool etc. They blame the web analytics tool, when I think it's just that they are expecting something that can't be done.
Have you had any experience with this phenomenon, and do you have any advice on how to dive in and diagnose it?
This is a tough problem. Very real, very tough.
[Before we get too deep, I wanted to highlight that this is as much a data problem as it is an organization problem.
For now the Web team is usually in one silo, Acquisition tends to be in a different silo, and furthermore it is not uncommon for different "Channels" to sit in their own homes that may or may not work / collaborate with each other. The corrosiveness that organization structure can cause is often underestimated.
The data gets blamed and, worse, any possible positive outcomes don't, well, come out. I wanted to stress this important element.]
So what to do?
Perhaps the most important thing to realize in this case is that your web analytics tool is often not measuring the same thing as your offsite tools (for example: PPC, banners, affiliates, etc.). Even success (conversion) is not measured the same way.
Examples of issues causing differences:
# 1 : Often many "off site" tools measure impressions (completely missing from web analytics tools) and clicks. For the latter, if you code correctly then clicks come through to your web analytics tool, but remember that there you are usually measuring "visits" and "visitors" and not clicks.
# 2 : A good example is conversion. A PPC / banner / affiliate cookie is usually persistent and will "claim" sales even after ten visits by the customer, including scenarios where the customer might have come back repeatedly through different channels (first paid, then organic, then banner, then paid again, then affiliate, then… you get the idea).
Most web analytics tools will credit the first channel or the last one, creating one more bone of contention with "off site" tools.
[As a bonus in this scenario, #2, your paid search vendor and your banner vendor and your affiliate vendor all "claim" credit for conversion even though it was just one customer that converted!!]
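To make the double-counting concrete, here is a minimal Python sketch. The channel names and the single-customer journey are invented for illustration; it contrasts how each vendor's persistent cookie "claims" the same sale while a web analytics tool credits only the first or last touch:

```python
# One hypothetical customer's journey across channels, in order of visits.
journey = ["paid_search", "organic", "banner", "paid_search", "affiliate"]
converted = True  # the customer eventually buys

# Each vendor's persistent cookie claims the sale if its channel appears
# anywhere in the journey: three tools, three claimed conversions.
vendor_claims = {ch: int(converted and ch in journey)
                 for ch in ["paid_search", "banner", "affiliate"]}

# A typical web analytics tool credits only one channel: first or last touch.
first_touch = journey[0]    # credited under a first-touch model
last_touch = journey[-1]    # credited under a last-touch model

print(vendor_claims)                # each vendor claims 1 conversion
print(sum(vendor_claims.values()))  # 3 conversions claimed for 1 sale!
print(first_touch, last_touch)      # the one channel web analytics credits
```

So three offsite reports and one web analytics report can all be internally consistent and still disagree with each other, which is exactly why the numbers never tie.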
My approach to solving this is really quite simple. Five easy-to-execute steps:
Step One: Understand what each tool is actually measuring and how it is doing it. This means really getting down and dirty with the data and the vendors and pushing them hard to explain to you exactly what they do. If you are not a smidgen tech savvy then take your friendly IT Guy with you.
Step Two: Document. Create a slide / email outlining all the reasons why the numbers don't tie. Be intelligent, be creative. Here are two slides I created a long, long time ago, a real blast from the past…
The slide above provided a summary of why the two data sources were providing numbers that were 30% off. As you can imagine, it took a lot of work.
This slide explained one of the bullet items (1.II.b): how each tool dealt with sessionization when it came to search engines (and with the increasing dominance of search engines this turned out to be a big problem).
Notice the "pretty picture": it shows a very complex process with great clarity.
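The sessionization difference the slide describes can also be sketched in a few lines of Python. This assumes the common 30-minute inactivity timeout, and the click timestamps are made up: the ad-side tool counts every click, while the web analytics tool groups those clicks into visits.

```python
from datetime import datetime, timedelta

# Hypothetical click log for one visitor. Every ad click is a "click"
# to the PPC tool, but the web analytics tool sessionizes them.
clicks = [
    datetime(2009, 5, 1, 10, 0),   # click 1
    datetime(2009, 5, 1, 10, 10),  # click 2, ten minutes later: same visit
    datetime(2009, 5, 1, 15, 0),   # click 3, hours later: a new visit
]

def count_visits(timestamps, timeout=timedelta(minutes=30)):
    """Sessionize timestamps: a gap longer than `timeout` starts a new visit."""
    visits = 0
    last = None
    for t in sorted(timestamps):
        if last is None or t - last > timeout:
            visits += 1
        last = t
    return visits

print("Clicks reported by the ad tool:", len(clicks))             # 3
print("Visits reported by web analytics:", count_visits(clicks))  # 2
```

Same customer, same day, and the two tools legitimately report different numbers; neither is "wrong", they are just counting different things.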
Step Three: Educate your users (in this case, the data skeptics). Do dog and pony shows. Present to the senior management teams, or anyone who will touch the data. Leave them easy-to-reference presentations such as the slides above. Make sure that your audience is now smarter than you are, because that will build trust (both in the data sources and, more importantly, in you).
Step Four: Start to report some high-level trends between the tools, while doing your best to hide / remove the absolutes from any report / dashboard. Especially in comparisons, one month won't be good enough and it might even be distracting.
It is not uncommon for me to index the numbers in some way and compare trends, rather than show the raw numbers. So xxx campaign / program went up in comparison to others by xx points (or xx percent), etc. This takes the focus away from the numbers.
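As an illustration of the indexing idea, here is a small sketch with made-up monthly conversion numbers from two tools. The absolutes disagree by roughly 35%, but once each series is indexed to its first month (first month = 100) the trends line up and the conversation can move to direction rather than discrepancies:

```python
# Monthly conversions for one campaign from two tools that never tie
# in absolute terms (hypothetical numbers).
web_analytics = [400, 440, 520, 480]
ppc_tool      = [550, 600, 710, 655]

def index_to_first_month(series):
    """Express each month relative to the first month (first month = 100)."""
    base = series[0]
    return [round(100 * v / base) for v in series]

print(index_to_first_month(web_analytics))  # [100, 110, 130, 120]
print(index_to_first_month(ppc_tool))       # [100, 109, 129, 119]
```

Both tools now tell the same story: up about 10 points, up another 20, then down about 10.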
It will take a while for them to get comfortable with the numbers, depending on the organization's complexity, its politics, and how good a job you have done in Step Three.
Step Five: Pick your "poison".
This is a key step. You need to wean your organization from relying on multiple tools for the same data. The goal is not to kill different tools (unless they are utterly redundant) but rather not to have the same metric reported from different tools.
You need to pick the best tool for the best metric.
It is ok to let your Acquisition team use DoubleClick / Atlas / Whatever to report end to end on your banner / display / digital media campaigns (make sure they are following customers through conversion).
Ditto for your Pay Per Click (PPC) / Search Engine Marketing (SEM) team. If you are big then your agency probably has lots more data in its own possession than you could have, and they probably want to move faster than you could. Don't compete with them.
Some web analytics tools do promise that you can bring in any data and merge it with your web analytics tool and it will work wonderfully. Sometimes it does, often it does not.
Just pick your poison: the best, cleanest possible source of data for each job, and use it.
If you are the Web Analytics person should you just quit and go home? No.
Focus on data that you have from your web analytics tool that they (other tools / teams) don't have.
For example: What is happening on the site after these skeptics deliver Visitors to your site? Use site overlay, bounce rate, content value, testing: all things they can't do and don't have access to, but that in the end help them understand what is going on on the site and how they, the skeptics, can make more money from your data.
Most siloed tools (PPC, Banners, Affiliates, etc.) can't actually do deep site analytics, so forget the tactical (they will do that better) and focus on the strategic. Rather than becoming combative, be "collaborative".
You do that and you have your hook! No one will argue with you about differences in data.
In summary: While this is a tough problem that we all face, it is possible to bring some sanity to the existence of multiple data sources. It is not possible, for now, to have just one tool for all your needs, but if you follow the above five-step process then you'll be able to bring out the best in each and benefit from it.
Ok, now it's your turn!
Please share your perspectives, critique, additions, subtractions, bouquets and brickbats via comments. Thank you.