Kill Useless Web Metrics: Apply The "Three Layers Of So What" Test

Data, data everywhere, yet nary an insight in sight.

Is that your web analytics existence?

Don't feel too bad, you share that plight with most citizens of the Web Analytics universe.

The problem? The absolutely astonishing ease with which you can get access to data!

Not to mention the near limitless potential of that data to be added, subtracted, multiplied, and divided to satiate every weird need in the world.

You see, just because you can do something does not mean you should do it.

And yet we do.

Like good little Reporting Squirrels we collect and stack metrics as if preparing for an imminent ice age. Rather than being a blessing, that stack becomes a burden, because we live in times of bright, lovely spring, and nothing succeeds like being agile and nimble about what we collect, what we give up, and what we deliberately choose to ignore.

The key to true glory is making the right choices.

In this case it's making the right choices about the web metrics we knight and send into battle to come back with insights for our beloved corporation to monetize.

A very simple test can allow you to figure out if the metric you are dutifully reporting (or absolutely in love with) is gold or mud.

It is called the Three Layers of So What test. It was a part of my first book, Web Analytics: An Hour A Day.

What's this lovely test?

Simple really (Occam's razor!):

Ask every web metric you report the question "so what" three times.

Each question provides an answer that in turn raises another question (a "so what" again). If at the third "so what" you don't get a recommendation for an action you should take, you have the wrong metric. Kill it.
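Since the test is really just a loop over three questions, it can even be sketched in code. Here is a minimal, hypothetical Python sketch; the function, metric names, and answers are all illustrative, not a real tool:

```python
def so_what_test(metric_name, answers):
    """Apply the 'Three Layers of So What' test to a metric.

    `answers` holds the response given at each round of "so what?"
    (up to three). The metric survives only if some answer is a
    concrete recommendation for action.
    """
    for layer, answer in enumerate(answers[:3], start=1):
        if answer["is_action"]:
            return f"KEEP {metric_name}: action found at layer {layer}"
    return f"KILL {metric_name}: no action after three layers of 'so what'"

# Hypothetical run: answers that never reach an action.
verdict = so_what_test(
    "Percent of Repeat Visitors",
    [
        {"text": "The trend is up month to month", "is_action": False},
        {"text": "We are a stickier website now", "is_action": False},
        {"text": "Umm... it is going up, right?", "is_action": False},
    ],
)
print(verdict)  # KILL Percent of Repeat Visitors: no action after three layers of 'so what'
```

The point of the sketch is only the shape of the logic: the loop ends the moment an answer is an action, and a metric with no action by layer three gets killed.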

This brutal recommendation is meant to force you to confront this reality: if you can't take action, some action (any action!), based on your analysis, why are you reporting the data?

The purpose of the "so what" test is to undo the clutter in your life and allow you to focus on only the metrics that will help you take action. All other metrics – those that fall into the "nice to know," the "highly recommended," or the "I don't know why I am reporting this but it sounds important" camp – need to be sent to the farm to live out the rest of their lives!

Ready to rock it?

Let's check out how you would conduct the "so what" test with a couple of examples.

Key Performance Indicator: Percent of Repeat Visitors.

You run a report and notice a trend for this metric.

Here is how the "so what" test will work:

"The trend of repeat visitors for our website is up month to month."

So what?

"This is fantastic because it shows that we are a more sticky website now."

(At this point a true Analysis Ninja would inquire how that conclusion was arrived at and ask for a definition of sticky, but I digress.)

So what?

"We should do more of xyz to leverage this trend." (Or yxz or zxy – a specific action based on analysis of what caused the trend to go up.)

So what?

If your answer to that last "so what" is: "I don't know… isn't that a good thing… the trend is going up… hmm… I am not sure there is anything we can do… but it is going up right?"

At this point you should cue the sound of money walking out the door.

Bottom-line: This might not be the best KPI for you.

Let me hasten to point out that there are no universal truths in the world (though some religions continue to insist!).

Perhaps when you put your % of Repeat Visitors KPI to the "so what" test you have a glorious action you can take that improves profitability. Rock on! More power to you!

Key Performance Indicator: Top Exit Pages on the Website.

[Before we go on please know that top exit pages is a different measurement than top pages that bounce.]

You have been reporting the top exit pages of your website each month, and to glean more insights you show trends for the last six months.

"These are the top exit pages on our website for the last month."

So what? They don't seem to have changed in six months.

"We should focus on these pages because they are major leakage points in our website."

So what? We have looked at this report for six months and tried to make fixes, and even after that the pages listed here have not dropped off the report.

"If we can stop visitors from leaving the website, we can keep them on our web site."

So what? Doesn't everyone have to exit on some page?

The "so what" test in this case highlights that although this metric seems to be a really good one on paper, in reality it provides no insight that you can use to drive action.

Because of the macro dynamics of this website, the content consumption pattern of visitors does not seem to change over time (this happens when a website does not have a high content turnover – unlike, say, a rapidly updating news site), and we should move on to other actionable metrics.

Here the "so what" test not only helps you focus your precious energy on the right metric, it also helps you logically walk through measurement to action.

Key Performance Indicator: Conversion Rate for Top Search Keywords.

In working closely with your search agency, or in-house team, you have produced a spreadsheet that shows the conversion rate for the top search keywords for your website.

"The conversion rate for our top 20 keywords has increased in the last three months by a statistically significant amount."

So what?

"Our pay-per-click (PPC) campaign is having a positive outcome, and we should reallocate funds to these nine keywords that show the most promise."


That's it.

No more "so what?"

With just one question, we have a recommendation for action. This indicates that this is a great KPI and we should continue to use it for tracking.

Notice the characteristics of this good KPI:

#1: Although it uses one of the most standard metrics in the universe, conversion rate, it is applied in a very focused way – just the top search keywords. (You can do the top 10 or top 20 or as many "head" keywords as makes sense in your case; just be aware this does not scale to the "mid" or "tail.")

#2: It is pretty clear from the first answer to "so what?" that for this KPI the analyst has segmented the data between organic and PPC. This is the other little secret: no KPI, at an aggregated level, gives us insights by itself. Segmentation does that.
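To make that segmentation point concrete, here is a small, hypothetical Python sketch (all keyword names and numbers are invented) showing how an aggregate conversion rate can mask the organic/PPC difference:

```python
# Hypothetical click/conversion counts per (keyword, channel);
# every number here is invented for illustration.
raw = [
    ("blue widgets", "ppc",     1200, 96),
    ("blue widgets", "organic", 3400, 102),
    ("widget store", "ppc",      800, 72),
    ("widget store", "organic", 2100, 63),
]

# The aggregate conversion rate hides the difference...
total_visits = sum(visits for _, _, visits, _ in raw)
total_conversions = sum(conv for _, _, _, conv in raw)
print(f"aggregate: {total_conversions / total_visits:.1%}")  # aggregate: 4.4%

# ...while segmenting by channel shows PPC converting far better,
# which is what supports a "reallocate funds" recommendation.
for keyword, channel, visits, conversions in raw:
    print(f"{keyword:12s} {channel:8s} {conversions / visits:.1%}")
```

In this made-up data the PPC rows convert at 8-9% while organic sits at 3%, an insight the single aggregate number could never surface.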

Key Performance Indicator: Task Completion Rate.

You are using an on-exit website survey tool like 4Q to measure my most beloved metric in the whole wide world and the universe: task completion rate. (You'll see in a moment why. :)

Here's the conversation…

"Our task completion rate is down five points this month to 58%."

So what?

"Having indexed our performance against that of last quarter, each one percent drop causes a loss of $80,000 in revenue."

So what? I mean, in the name of Thor, what do we do?!

"I have drilled down to the Primary Purpose report, and most of the fall is from visitors who were there to purchase on our website. The most likely causes are the call to action on our landing pages and a reported slowness in response when people add to cart."

Good man. Here's a bonus and let's go fix this problem.

Nice right?

Notice in this case you have an inkling of the top, super, absolutely unknown secret of the web analytics world: if you tie important metrics to revenue, that tends to get you action and a god-like status.

Keep that in mind.
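That revenue tie is simple arithmetic. A tiny sketch using the made-up numbers from the task-completion example above:

```python
# Hypothetical figures from the example above: task completion fell
# five points (to 58%), and each one-point drop costs $80,000.
drop_in_points = 5
cost_per_point = 80_000

revenue_at_risk = drop_in_points * cost_per_point
print(f"${revenue_at_risk:,} at risk this month")  # $400,000 at risk this month
```

A "task completion rate is down five points" report gets shrugs; a "$400,000 is walking out the door" report gets a meeting.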

So that's the story of the "so what" test. A simple yet effective way of identifying the metrics that matter.

This strategy is effective with all that we do, but it is particularly effective when it comes to the normal data puke we call the "management dashboard." Apply the "so what" test and you'll turn it into a Management Dashboard.

Closing Summary:

Remember, we don't want to have metrics because they are nice to have, and there are tons of those.

We want to have metrics that answer business questions and allow us to take action—do more of something or less of something or at least funnel ideas that we can test and then take action.

The "so what" test is one mechanism for identifying metrics that you should focus on or metrics that you should ditch because although they might work for others, for you they don't pass the "so what" test.

And killing metrics is not such a bad thing. After all this is the process that has been proven to work time and time again:

(Image: the web analytics metrics lifecycle process.)

More here: Web Metrics Demystified.

Ok now it's your turn.

Do you have a test you apply to your web metrics? What are your strategies that have rescued you during times of duress? What do you like about the "so what" test? What don't you like about it? Do you have a metric that magnificently aced the "so what" test?

Please share your comments, feedback and life lessons via comments.




  1. 1

    It reminds me of a customer service concept: how each piece of conversation with customers – even a phone message – should include:

    1. Position
    2. Action
    3. Outcome

    "Hi, this is Ric (Position) – please tell me about your issue (action), so that I can help you solve your problem (outcome)."

    In other words, our reporting of metrics should tell a story. If they don't tell a story, they are meaningless – they don't answer "so what".

  2. ElZarcho says

    "Let me hasten to point out that there are no universal truths in the world (though some religions continue to insist!)."

    Oh, there are a few universal truths; but I don't think Avinash is qualified to discuss metaphysics. I'd suggest leaving such little comments out of discussions of web analytics. They add no color or insight to the article, and you're clearly not well enough studied in either philosophy or theology to defend your point.

    • 3

      Actually….I love the bits of humor that Avinash adds into his writing. This humor is one of the things that keeps me actively engaged and entertained.

      ElZarcho you're obviously offended, which is something you should address within your own self.

      • 4

        I totally agree with you, Theresa.

        It's the touch of humor that draws me to Avinash's blog. Keeps the subject from getting too dry.

  3. 5

    Dear Avinash,

    As usual, Occam's Razor made the clear cut. Very good essay on actionable metrics.

    Recently we had a customer who presented us with a list of 120 different metrics from his site, headed "KPI List". I would be interested in the three-layers test on a list like that; in the end, some cases are hopeless, I guess (needless to say, we recommended trashing the entire list and starting over …)

    Warm Regards,

  4. 6

    I love this test! I'm not a numbers person so I look at my analytics and don't know what to do.

    This test helps me make sense of it.

    Thank you.

  5. 7

    It always pays to go behind the numbers and truly understand what they mean. "So what" is a great way to understand those KPIs. Additional insights can also be obtained by comparing KPIs across channels and questioning the difference in customer behavior. I guess it's all about asking questions.

    Avinash – Are you also saying that the analysis is more actionable and easily consumable when it is targeted?

  6. 8

    I remember the "So What" Test! I remember reading about it in your first book and thinking "Yeah that makes sense". This was on the plane to Google in October 2007. Wow that was 2 1/2 years ago already!

    My concern is when a higher-up / HiPPO just "wants" that Daily Unique Visitor count, or when that client just wants that list of the top 5 exit pages. The "so what" test would go something like this:

    "My Top 5 Exit Pages are the homepage, about us, contact us, services, and portfolio pages."

    So What?

    "We need to fix these pages ASAP because everyone is leaving our site!"

    So What? Everyone has to leave at some point!

    "Well, this is the report that I WANT!"

    So What? It means nothing and you can't take any action from it and it's just clutter!

    "Well…ummm….I just want it, OK?"

    I think that you have to pick your battles with the "So What" test. Or just turn a blind eye to it at some level. Sometimes stubbornness prevails, and we (the consultant / analyst / person who's getting paid by the customer) have to give and take when appropriate. However, the "So What" test should always, 100% of the time, be used internally and for you, personally.

    Unfortunately, from time to time, no matter how strong your evidence is, sometimes someone just wants something, even if it's useless :(

  7. 9

    I read your book Web Analytics: An hour a Day and I am glad to see that this method is still valid and won't go away anytime soon!;)

    Great post and always happy to stay up to date with your insights.

  8. 10

    Well stated. If the data isn't actionable AND profitable, then…so what?!

    I have found that ranking my recommendations in terms of profitability helps get traction with higher ups. My favorite discussion point is "why do you think this change will make us so much money?" At that point we can really start talking because I've got their attention.

    I may have to adjust my numbers, but the end result is always ACTION, as opposed to those blank stares we may be accustomed to seeing…

  9. 11

    Thanks for writing this – this is a fantastic post, and just about a week ago I was doing a search for your "three layers of so what" from your book to include on my blog's "reading list" page, and I couldn't find a post like this to include.

    One question: asking "so what" three times of one piece of data is a very high standard, and I understand that it encourages you to throw out unactionable data, but is there any wiggle room? Is two ever enough?

  10. Alice Cooper's Stalker says

    I completely agree with Joe's comments and the scenario he described. I was thinking about writing almost an exact duplicate response to your post, Avinash, before I saw his. Thanks for saving me some time, Joe.

    I think that the HIPPO standard statement (whether stated verbatim or implied) of 'I just want it!' can signify a couple of different things to an analyst:

    1. The HIPPO isn't knowledgeable about this area and is making uneducated decisions. (They don't read Occam's Razor)
    2. The HIPPO doesn't trust you or your recommendations. They do trust you to build a report for them.
    3. The HIPPO is thinking about using this data for something, but they are unwilling to share with you what the purpose of this data is so that you can contribute in the discussion of the data.

    I struggle with this and tend to believe I need to improve my persuasion skills and skills in educating my clients sometimes. If we think about it, there are tactics to handle each of the three items above…probably less for the third item.

    Going back to Avinash's comments around 'just because you can track everything doesn't mean that you should'….I completely agree. One of the things that I've been thinking about is the cost of tracking everything. If tracking of a non-key metric drops off, there is a cost to going back to determine why the tracking dropped off and getting your developers to add it back onto the site (especially true for event-driven tracking). If you have an event driven tag added to your QA environment, there is a cost of having your QA people check to make sure that the tag is still there and operating correctly with every release. So, I think that there can be a financial aspect brought into the argument of why it's bad business to track non-essential metrics, too.

    Alice Cooper's Stalker

  11. 13

    I feel like this is an approach I should take with my whole life, not just my web analytics.

  12. Israel Cefrin says

    @Ric Dragon
    Your phone message example was perfect. Teach clients that metrics must reveal actions to take.

  13. 15

    I tend to agree w/ #10 comment by Alice Cooper’s Stalker. Great comment!

  14. 16

    Awesome post, as usual.

    We tend to mix it up with "And then what?" or progressively shorter such as "Anndd???"

    It definitely is a good way to weed out the general curiosities or people who market in a vacuum and think their numbers are the most important to the entire company's well-being.

  15. 17

    Nicely put, Avinash! I completely agree with the approach. I am not sure I love the examples, though. Behind your narrative, I see some bias towards some metrics over others (perfectly natural :)). For example, you can imagine the following So What test for the Exit Pages metric:

    "These are the top exit pages on our website for the last month"

    So What?

    "These are pages where we lose potential customers who looked at our site but did not purchase our product"

    So What?

    "We need to do more customer research and to re-design our site to make it less confusing and show more value to potential customers so that they give us their money"

    Nice, actionable outcome after the third So What. In principle, this is a very reasonable approach. It is not, however, some valid and reliable "test" per se. :)

  16. MktConsult says

    Good article & comments. I believe, by definition, that a KPI should be linked to a company's business goals and strategy — otherwise it's just a tracking metric.

    Therefore, at the end of the "So What"s, there should be a feasible, meaningful action that ties to the business goals/objectives/strategies and improves the organization's ability to achieve them.

  17. 19

    Great post – Data that cannot answer "the so what test" is just data not insight.

    Automated reporting makes it easy to generate and distribute Management Info without thinking!

    In 1963 Peter Drucker was extolling in the Harvard Business Review the view "that each product, operation, and activity should be justified every two or three years."

    Maybe in 2010 this should be updated: every web analytics report should be justified every two or three months (if not more frequently)?

    What about getting recipients of MI to opt in on a regular basis? Switching off supply and seeing who shouts may be a bit scary, but it can also help analysts understand their users' requirements, helping ensure that the analytical tail doesn't wag the business dog.

  18. 20

    Very good way of analysing a piece of data that you don't understand fully.

    It reminds me a lot of the 5 Whys technique.

    You have a problem: ask "why" five times to get to the root cause.

    Never thought of applying the same technique to other data. :)

    Thank you! I'll be coming by this blog more often! :)

  19. 21

    Great post Avinash! It is critical to determine if a metric is relevant.

    We use a similar approach. We define the purpose of a metric/measurement as reducing uncertainty in a decision that has economic consequences. In deciding to use a metric, always answer the following:

    – What decision does it improve?
    – What uncertainty in the decision does the metric remove?
    – What is the economic value of that decision?

    The amount of uncertainty reduced is the quality of the metric. The economic impact is the value of the metric.

    Kind of like 3 “so whats”.

  20. Kristen says

    I have a similar question for those requesting the reports, "What do you want your visitors to DO?!" I especially like this one when people want to talk about the dreaded E word – Engagement. Too often, this is used as a cop-out because not enough time has been spent thinking through the success metrics.

    There are many things you want people to do – read an article, buy a widget, download a file, watch a video. Those are things I can measure. Engagement? Not so much.

  21. Ned Kumar says

    Great post Avinash – and advice worth following. Couple of my own thoughts on this:

    1) Don't kill a metric just because the answer to the "so-what" does not result in an immediate action. Sometimes you know what has to be done but because of organizational constraints (resources, structure, infrastructure etc.) you are not able to take immediate action on it. In those cases, I would (my opinion) use the so-whats to convince the HIPPOs and the organization to move in the right direction and continue to track the "non-actionable" metric.

    2) Sometimes the "so-what" implies removing a metric, but your boss or HIPPO wants to keep it. In these cases, you will have to learn to 'phase-out' the unwanted metric. By this I mean keep this metric for a while but slowly let folks see the value from the other more important metrics their direct influence on the bottom-line and/or customer experience.

    3) Wanting to report everything and anything when there is data available is obviously one of the human-weaknesses most of us have as Avinash points out. However, there is another aspect to the overload of data streaming in – sometimes you are so focused on using as much of the data coming in that you miss out on a really important signal. So it is good to also ensure your metrics incorporates all the key data elements (based on your company's/website's goals & objectives) in addition to pruning the metrics using the "so-whats" framework.

  22. Oded Greenberg says

    The "So What" test is a great tool to validate ideas and thoughts.

    Basically, the idea standing behind this article is the perception of QA.

  23. 25

    What a coincidence…

    A company wanted to buy our product… so we had a meeting. Unfortunately we could not provide them any metric they were interested in. We could only provide metrics for an operational activity (improvement, ab tests, user error/form ratio, etc.)

    High-level metrics which made no sense to us suddenly are more relevant.

    So even if they are not useful **now**, I would strongly suggest collecting the most common ratios which can be used to make a high-level picture of the product outside the product.

  24. 26

    Ric: I used to work for DHL in Saudi Arabia and did my fair share of answering customer service calls. Your wonderful comment took me back to that memorable time.

    Joe: One can't help it if the person writing your cheque adamantly wants something. Arguing with them is futile.

    The optimal strategy is to give the baby with diabetes the candy it wants, while you figure out how to also get the baby interested in things that are good for the baby. : )

    At the end of the day you have to make a choice. If you give the Client/HiPPO what they want (what you know won't work), then you are not going to be adding value to the Client or the company. Two choices: 1. Take advantage of it as long as there is money to be made from this gig (as a Consultant or an in-house Analyst). 2. Find another client/job where you can add value with data.

    I am not trying to be glib with those choices. I know sometimes we have to choose 1 and some times 2.

    Justyn: You are wise beyond years to emphasize outcomes. There is no better way into a HiPPO's heart (and the company's increased bottom-line!).

    Alex: Aw come on! What is a man without his biases! :)

    I have to admit I have never found top exit pages to be actionable, ever (except in structured experiences like cart-checkout or lead submission etc). But I did end that section of the post by saying that your experience may be different, and you have provided all our blog readers with one such perfect example.

    Thanks for adding this use case. I appreciate that very much.

    Stef: Very very wise advice. In processes I have implemented I have put in place a check point for metrics to go through the process (last picture in the post) once a quarter. Pretty much the time frame you have recommended.

    I do not recommend your "wild" idea of turning things off (or asking for opt-in every x months). That is dangerous. They might not miss the reports!!! :) The first time I turned off the WebTrends server, early in my career, no one missed a single one of 200 reports. I cried for three days. Then I picked myself up and decided to try really, really hard not to be in the data puking business if I can help it.

    Scout: I love this framework as well, three "so what's" but with a much more nuanced level of specificity for Analysis Ninjas. Fantastic!

    Ned: If we follow #1 my fear, perhaps unfounded, is that we will never be able to get rid of anything from our dashboards / things we should focus on.

    I was at a very large company last week (think: operates in 120 countries) and their dashboards and reports are full of "stuff". They call "stuff" KPI's but nothing has ever been changed based on "stuff".

    Now some of these are actually good, they simply can't action them due to exactly what you said in #1 (and so totally not the fault of the Analysts/Directors). My advice to them: Kill it. Go focus on what you can actually change, and that way you focus, perhaps rack up a few victories (even if small) and maybe one day you'll get ideal resources / IT support / whatever. Then these other "stuff" metrics will still be around for you to analyze.

    But until that day, move to what you can actually change (because life is too short! :)).

    Your advice #3 is something every Ninja should tattoo on an arm for permanent easy reference!


  25. Anonymous says

    Here is a great article about the 5 Whys root cause analysis technique. It is one of the pillars of driving change in an organisation: treat the problem at the root, not just the symptoms. I discovered it when learning about Agile software development techniques like Extreme Programming and Scrum.
    As you'll see, it is very similar :)

  26. 28

    To Joe and Alice's point (which was running through my mind as I read the post), what I've found works is two things:

    * Make sure it's virtually zero-cost to pull the non-valid KPIs

    * Use the power you have of formatting/presenting the data to make this information "available" but not part of the true KPIs

    So, you can lose the battle but win the war — include that data on a "second page" of the dashboard; call those measures "drilldown data," "supplemental measures," or "appendix."

    That seems to be a happy compromise where the HIPPO feels like he's "won" — he's still getting the data (but we know he's looking at the actionable data first). It's a good way to build trust over time so that the "so what" discussion actually is a collaborative exercise.

  27. Ned Kumar says

    I hear you loud and clear :-) – and agree with you that following my #1 could possibly result in clutter.

    The reason for my stating this explicitly was:

    a) I felt that the word "immediate" has to be evaluated in your context before you kill a metric. For example, say it is December and you have an action item, but your bosses say there is no funding or resources till next fiscal year (June). Yes, that is six months down the road, but if you can elicit a commitment that you will be given the go-ahead then, you might want to keep your metric. Anyway, just a thought.

    b) The second reason for my statement was that I felt there should be a balance between "killing a metric" vs "measuring worthwhile/meaningful metrics (#3)". I do have a concern about folks 'killing' all the good stuff saying nothing can be done at this time and keeping metrics that are less relevant (exaggerating here to make a point).

    Having said that, I totally agree with you that dashboard clutter is one thing that should definitely be avoided. At the end of the day, I definitely cannot advise someone on what they should keep or kill – like you always say, "Context is the King" :-)

  28. 30

    "Nary an insight in sight" also happens because of the bummer with analytics: it confronts us with reality, and our tendency is to turn a blind eye. "Tomorrow …" It's so comfortable inside our comfort zone. Been there, done that.

    Reminds me of the lion asking the monkey, "Who is king of the jungle?" "You are!" the monkey replies. Next he asks a snake, "Who is king of the jungle?" "You are!" the snake says. Next he asks a grazing elephant, "Who is king of the jungle?" No answer. He asks again, "Who is king of the jungle?!" No answer. He bites the elephant in the leg: "Answer my question!" The elephant turns around, grabs the lion with his trunk, swirls him around, slams him onto the ground, and for good measure kicks his behind. Shaken, getting back onto his feet, the lion says, "You are only angry because you didn't know the right answer!"

  29. 31

    Hey Avinash!
    Another great post. I think the main thing to realize here is that success is always a moving target. Constantly churning your KPI will help you find the metrics to move forward even faster.

    I've been doing the same with my Google Reader account lately. I've been tweaking the advanced Twitter searches a little each week to further zero in on the types of conversations I need to listen to. More signal, less noise is the goal each week. It's been great for VoC!

  30. 32

    Whilst I agree with your "so what" test and often apply it, I'm not sure I agree with one of your examples.

    In this example you establish there is a drop in the completion of a given task from a previous time.

    Without hesitation you jump to a related but different report and note a fall in purchases. You then leap to some indirectly connected figures/reports regarding calls to action and sluggish adding to cart.

    This is all very top-down, and it is not at all obvious that you're going to spot why you're losing that 80K of revenue.

    Surely a bottom-up approach in that example is much better than the top-down KPI approach.

    1) Have your calls to action changed? (They must have done, as you state in your article that they are a likely cause; otherwise why would the conversion rate have dropped so dramatically?) If they changed, and it's something likely to be significant enough to cause an 80K drop, why wasn't someone tasked with analysing what effect the change in call to action was having? Surely this is a governance issue – let's spend some of the 80K you're about to save on that.

    2) If there was a report of slowness in response when people add to cart, why would that not be the first place you look for problems, to assess what the problem and the cost of the problem might be? In particular, why would it be any different from last quarter? Probably because someone changed it; again, this is a bottom-up approach rather than top-down. Ok, let's spend some of the 150K we'll save by firing the lead developer and QA specialist and fix whatever we screwed up last month :-).

    You don't need a "so what" in these circumstances, as you should be working bottom up – you have in place the checks and balances to respond to issues and potential new issues and can then evaluate the impact these have on the bigger picture.

    They are not "so what" but rather "what's the impact?" Both are driven by value/revenue, but one is driven by the cause of the problem, while with the other you are driven by the problem with little direction to the cause.

  31. 33

    Hi Avinash – I read your book – Web Analytics 2.0.

    I liked clarity in this blog post as to how to think about the business goals/outcomes while looking at the analytics.

    I have a question –

    1. How do I measure the performance of repeat visitors vs. first-time visitors in the website funnel? In other words, how do I know from the analytics report which conversions were from first-time visitors and which were from repeat visitors?
    2. The goal/target for a first-time visitor could be different from when someone is visiting the website for the fifth time and is closer to the ultimate business conversion. Is there a way to differentiate that and measure it using analytics?



  32. 34


    The "Three Layers Of So What?" is very reminiscent of the "5 Whys" method used in Lean/Six Sigma. Your "Analytics Metrics Lifecycle Process" is also much like the DMAIC process. These are good things, mind you – more businesses should be using such practices, however they're fashioned.

    Love your blog.


  33.

    Jason: In my first book I had covered DMAIC, and I am sure in creating the virtuous cycle that my learnings in Six Sigma (small as they are) helped influence the diagram you see in the post.

    Process is very important for any practice that wants to be mature. We should do all we can to take the good parts of Six Sigma and use that in our practices.

    I am afraid I had not heard of 5 Why's but some people have shared links about it and I think I rather like 5 Why's! : )


  34.

    Question – If a user searches for a company, clicks on the link on the results page but the site is down, does it count as a direct visit in Google Analytics?

  35.


    I really enjoyed this post. I think that in today's environment, less can definitely be more – especially when the "less" is well thought out, planned and has a clear vision driving it. Metrics for the sake of metrics is often a circular chase that clients and even consultants can find themselves in – taking a step back is so important in these situations.

    Oftentimes, when implementing analytics solutions, you find that the people who could really drive the questions about "why" they want something in a solution step back and delegate to others who only know that they need to have it. That's their edict and boy – it's gonna get in there by hook or by crook. And that is usually when you find unused metrics post-implementation, metrics that "muddy" the waters and add to the complexity of a potentially simple story.

    Well put.

  36.

    I teach my telemarketing agents a similar tactic. Often (mostly!), our level of thinking skims along the surface and, without effort or prompting, does not progress any deeper. So, when a prospect mentions something interesting- but just in passing- we use parroting, paraphrasing, and feeling-feedback to elicit more in-depth thinking. The goal being, that after repeating this process two or three more times, it begins to dawn on the prospect that perhaps what appeared to be a minor irritation could have significant consequences down the road; that action is required now.

    Of course, you could also end up at a dead end (it helps to know the answer in advance!).

    You could apply this technique with a HIPPO who insists on an irrelevant metric. That way, it isn't you that is pushing a particular point of view, but the HIPPO that arrives at it through his own reasoning powers.

  37.

    Kiran: I think we find ourselves in the soup we do because for the longest time in the decision making business (think traditional BI) we could never get the key data we needed to make decisions. More metrics were a sign of success (well at least for BI teams).

    Now you slap the SiteCatalyst tag on a site and you have more data than God intended you to have. Yet because of the evolution thus far we keep hoarding metrics as a badge of honor. Sad.

    I am sure the tide will turn. Enough hoarders have to fail first! :)

    Gene: Excellent suggestion on using this with HiPPOs, Gene. Though we'll probably have to be more tactful than asking "why" or "so what". Asking a HiPPO "so what" a couple of times might end up being a career-limiting move (even if it is absolutely positively the right thing to do)! : )

    Perhaps we can hide our inquiry under something impressive sounding, like saying "would you please help me create a fishbone diagram from the metric to the outcome you are helping the company drive." That would be cute! : )

    I mentioned somewhere in the comments on this post that I used to work for DHL in Saudi Arabia and I recall my trainer emphasizing the technique you highlighted in your comment. "Make sure you listen carefully and probe when something, no matter how small, raises a flag."

    You are absolutely teaching your agents a strategy that is quite effective.


  38.

    …absolutely agree that you wouldn't want to ask the HIPPO "so what?" three times in a row. LOL.

    I meant to say that the probing techniques I outlined could be used to explore why a HIPPO wanted a particular metric.

    But, I like your fishbone diagram idea even better!

  39.
    goodnewscowboy says

    @ElZarcho – re: "Oh, there are a few universal truths; but I don't think Avanish is qualified to discuss metaphysics. I'd suggest leaving such little comments out of discussions of web analytics. They add no color or insight to the article"

    I respectfully disagree. The comments you suggest Avinash leave out, are the very comments that give his writing the color, and dare I say it, the soul it possesses.

    He is writing about Web Analytics. Not theology. You need to be more discerning when digesting his writings. I'm just sayin'

    @Avinash – I've come late to the party, and everyone has said all there is to say above, but allow me to add this. I ALWAYS enjoy reading your posts. You have a very unique writing style that both entertains and enlightens.

  40.
    Michael Whitehouse says

    Strong stuff, as always. I must admit, however, that I feel some discomfort about the discussion of task completion and primary purpose (What? a former iPerceptions marketer uncomfortable with these metrics? Sacrilege!) I'm only jumping in because I've seen website owners fall into a trap by relying on a simplistic measurement of what is in fact a quite complex process. They optimize on the basis of negative feedback from putative buyers, and they find themselves frustrated when the needle doesn't move.

    The more I read up on cognitive neuroscience, the more I have doubts about the reductive power of primary purpose. There is a multiplicity and a fluidity to visitor intent that a single-response primary purpose question simply does not allow for. The truth is that I may visit an e-comm site to accomplish a whole basket of things (price quote, shipping quote, product visuals, ratings & reviews), and my primary purpose may vacillate throughout the session in response to what I see onsite – conversion funnels aren't always as neat as they look on paper. In reducing this complexity to a single question, there's the danger of acquiring precision (i.e. simple and reproducible primary purpose segments) at the expense of accuracy (i.e. the true, multimodal map of a visitor's intent).

  41.
    Michael Whitehouse says

    As a follow-up to my comment above, consider this very interesting quote from Les Binet of DDB, published in February's issue of Research Magazine:

    "What’s coming out of this newer view of science is that, first of all, people by and large are not nearly as rational as they’d like to believe they are; secondly they don’t understand their own behaviour very well; and thirdly they don’t behave in an individualist way – a lot of behaviour emerges from group behaviour. So any market research technique which is based on asking individuals how they behave about something might be a bit flawed.”

  42.

    Michael: I appreciate the feedback.

    Like all things in life one piece of data by itself won't provide all the context we need to make the kinds of decisions that are required. So we'll need to triangulate multiple sources (Multiplicity!) to ensure that we have all the context we need.

    Hopefully at the end of that we'll find that Visitors to our site are less complex than you are implying! And if they are…. oy vey!


  43.

    @Ric: I will remember your phone message example, thanks for sharing.

  44.

    Thanks for an excellent post! It seems to be a great way for evaluating existing metrics, and I'm going to apply it in my work.

    A question I have is, once I kill all the useless existing metrics, how do I then come up with new useful ones? Suppose I know the ultimate goal, for example "revenue". How do I go about defining metrics that would help highlight what actions can be taken to increase revenue? Is it an "art", or is there a systematic approach one could take?


  45.

    Pavel: Happy to help….

    Here is a blog post that provides a framework (and rules) you can use to identify metrics important to your business (and yes they are usually unique to you):

    Web Metrics Demystified

    Here is a blog post that shares some of my favorite key performance indicators:

    Six Web Metrics / KPI's To Die For!

    Focus on why I chose them and less on what they actually are.

    Finally in this blog post I shared metrics for a spectrum of different kinds of businesses, you might find some inspiration:

    Excellent Analytics Tip #13: Measure Macro AND Micro Conversions.

    I have listed each post by priority, in case you are pressed for time.


  46.
    Sudhir S says

    Hi Avinash,

    From a PPC perspective, why not scale the KPI "conversion rate" to the mid or long-tail keywords, for the following reason:

    Dynamic nature of the web – what are the top/head keywords now could very well become low-performing / mid / long-tail keywords in the near future, and vice versa. This can be tested by taking a year's historical data (where available) from the keyword-level report. If it shows the above to be true, go ahead and:

    – Segment campaigns by conversion (rate) buckets, say > 20%, 10 – 20%, 5 – 10% and < 5%, and
    – Move keywords which satisfy the above criteria back and forth across campaigns. Yes, this would be a tedious exercise, with periodicity based on insights, but it would result in:

    * keyword replacements or additions to the head, which would also aid long-term SEO efforts via the PPC-to-organic keyword replenishment cycle
    * cost-effective campaigns and
    * positive ROI

    This would also give useful insights into:

    – user search preferences
    – frequency of occurrence and
    – time period(s) of occurrence

    Also, wouldn't the absolute number of conversions be a better KPI than conversion rate? A keyword delivering 2 conversions out of 3 clicks shows a 67% conversion rate. If this is bumped up to, say, 10 conversions out of 20 clicks, that would indicate better performance in absolute terms.
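    The bucketing step described above can be sketched as follows. The keyword data and band thresholds are illustrative only (the thresholds mirror the bands suggested in the comment); no ad-platform API is involved.

    ```python
    # Hypothetical keyword-level report: clicks and conversions per keyword.
    keywords = {
        "blue widgets": {"clicks": 120, "conversions": 30},            # 25%
        "buy blue widgets online": {"clicks": 20, "conversions": 3},   # 15%
        "widget reviews": {"clicks": 200, "conversions": 14},          # 7%
        "what is a widget": {"clicks": 300, "conversions": 6},         # 2%
    }

    def band(rate):
        """Map a conversion rate to one of the suggested buckets."""
        if rate > 0.20:
            return "> 20%"
        if rate > 0.10:
            return "10 - 20%"
        if rate > 0.05:
            return "5 - 10%"
        return "< 5%"

    # Group keywords by conversion-rate band so each band can map to a campaign.
    buckets = {}
    for kw, stats in keywords.items():
        rate = stats["conversions"] / stats["clicks"]
        buckets.setdefault(band(rate), []).append(kw)

    for name, kws in buckets.items():
        print(name, kws)
    ```

    Re-running this periodically and moving keywords whose band changed is exactly the "back and forth across campaigns" step; the trade-off between rate and absolute conversions raised at the end is why both numbers are kept in the report.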


  47.

    Wow, what an excellent post. You have given me some excellent ideas for marketing and tracking the results better for my site: 1300 numbers.

  48.

    I realize that this is an older blog post and I'm late entering the conversation. Your blog borderline makes me want to cry because I don't really understand metrics and I definitely don't even know what my top exit pages are, and yet I have Google Analytics installed on my blog. But the very refreshing part of this blog post is that it reassures me that it is OKAY not to be hung up on random metrics.

    So what? For me, my goal is to get people to read what I write. When I wake in the morning and look at my analytics, I jump to the number showing how long people are staying on my site – how many minutes they are there. I look at absolute unique visitors too, but a raw number about visitors isn't very interesting. If no one is staying long enough to read a whole blog post, well, that's just a disappointment.

    Thanks for the intriguing blog post. Oddly reassuring that it is okay to just say, "So what". I realize your audience is large sites with pecuniary professional interests, not tiny little hobby bloggers, but still this was a great read for me.

  49.

    "The trend of repeat visitors for our website is up month to month."

    So what?

    "This is fantastic because it shows that we are a more sticky website now."

    Not really… if the percentage of repeat visitors is up month to month, it may also mean that we are not attracting enough new ones… not so fantastic after all.
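    A quick numeric illustration of this point: the repeat-visitor share can rise simply because new-visitor acquisition falls, with no change in stickiness at all. The numbers below are made up.

    ```python
    # Two months of hypothetical traffic: repeat visitors are flat,
    # only new-visitor acquisition drops.
    last_month = {"repeat": 400, "new": 600}
    this_month = {"repeat": 400, "new": 400}

    def repeat_share(month):
        """Repeat visitors as a fraction of all visitors."""
        return month["repeat"] / (month["repeat"] + month["new"])

    print(f"Last month: {repeat_share(last_month):.0%} repeat")  # 40%
    print(f"This month: {repeat_share(this_month):.0%} repeat")  # 50%
    ```

    The share "improved" from 40% to 50% even though not a single extra visitor came back, which is why the trend fails the "so what" test on its own.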

  50.

    Great post! I'm so tired of tracking metrics that essentially mean nothing.
