Excellent Analytics Tip #14: Measuring Value of Ecommerce Sales Tools

An Analysis Ninja, let's call him Philip Walford, asked a delightful question. Philip wanted to know if the impact of a faith based initiative in his company, product demo videos, could actually be measured using data.

Hurray!

Faith is good. Data is better. : )

[And before you flame me: know that I love my religion more than you love yours. Wait. That did not come out right. Let me rephrase that.]

In this thanksgiving week 2008 post I'll share Philip's question about how to identify value of video product demos on an ecommerce site, and my answer about involving customers.

Here's Philip. . . .

We are a large retailer with a lot of product on our site. In the past we have invested lots of dollars and time producing things like demo videos for our products, or adding other features and tools to our website to provide more information about a product. Our goal is to inspire customer confidence in their purchase (by giving them as much information as possible).

The question is, what are the KPIs of things like a demo video?

[Image: video product demos]

My recommendation was to measure conversion rate for the segment that views the video. If conversion is higher, then the videos are bringing value. Others in my company have presented the hypothesis that only customers who are already strongly invested in buying the product are likely to click on the video link, and are hence "pre-qualified", so that segment would have had a higher conversion rate regardless.

I understand their perspective, but I feel they are reading too much into the situation, and I don't know how to argue this point. There are several directions we could go with this, but I wanted to see if you could share some guidance on this issue.
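To make Philip's starting point concrete, here is a minimal sketch of his recommendation: segment sessions by whether the demo video was viewed and compare conversion rates. All session data below is invented for illustration.

```python
# Sketch of Philip's recommendation: segment sessions by whether the
# demo video was viewed, then compare conversion rates.
# All session data below is invented for illustration.

def conversion_rate(sessions):
    """Fraction of sessions that ended in a purchase."""
    if not sessions:
        return 0.0
    return sum(1 for s in sessions if s["converted"]) / len(sessions)

sessions = [
    {"viewed_video": True,  "converted": True},
    {"viewed_video": True,  "converted": False},
    {"viewed_video": False, "converted": True},
    {"viewed_video": False, "converted": False},
    {"viewed_video": False, "converted": False},
    {"viewed_video": False, "converted": False},
]

viewers = [s for s in sessions if s["viewed_video"]]
others = [s for s in sessions if not s["viewed_video"]]

print(conversion_rate(viewers))  # 0.5
print(conversion_rate(others))   # 0.25
```

A higher rate for the viewer segment is suggestive, but, as Philip's colleagues point out, it does not by itself rule out self-selection.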

My answer to Philip. . . .

This is a complex problem, more than might be apparent on the surface.

It is also an example where it can be easy to jump into bed with your web analytics tool to get satisfaction but you wake up in the morning feeling. . . . well. . . . less than satisfied.

But before we go there I have to give a ton of credit to Philip and his crew for being skeptical of reading too much into their own opinions or biases.

I firmly believe that people who work for a company rarely (never!) represent customers. They are too close to the company and too different.

Just because I work for Microsoft and use a Zune (yes I do!) does not mean I can be an effective customer representative of Microsoft Zune customers. Company employee opinions rarely reflect those of customers. Do please be aware of that.

So when looking to make decisions, look for data (quant or qual).

I'll present Philip with three solutions / options as he battles the challenge of figuring out if the investment of muchos dineros in creating product videos is worth it (besides the fact that these videos ooze sexiness!).

1) Use ClickTracks (Compute Contextual Influence)

There are two challenges with using clickstream data and the "typical" measure of conversion rate to determine success.

A] You might be looking at a "biased" segment (as challengers to Philip's recommendation mentioned), i.e. only the highly motivated people.

B] By comparing all people who converted and viewed the video with those who converted and did not see the video, you are not comparing fair segments. You are also lumping all the other "convince our visitors to buy" tools into one large bucket: tools like comparison charts, product screenshots, product information, customer reviews and more.

It is quite possible that those other tools might be getting people to convert at a much higher rate and by dumping them all together you are not being fair.

And of course you'll get a wrong read on conversion impact of the videos.

So even if you use your web analytics tool (your Google Analytics or Omniture or WebTrends or CoreMetrics or whatever), try to compute "contextual influence" (the value of each feature in the context of the others).

It is actually very hard (damn near impossible) to do this in all those tools (even for the Paid solutions, even after you plunk down half a million dollars for the mandatory Data Warehouse "add on").

ClickTracks is the only tool I know of that can do this out of the box, using its terribly named "funnel report". No data warehouse. No extra tags or variables or sprops or wt_&*#$. In fact, not even much IT; I just need admin access to my tool (not the site, the web analytics tool).

It's easy to use. Create a hierarchy of your website. Add individual pages or groups of pages into each stage (notice I did not say step, because you can jump stages here). Add an outcome (in my case, say, the "Thanks for placing your order" page). Click Calculate.

Boom!

[Image: ClickTracks funnel analysis]

[You are not supposed to be able to read the analysis, sorry, privacy dictates that.]

I want you to note two things.

First, this is a site where each stage represents a view of the site (and, like a traditional funnel, you can see how many people get in, get out, move on, etc.).

Second, note that each box (which represents a page, a group of pages, or a tool – videos, comparisons, reviews, etc.) has a different shade of blue.

What this lovely report does for you is compute "the influence" of each of those pages/tools in driving the ultimate outcome – here, a purchase. The darker the blue, the more "influential" that piece of content. [Influence is defined by the existence of that piece of content in the visitor session, regardless of what path the visitor took, regardless of when the content was seen.]

Ain't that super sweet?
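For readers without ClickTracks, the idea behind the influence computation can be approximated from session-level data. ClickTracks' actual algorithm is not public; this sketch simply computes, for each tool, the conversion rate of sessions that contained that tool, side by side, so every tool is judged in the context of the others. The sessions are invented.

```python
# Rough approximation of "contextual influence" from session logs:
# for each on-site tool, the conversion rate of sessions that contained
# that tool. (ClickTracks' actual algorithm is not public; the session
# data below is invented for illustration.)

sessions = [
    {"tools": {"video", "reviews"},           "converted": False},
    {"tools": {"comparison_page"},            "converted": True},
    {"tools": {"comparison_page", "video"},   "converted": True},
    {"tools": {"reviews", "comparison_page"}, "converted": True},
    {"tools": {"video"},                      "converted": False},
]

all_tools = set().union(*(s["tools"] for s in sessions))

influence = {}
for tool in sorted(all_tools):
    containing = [s for s in sessions if tool in s["tools"]]
    influence[tool] = sum(s["converted"] for s in containing) / len(containing)

# Darker blue in the report ~ a higher number here.
for tool, rate in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{tool}: {rate:.2f}")
```

In this toy data the cheap comparison page outscores the video, which mirrors the story told below.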

The analysis you see above is for a real ecommerce website. What it proved to us, delightfully, was that the product videos we had created at a cost of over one hundred thousand dollars (yellow star above) were the least influential tool we had on our site.

The most influential (sexy pink star above) was a tool that had cost us $8 to produce – a page that compared different versions of the product (information that was handily available in the company).

We used actual customer behavior. We analyzed contextual segments. Ultimately it allowed us to put our precious few resources in the right area.

Of course it is quite likely that everyone who came to the site and did not buy (convert) might have loved the videos and rushed to stores to buy our products (one HiPPO actually said that!). There is no way to prove that using just the web analytics data.

What we did was prove impact on online buyers.

As to the HiPPO. . . . read on. . . .

2) Use Surveys (Actively Collect VOC)

When in doubt (or confronted by a HiPPO – remember, don't run), what better way to go than to gather some Voice of Customer? Dare I say the voice of god? :)

Here are two things I have tried (of many!) that work a lot of the time. Each covers one unique bucket of visitors to your website.

A] Consider sending a simple post purchase email survey to customers who have purchased on your site and ask them for the key influencers of their purchase.

You could share with them the various tools you have on your site (product information, comparison tools, images, videos, customer reviews etc etc) and simply ask them to rank order them in order of importance.

Don't ask them to tell you how much they like them, or to choose the ones they like; they tend to pick them all. :) Just ask them to rank order. Or use a similar tactic.

This tells you what works for those who buy.
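One simple way to aggregate those rank-order responses is by average rank (a Borda-style count): lower average rank means more influential. A sketch with invented responses:

```python
# Sketch: aggregate rank-order survey answers by average rank.
# Lower average rank = more influential. Responses are invented.

from collections import defaultdict

responses = [
    ["comparison_page", "reviews", "video", "images"],
    ["reviews", "comparison_page", "images", "video"],
    ["comparison_page", "video", "reviews", "images"],
]

rank_totals = defaultdict(int)
for ranking in responses:
    for position, tool in enumerate(ranking, start=1):
        rank_totals[tool] += position

avg_rank = {tool: total / len(responses) for tool, total in rank_totals.items()}

for tool, rank in sorted(avg_rank.items(), key=lambda kv: kv[1]):
    print(f"{tool}: {rank:.2f}")
```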

For the 98% that will never convert on your website. . . .

[Image: surveys – 4Q and Kampyle]

B] Consider an onsite survey like 4Q (though 4Q can only be customized so much, so perhaps you want to use either your own or one of the big daddy paid survey tools).

This will go to a small random sample of people who are on your site (who may or may not buy). You'll ask them three or four questions about why they were there (primary purpose) and then what tools/features of your website they liked (rank ordered, if your survey company can do that).

That will give you what you want.

Since this can also be thought of as a page level problem, you can also use something passive, a page level survey / poll like Kampyle, on your product pages and ask people to quickly rate the various features. There is a Site Content feedback topic in Kampyle which you can customize.

Now you have the most important piece of data you need: your customers'. Few website owners / marketers / HiPPOs can argue with this. Leverage this advantage.

Finally, one last option for you. . . . hopefully one you'll use before you write a check for a hundred grand to create your videos. . . .

3) Use… wait for it….. Testing! (Measure Actual Customer Behavior)

I am sure this does not surprise you. Run an A/B or multivariate test and let your customers help inform you of the value of these features.

For 30% or 40% or whatever percent of visitors, don't show the product demo videos; for the rest, show them, and see the impact on the data. Boom (!), you have your answer, without any biased opinions.
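If you want to check whether the difference between the two groups is more than noise, a two-proportion z-test is a common choice. A sketch with invented counts (the testing tool will usually do this math for you):

```python
# Sketch of reading out the A/B test: a two-proportion z-test on
# conversion counts from the two groups. Counts are invented.

import math

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """z statistic for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Group A: video hidden (hold-out). Group B: video shown.
z = two_proportion_z(120, 3000, 165, 3000)
print(round(z, 2))  # roughly 2.73; |z| > 1.96 means significant at the 5% level
```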

[Image: A/B testing tools and features]

It is certainly going to take you a small amount of effort: get Website Optimizer, talk to your IT folks, create a version of the page with no product tour link, etc.

But you are making a very expensive decision for your company are you not?

And here is the additional benefit of testing. You are free to use any kind of "conversion".

You can measure success as conversions (submit order).

You can measure success (of the test) as the number of people abandoning the product page.

You can measure success as the time people spend on the product page. [There is a very cool piece of JavaScript code that does this with Google Website Optimizer; it is especially helpful for rich media / Flash sites. Without a doubt other vendors can do this as well, just ask.]

You can measure success through your survey tool if it is integrated (this is some extra work sadly, but for big bets I recommend it).

You can integrate your analytics tool with your testing tool (say Google Analytics with Website Optimizer) and use other metrics to measure success such as bounce rate or electric shocks etc :).

[For GA and GWO, ROI has integration instructions.]
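As a sketch of how the time-on-page success metric reads out, here is a comparison of average engagement for the two variations. The timings (in seconds) are invented; in practice the testing tool collects them for you.

```python
# Sketch: time on the product page as the test's success metric,
# comparing average engagement for the two variations.
# Timings (in seconds) are invented for illustration.

from statistics import mean

seconds_on_page = {
    "A_no_video": [12, 8, 20, 15, 9],
    "B_with_video": [35, 28, 40, 22, 30],
}

averages = {variation: mean(times) for variation, times in seconds_on_page.items()}
print(averages)
```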

The bottomline is that you can define success and then let the customers tell you.

That's my answer to Philip.

Sound exciting?

Am I the only one who thinks when you do this kind of analysis you are in a nearly orgasmic state?

Yes, these methods are some amount of work. But nothing in life worth having is easy. The tools might be free, but that does not eliminate the need to invest your time and effort! :)

And on the positive side, with a recession looming, people who involve customers in making decisions, rather than relying on their own opinions, will win big. The "guessers" will not win big. They might even win small. Or fail.

Plus if you do this you'll be an Analysis Ninja, not a Reporting Squirrel.

Ok now your turn.

Have you tried to analyze features like video demos on your website? Or perhaps other complex features you have launched? What works for you? What totally failed? In my recommendation to Philip, what did I overlook?

Please share your feedback, critique and hurrays.

Comments

  1. Steve says:

    "…measure conversion rate for the segment that views the video."

    Such a simple statement – so blindingly obvious once enunciated. A 750kW globe just lit up above my head. :-)

    My Thanks Avinash, Cheers!

    - Steve

  2.

    Thank you for your kind words about the ClickTracks Funnel Report, Avinash! I'm still proud of that feature. You're right about the name though, because it's really not a traditional Funnel Report (which are rather misleading, in my opinion). We tried to think of other names, but in the end the customers were clamouring for a Funnel Report, and it's hard to convince them with "we don't have one but we have this really cool Progression Report which is even better" or whatever.

  3. Simon Tu says:

    Awesome post Avinash! Your answer to Philip's question was very detailed, how do you find the time to reply to all the email you get? :+)

    It is impressive that you actively recommend competitors of companies you work for (Google and iPerceptions in this case). It adds to your credibility. Other web analytics bloggers should make note of that. Just because you work for someone does not mean the rest of the world does not exist.

    Simon.

  4. Dan says:

    I agree with Simon about how it's refreshing that you regularly suggest competitors to GA and 4Q. But a Zune? I dunno Avinash, I think that might subtract a little from your street cred! :D

  5. SFGreg says:

    I am a failure!

    We had a new video product demo. I said how great it would be to do an A/B test with Google's Website Optimizer.

    I did back flips to get this A/B test live.

    We started seeing results in the A/B report, but after a few weeks things flattened out. A and B were virtually identical.

    Turns out the people who made the video just assumed it would help sales, and while being nice about the A/B test in meetings, they secretly gave the URL for the video to the sales team, so within a few weeks there was no "B group who did not see the video option," because the sales team was emailing it to everyone.

    Ahhhhhhhhhhhhhhh!

    Only a few days after the test went live, people were asking when it would be finished so ALL site visitors could see it, not just half. They thought I was holding up progress.

    They didn't like me bringing up the possibility that the video might HURT sales, so my orgasmic A/B test was a threat and an insult.

    They were, however, very interested in knowing if I could use analytics to find out how many "hits" their video got each day.

    Ahhhhhhhhhhhhhhhhhhh!

    A troubling part of marketing is the tendency some people have to work very hard to make each project really, really great. I just want to throw something up that isn't an embarrassment and see if anybody clicks, and after that I want to try various things, including their fancy stuff if they like, so we can optimize over time.

    When people have no sense of analytics they want to put in a huge effort to get it "perfect" and that's it – never do anything else. They have no sense of how it could be optimized over time.

    When I see people working hard to produce a great product, I'm starting to think it can be a bad sign.

    Once they have their perfect, amazing marketing asset, an offer to test whether it helps or hurts is deeply troubling. They can't conceive of the possibility that their product demo video could have a negative impact on sales. They dread the idea that indisputable data, complete with charts and everything, might show just that.

    "Am I the only one who thinks when you do this kind of analysis you are in a nearly orgasmic state?"

    For some people the offer of an A/B test is seen as a challenge to their skills, an implication they don't know what they are doing and something that potentially could reveal that their snazzy project is damaging the business.

    But don't worry. We'll do another A/B test. Next time will be the charm.

  6. Ryan Kelly says:

    Avinash – I've seen several people starting to use Kampyle now, including LunaMetrics. I've been using Get Satisfaction. Do you prefer one over the other? Why? I love the "report a problem", "ask a question", "share an idea", "give praise". Besides, I'm not having great success with 4Q with clients in terms of getting them to see the value in the data, and not the perceived "annoyance" of it. If they get one complaint from a customer, they want to turn it off, regardless of the juicy data we might be getting. So I've had to find alternatives, and these seem to work, although I don't get purpose and task completion.

  7. Kris Groulx says:

    Another great post Avinash…

    I used to use this sort of A/B testing with internal banner campaigns. It was so much fun battling the superior creative minds in Marketing… did using your special font for 1 trillion dollars make people click the banner more than the exact same banner with FREE Arial font… NOPE! :)

    (ok sometimes their creative won, but only by 1 or 2%)

    Kris

  8.

    SFGreg: Getting the test results polluted is a deeply sub-optimal experience! Well, next time they'll know.

    For the latter part of your comment I completely agree with both your sentiments.

    First, that the right approach should be to launch something good enough (before company employees love it too much or have put too much work into it) and then iterate (or fail fast).

    Second, that people can often see these kinds of expensive productions (like demo videos or the sexy new flash application) as their baby and then have no capacity to judge if their baby is ugly or pretty.

    Testing does require a cultural shift (I shared some tricks here: Tips for creating a culture of testing (tips 1, 2, 3))

    Keep up the good fight, and good luck!

    Ryan: The difference is that one is a Site Level survey (4Q) and the other is a Page Level survey (Kampyle). I have covered the difference and when to use each in this post:

    Eight Tips For Choosing An Online Survey Provider

    In a nutshell Site Level surveys are great at understanding the complete experience ("session level") and motivations. Page Level surveys are great at understanding pieces of the experience ("hit level") due to the nature of the invitation process.

    An example I have used for Site Level surveys is, say, an ecommerce site or a social media site. For Page Level, a great one is a support site (where you want answers related to FAQs) or, as in this post, finding out if a feature on a page is worth it.

    Each is great at what it does, but they are sub optimal in doing the other's job.

    The wonderful thing is that both 4Q and Kampyle are great solutions, and free! Which means if you are not happy with one, or have doubts about its capability, then switch it off. Try the other one, learn, and use the practical experience to inform your opinion.

    A/B testing. :)

    -Avinash.

    PS: Dan: I did note a drop in my street cred right after you submitted your comment. :) Amazingly my wife said the same thing when I got my Zune! But sometimes you go for function over form!!

  9.

    Ahhh….ClickTracks funnel report. Indeed, poorly named. I recall we played with other names such as 'flow report'. Rejected for good reasons :)

  10. Eivind Savio says:

    I don't use ClickTracks, so I may have missed something here… Anyway, here are some kind of technical ClickTracks questions from me.

    ClickTracks is mainly a log-based web analytics tool, but you can combine it with JavaScript tags. Right?

    In your example, was the video hosted on the same domain, or were they using a third party video solution? Did they use the log file solution, or the javascript solution? Can ClickTracks actually track that somebody viewed the video "out of the box" without tagging the video? Even a third party video solution? How do ClickTracks know that I have viewed/consumed the video, and not just only loaded the video or visited the page with the video on (without any tagging of the video)? Time on page can perhaps be used, but you still don't know if I actually have viewed the video.

  11.

    Eivind: ClickTracks is available both as a logs-based and a tag-based solution. CT can also be purchased in ASP (hosted) or In House (installed) versions.

    For the purpose of the "funnel" report, influence is computed for each page. As such, no instrumentation of the video player is involved. It also means that interactions with the video itself are not tracked.

    The assumption, 90% safe IMHO, is that you clicked on the page to view the video (of course a small percent might not, but that goes for any page).

    Clickstream data will always have limits, it will never answer things to full confidence (no matter how much coding we do). That's why my recommendation includes points #2 and #3, ensuring the customer VOC is captured.

    Hope this helps.

    Avinash.

  12. Tim says:

    A friend/client of mine is hitting a low period. I've suggested we do an A/B test on several of his products that aren't doing so hot. No matter how much you talk about A/B tests, I learn something. I'm going to further implement your ideas and we're going to have him selling more golf clubs in January than he did last June.

  13. Eivind Savio says:

    Thanks Avinash for your answer.
    The recommendation of #2 and #3 is understandable and something I would shout hurray for, but I still scratch my head about point #1.

    Is ClickTracks really the only software that could have given a reliable answer to this in an easy way? Couldn't, for example, the $Index be a method for getting this answer?

    And since we are into video, I think Online Video Analytics – KPIs is something to think about.

  14. Ned Kumar says:

    Thanks Avinash for your insights. Your statement, "This is a complex problem, more than might be apparent on the surface …" really resonated as I have scratched my head on this 'chicken or the egg' issue in the past — are the demos etc. created to motivate "low-value" (not belittling the customer here, just a label for those that are not actively using the products/services) customers into using the product/service or is it a loyalty tool to retain and maintain customers already active. I have tried to understand this first before going into measurement per se.

    As always thanks for a detailed reply — there is always a learning in there for me.:-)

  15. Jake S says:

    Thanks for the great post, Avinash. I'm curious how you would approach testing on a decision where the benefits might not materialize for a while (or where the benefits are created by customers themselves).

    As an example, let's say I was thinking about adding a forum (or wiki or customer reviews or other user generated content) where prospects could post questions and view problems that had been resolved. The forum would not have much value in the beginning while there are few answered questions. So I imagine an A/B test would take a long time to show any effect from the forum. In fact, I think the A/B test would delay the value creation from the forum because fewer users would see it and potentially post.

    Do you have thoughts on how to perform a test on this type of content?

  16.

    Jake: This is a more complex decision.

    For starters it is not a "real" A/B test; it is more like A is when we had nothing and B is that we now have the forum. Typically it would also not be prudent to show the link to the forum to half the people and not to the other half. :)

    I would think of this less as an A/B test when trying to identify value.

    It would be more like a classic value proposition exercise. What's the cost of doing this (probably two dollars because so much good forum software is free)? Software plus people overhead. What is the benefit of doing this? Measurements would be:

    * Improved customer satisfaction.

    * Lowered support costs.

    * An unquantifiable improvement in people's perception of the company (or maybe quantified via a survey and the "likelihood to recommend company" question).

    * Various interaction metrics. (How many people contribute vs the number of customers (or the number of Unique Visitors)).

    * Growth in posts / pages / whatever.

    * Sales on the eCom site from referrals in the forum (notice I put this last).

    Things like that.

    You give it a couple months. Go through this exercise. Then kill or invest more based on the cost benefit analysis.

    Hope this helps a small bit.

    -Avinash.

  17. Jake S says:

    Good points. Thanks for the response Avinash

  18. jesse says:

    great post avinash!

    i had the privilege of seeing you speak recently and i'm a new reader, forgive me if this info exists in a previous post. but i've noticed that much of your material (including this post) references multiple analytics packages such as clicktracks, omniture, etc. obviously GA is free and you are the 'evangelist', so you recommend GA to all.

    what are the basic thoughts on using multiple packages at once – and pros/cons of having multiple strings of tracking code for conversions and full blown analytics packages? (any performance setbacks?)

    is there a recommended max? for example is having 2 or 3 recommended whereas more than 3 overkill?

    i ask because i have access to clicktracks but am not currently using it because we have about 5 other tracking tags going at the moment, GA included. feels like overkill… tough to get my techs to keep them all up to date… easy to under utilize all the data.

    thanks for any input.

  20. Mike says:

    Great post! Encouraging user contribution can be one of the best ways to uncover problems with your ecommerce process. I like using http://www.uservoice.com and Fast Feedback (http://www.fast-feedback.com); although new, it seems pretty flexible.

  21. Omar says:

    Hey Avinash, I really would love an answer on this: should we use "Funnel Reports" in GA to get the conversion rate of an ecommerce (transactions) website (with 3 different payment methods)? They do "benchmarking" with a store of the same company in another country. And they go: I have a conversion rate of 10% and you only have 5%, ha-ha-ha…. (no comments).

    1) We have different payment methods, but the same thankyoupage.com. One method includes a registration form, another a login page, another a "no password required" page. In ALL the funnels we have the same numbers for the goals, and for the same "stages". But we want to measure which method is more effective. Do we have to build a different thank-you page for every method? A different page for all the "stages" for every method? Right now they share all the "stages".

    Thank you, so much man! I hope you can answer this one! A big hug, my friend!

    • 22

      Omar: There are different ways to solve this problem. You could use event tracking (or hit level custom variables) to distinguish the experience. You can also make the page views unique by using url parameters so that you can create the three funnels.

      My recommendation would be to work with a GACP. They can go through the requirements and validate and recommend the right path. You'll find a list here: http://www.bit.ly/gaac

      -Avinash.

  23. Andre says:

    Hi Avinash,

    Great post as always. If it's ok, I would really like to ask you this:

    What should we use Funnel Visualization for? I mean: a) They are all "open funnels"; anyone can join at any step. b) They don't register all the "transactions" on a thank-you page; they track only unique page views.

    I don't see the potential for this report, except for showing the "influence" in the process. The problem is that the analysts I know recommend it so strongly that clients clamor for it. I'd rather use the conversion rate in the Ecommerce panel to see if we are doing great. Let me know what you think, Avinash.

    Kind Regards,

    André.

Trackbacks

  1. [...] Do you know how to measure the value of ecommerce tools like video? Analytics guru Avinash Kaushik shares tips on how to do it intelligently. [...]
