Find Your Web Analytics Soul Mate (How To Run An Effective Tool Pilot)

You need: The perfect web analytics vendor.

Process so far…

    1. Read blogs: check.
    2. Picked six vendors: check.
    3. Sent requests for proposals (RFPs): check.
    4. Survived six exhaustive vendor pitches: check.
    5. Made client reference calls to get the real truth: check.
    6. Picked two or three vendors to run live pilots: check.

Now what?

Steve Medcraft and Avinash Kaushik to the rescue!

In this post you'll learn all the things you need to cover to run a flawless web analytics tool pilot on your website and make the best possible choice. A bit from Steve and a bit from me.

It is important to realize at the outset that the average time from implementing a vendor, to realizing you made a mistake, to deciding to switch, to getting the new vendor implemented, is approximately two years. You are making a critical choice for your company, and you could lose a lot of time if you make the wrong one.

With that out of the way…

Vendor pilots are usually staged for success. Yes, staged. It is not that there are any suboptimal intentions at play: every salesperson wants a deal, they are most likely compensated on a quota, and each vendor wants to look good. It does not matter if you are selling the most expensive tool, the cheapest one or a free one (yes, even free ones have pilots!).

A couple of weeks ago Steve sent an email outlining how he was conducting a pilot between two vendors on his live site for six weeks. He wanted to know if all his bases were covered. We exchanged a couple of long emails, and at the end, at my request, he was kind enough to permit me to share the knowledge with you all here.

We'll start things off with the key areas from Steve (you'll agree he had indeed thought of everything!) and end with my feedback / advice on some subtle things to be aware of and consider (mine are all "hidden" type things!).

[Image: Liberty of the Seas, water water everywhere]

Steve's Web Analytics Tool Evaluation Criteria:

By way of context, the two tools in this pilot are being evaluated on an extremely large content publisher's site (a huge number of page impressions per month).

At a high level, the key areas we want to be able to evaluate during the pilot are:

* Usability:

Assess accessibility/intuitiveness and understand whether our target audiences (business, data analysts and IT) will actually be able to use and customize the toolset and reporting themselves, versus whether we will need dedicated resources to create the necessary reporting and dashboards on their behalf. Get a feel for the extent of training needed.

* Functionality:

Test the functionality in realistic business situations (does it really do what it says on the tin); evaluate the standard out-of-the-box reports / features and page tagging initially, versus customizing and extending the data collection to meet our needs (likely to be running a handful of scenarios with both vendors). Ascertain what is of actual value to the business etc.

* Technical:

Understand the effort to implement, configure, customize – get a feel for the actual implementation plan. Determine any unexpected overheads on our environment. Test potential interoperability with our other systems/data sources. Attempt to identify any limitations with each solution. Understand where tags can be expanded/customized/integrated etc.

* Response:

Evaluate the responsiveness both of the ASP software solutions (performance, ability to handle our volumes, availability of the reports/data, a benchmarking exercise) and of each vendor themselves (first-line support, ability to step up to our specific needs, documentation and customization; we are hoping to put together some scenarios for each vendor to test this area).

* Total Cost of Ownership:

Identify any additional costs we would incur implementing for our business that are not obvious in the vendor's proposal (additional administration, licenses etc.).

Any other areas we should try to focus on?

Agree that the above is extremely thorough? Print it and save it for your future web analytics tool pilots.

[Image: Labadee, there is a lot to do here!]

AK's Web Analytics Tool Evaluation "Tips From A Tough Life":

In completely random order (with a bit of clean up and some additions)….

* Time:

Make sure you tell them that the six weeks for the pilot start after you confirm that the solution (tag) is implemented on the site (and not from when they send you the code). You would need at least six weeks of it fully running to get a feel for whether it is right for you.
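
If you want a quick sanity check that the tags really are live before you start the clock, a small script that spot-checks a handful of pages for each vendor's snippet will do. A minimal sketch (the page URLs and tag signatures below are placeholders; use your own pages and the exact snippet each vendor sent you):

    # Spot-check that each vendor's tag is live on a sample of pages
    # before starting the six-week pilot clock. URLs and signatures
    # below are placeholders.
    import urllib.request

    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/section/article-1",
    ]

    TAG_SIGNATURES = {
        "Vendor A": "vendor-a-tracker.js",   # placeholder string to look for
        "Vendor B": "vendorb_collect.gif",   # placeholder string to look for
    }

    for url in PAGES:
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except OSError as err:
            print(f"{url}: could not fetch ({err})")
            continue
        for vendor, signature in TAG_SIGNATURES.items():
            status = "OK" if signature in html else "MISSING"
            print(f"{url}: {vendor} tag {status}")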

* Fairness:

As much as possible, try to do the exact same things in each vendor's tool. This seems obvious, but every tool is strong in its own unique way, hence it is easy to end up doing different things in each. That would not be fair to either vendor.

* Data Sampling:

You won't really get a feel for each tool's ability to deal with massive amounts of data, because you'll only have six weeks' worth of it. But still ask each vendor what kind of data sampling they do (there is a good kind and there is a bad kind) to make queries go faster. See if the two vendors do it in the same way (if they say no sampling is required, don't believe it; at your size it will be needed sooner rather than later).
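
If you want to see for yourself why the sampling question matters, here is a tiny, purely hypothetical simulation: the same conversion-rate query run on the full data versus a 10% random sample of visits comes back with slightly different answers, and that delta is exactly what you want each vendor to explain:

    # Hypothetical illustration of why sampling matters: the same
    # "conversion rate" query on full data vs. a 10% random sample of
    # visits returns different answers. All numbers are made up.
    import random

    random.seed(42)

    # Simulate 1,000,000 visits; roughly 2% convert.
    visits = [1 if random.random() < 0.02 else 0 for _ in range(1_000_000)]

    full_rate = sum(visits) / len(visits)

    sample = random.sample(visits, k=len(visits) // 10)  # 10% sample
    sampled_rate = sum(sample) / len(sample)

    print(f"Conversion rate, full data:  {full_rate:.4%}")
    print(f"Conversion rate, 10% sample: {sampled_rate:.4%}")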

[Image: Labadee, fire eater]

* Segment like crazy!

This is not as easy as one might imagine in any tool. Segment by customer behavior (x pages, or y amount of time, or saw these pages but not those, or let's carve out everyone who only sees this and then what do they do, etc.) and by source (referring URLs, direct marketing campaigns, affiliates etc.). Segmentation will show you the true colors of any tool.

Oh, and remember to ask what you have to do upfront to be able to segment the data later (and what happens if you forget to do the upfront work).
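
To make the exercise concrete, here is a toy sketch of the kinds of slices to demand from each tool, one behavior-based and one source-based. The visit records are invented; the point is that each tool should let you define and combine segments like these without heroics:

    # Toy segmentation sketch: carve visits up by behavior and by source.
    # The visit records below are invented for illustration.
    visits = [
        {"id": 1, "pages": 12, "source": "search", "saw_pricing": True},
        {"id": 2, "pages": 1,  "source": "affiliate", "saw_pricing": False},
        {"id": 3, "pages": 5,  "source": "direct", "saw_pricing": True},
        {"id": 4, "pages": 2,  "source": "campaign", "saw_pricing": False},
        {"id": 5, "pages": 9,  "source": "search", "saw_pricing": True},
    ]

    # Behavior segment: engaged visitors (more than 3 pages) who saw pricing.
    engaged_pricing = [v for v in visits if v["pages"] > 3 and v["saw_pricing"]]

    # Source segment: visits grouped by acquisition source.
    by_source = {}
    for v in visits:
        by_source.setdefault(v["source"], []).append(v["id"])

    print("Engaged visitors who saw pricing:", [v["id"] for v in engaged_pricing])
    print("Visits by source:", by_source)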

* Search analytics questions.

Ask each vendor how it identifies organic traffic from search engines (this is a trick question). Ask each what would be required to track your Pay Per Click / Search Engine Marketing campaigns (this in and of itself can be a huge pain given all the work required, so go with the option of least pain), or to be able to import your keyword bidding and search spend data.
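
For context on why the organic question is tricky: most tools identify organic search by matching the referrer against a list of known search engine domains, and paid search by looking for campaign parameters on the landing URL; the edge cases (unknown engines, missing referrers) are where vendors differ. A simplified sketch of that logic, with illustrative domains and utm-style parameters (each vendor has its own rules):

    # Simplified sketch of referrer-based traffic classification.
    # Domains, URLs and campaign parameters here are illustrative only.
    from urllib.parse import urlparse, parse_qs

    SEARCH_ENGINES = {"www.google.com", "search.yahoo.com", "www.bing.com"}

    def classify(referrer: str, landing_url: str) -> str:
        query = parse_qs(urlparse(landing_url).query)
        if query.get("utm_medium") == ["cpc"]:
            return "paid search"          # tagged PPC/SEM landing URL
        if not referrer:
            return "direct"
        host = urlparse(referrer).netloc
        if host in SEARCH_ENGINES:
            return "organic search"
        return "referral"

    print(classify("https://www.google.com/search?q=web+analytics",
                   "https://www.example.com/"))
    print(classify("https://www.google.com/aclk",
                   "https://www.example.com/?utm_medium=cpc&utm_campaign=brand"))
    print(classify("", "https://www.example.com/"))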

* Site content grouping.

Test how easy it is for you to group the content of your site in each tool and what happens when your predefined content groups change. A content group for the New York Times could be Editorials, Features, International News, Paid Content etc. How much work will that take? Can you go back and redo history (say, if you forget, or want to create a different grouping in history to see how things might have been)?
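
Under the hood, content grouping usually boils down to rules that map URLs (or page names) to named groups. A rough sketch of that idea, with made-up paths and the publisher-style groups from above; the real question for each vendor is how such rules are maintained and whether they can be re-applied to history:

    # Rule-based content grouping sketch: map URL paths to named groups,
    # with an "Other" bucket for anything uncaught. Paths and group names
    # are illustrative.
    CONTENT_GROUPS = {
        "/opinion/": "Editorials",
        "/features/": "Features",
        "/world/": "International News",
        "/premium/": "Paid Content",
    }

    def content_group(path: str) -> str:
        for prefix, group in CONTENT_GROUPS.items():
            if path.startswith(prefix):
                return group
        return "Other"

    for path in ["/world/europe/story-123", "/opinion/editorial-9", "/sports/scores"]:
        print(path, "->", content_group(path))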

* Bring on the Interns (or the VPs!).

[Image: Liberty of the Seas, H2O Zone]

Make sure you have at least a couple of complete and utter newbies in the user pool and a few smarty-pants analytics experts (if you have 'em); you want to ensure different personas are hitting the tool. The newbies (interns or VPs) are really good for knowing whether you have a tool that will power data democracy or not.

* Test support quality.

When you first run into a problem or can't solve something, resist the temptation to call your account rep. Try to find help in the tool, on the vendor's website, via email tech support or on user forums. During a pilot / trial you will get far superior levels of support. After you make the purchase, for some vendors (not all), this goes down quite a bit. Might as well test how the reality works for each, because after the purchase you'll be relying on help in the tool, the forums, email tech support or the 800 number.

[Image: Labadee, Damini's sand castle]

* Reconcile the numbers (they won't but it's fun!).

Compare the numbers and then ask the vendors to explain the discrepancy. They won't tie at all, and it drives people nuts (self included). But their reactions and how they explain the deltas will tell you a lot. Make sure you give them specific data, for specific time frames (this will be greatly appreciated by the vendors), and then ask for an explanation. (Remember I am on the record saying that data quality sucks on the web!)

[I apologize to all the vendors, and my friends at the vendors. This is a dreaded question and there could be a million reasons for the delta. But we should be able to at least explain why, with some sensible reasons.]
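
Before you pick up the phone, it helps to lay both vendors' numbers for the same metrics and the same date range side by side and compute the deltas yourself. A trivial sketch (all figures invented):

    # Side-by-side reconciliation sketch: same metrics, same date range,
    # two vendors, percentage delta. Figures below are invented.
    vendor_a = {"visits": 1_052_340, "unique_visitors": 801_220, "page_views": 4_310_900}
    vendor_b = {"visits":   968_115, "unique_visitors": 742_480, "page_views": 4_125_300}

    print(f"{'Metric':<18}{'Vendor A':>12}{'Vendor B':>12}{'Delta %':>10}")
    for metric in vendor_a:
        a, b = vendor_a[metric], vendor_b[metric]
        delta = (a - b) / a * 100
        print(f"{metric:<18}{a:>12,}{b:>12,}{delta:>9.1f}%")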

* Check the daily / normal stuff.

I am sure you have this covered already, but check how easy it is to create customized dashboards and customized versions of the same reports for different business units, or to add computed metrics. I don't think this will be an issue with either of the two tools you have selected, but nonetheless the process of doing each of these tasks will be interesting.
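
As one example, a computed metric is simply a new metric derived from the base ones, such as conversion rate or page views per visit; the interesting part is how easily each tool lets you define it once and reuse it across reports. A tiny sketch of the arithmetic (base numbers invented):

    # Computed metrics sketch: derive new metrics from base metrics.
    # Base numbers are invented.
    base = {"visits": 250_000, "orders": 4_750, "page_views": 1_100_000}

    computed = {
        "conversion_rate": base["orders"] / base["visits"],
        "page_views_per_visit": base["page_views"] / base["visits"],
    }

    print(f"Conversion rate:      {computed['conversion_rate']:.2%}")
    print(f"Page views per visit: {computed['page_views_per_visit']:.2f}")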

* Sweat the TCO.

I am glad you are doing the TCO analysis; it will highlight hidden gems.

What do you all think?

I am sure the first thought is, whoa! Too much!! I agree. Between the two lists there is a lot for you to think about. Two important things to realize:

1) If you are going through all the trouble of doing a pilot, you are probably going to fork over a ton of money. Perhaps a hundred thousand dollars just to the vendor on the low end (and a lot more on the high end). That is not accounting for the multi-hundred-thousand-dollar investment from your company (in people, organizational changes, processes etc.). It can't hurt to be a bit more diligent.

2) Pick what you like from the list above; everything is on the table. And if you do choose everything above, which you probably won't, give each element a weight so that you can actually get through your most important criteria during the pilot!

And remember to have fun; it is a blast to do pilots. Especially if you are the one leading 'em!!

Please share your thoughts via comments. What have you learned from running a pilot for any tool? Did you forget to cover something and regret it? Did something work particularly well for you? Have horror stories to share (minus vendor names)? Would you like to boo something you read above?

PS: I am on vacation this week, sailing Royal Caribbean's Liberty of the Seas. Normally I have contextual pictures in the post, but on this ship internet access is fifty cents a minute, and it would cost me a lot of money to find 'em for this post. Even without that you can imagine how much this post cost!

PPS: Yes, yes, vacation is not for blogging. Jennie does let me get away with a lot! :)

[Image: Liberty of the Seas, Labadee]

Comments

  1. Hi Avinash,

    I can imagine how much this post cost you. But for us readers there is no way to put a price on it; it is priceless ;-)

    Have a great trip and congratulations on the wonderful pictures.

  2. Rahul Deshmukh says

    Avinash,
    Great post and thanks for sharing your thoughts. Couple of items I would like to add:
    1) Define success criteria from the get go. This is important for everyone involved in the process including the vendors.
    2) A number of pilots focus on current functionality/reporting/capabilities, which is great – but you are truly investing for the future needs. Have some test cases for future needs and test out.
    3) Develop a checklist of items and share it with all users. Let them fill out how the tool worked and ask for comments.
    4) If possible, pressure test by sending more traffic volume based on extrapolated future traffic growth.
    5) Check regular usage reports to ensure the tool is being tested and used. If possible get a weekly usage report. This can make you change your mind and kill the pilot if there is no usage :-(
    6) Last but not the least, Clickstream data is huge (definitely an understatement for sites with high volume)- make sure you understand and work through the entire data flow process. This is huge for multi channel integration.

    Sorry for the long post….

    Enjoy your vacation!

  3. Bryan Cristina says

    In an industry where I don't personally know many people who do this work, it's great to get some validation for what I do and to see that I'm right on with some of your points. And the ones that are unfamiliar to me, like data sampling, are definitely great to be made aware of.

  4. Precise and priceless! That's your post to us readers :-) Amazing post with insights.

    Have a great vacation!
    Anil

  5. Clear and concise points, as well as great pictures (I'd trade my 12 Mbit cable for a place on the ship even with the exorbitant Internet access prices).
    Enjoy the vacation – you deserved it after publishing your excellent book.
    Alex

  6. Excellent advice and I love the visuals! Makes me look forward to a summer vacation!

  7. Whenever someone asks me how they can decide upon an analytics vendor, I'll point them here and save myself a few hours!

    Avinash Kaushik, increasing productivity one blog entry at a time. :)

  8. Craig Danuloff says

    Avinash – Sorry to post this here, but your eMailbox is full! I'll consider that a victory for your vacation…

  9. I got tired of going around to different tools and different sites looking for consolidated information about the status of my site. So I decided to make a report that looks like this.

    This is a basic report, not an automatically generated one. It is done manually with specialized tools.

    Mario Ruiz
    http://www.oursheet.com

  10. First of all, and as always – great post! I will most certainly try to get any leads (at the Tool Pilot stage) to read this. I have seen multiple versions of this and most leads/clients have a list prepared to some extent, though not at this level. :-)

    I would like to add a comment (putting on my Vendor cap): my experience over the last couple of years is that a pilot period of 6 weeks is very difficult to realize nowadays, simply because deploying the features of the tools that need the most “testing” takes time and resources from groups other than e.g. marketing, and at the same time rolling out changes to the web properties happens on somewhat fixed schedules.

    I (we) typically recommend a Tool Pilot phase of 90 days – and that would be 90 days of data collection. We also recommend that clients/leads spend more time on this than on the RFP selection process. As in, if in doubt, include another tool in the pilot.

    N.B.
    … And I am writing this on Sentosa Beach in Singapore. :-)

    Dennis

    Dennis R. Mortensen, COO at IndexTools
    My Web Analytics Blog

  11. Avinash – Great post on breaking down the considerations that go into researching a web analytics system. During my current research I found your blog (as well as other blogs linked from yours) to be an invaluable resource. Your information, coupled with a 2007 comparative test found here (http://www.stonetemple.com/articles/analytics-report-may-2007.shtml), led me to a confident decision on an analytics system that will meet our needs.

    Thanks!

  12. Hi Avinash

    You were GREAT at the summit today!

    This information on this site makes a newcomer in the web analytics scene not feel intimidated.

    Thanks!

Trackbacks

  1. Book Review: Web Analytics An Hour A Day…

    We here at the ISG media & analytics team have been anxiously awaiting the publication of Avinash Kaushik's book, Web Analytics: An Hour A Day, since we first "met" Avinash at a webinar back in February. Avinash's passion fo…

  2. […] From Avinash Kaushik, How to pick your perfect web analytics vendor […]

  3. […] At the end of the day web analytics is a TOOL. While I am NOT one of those people who looks at a car as just a way to get from point A to point B, I know a lot of folks who do. And to those folks the bells and whistles of the New BMW 335 just don’t resonate when they are trying to get from point A to Point B. I challenge more of you to stop thinking that the company that posts the most whitepapers and has the most user conferences is the BEST SOLUTION for your web analytics needs. Start your web analytics search by getting a better grasp on what you need to know about your site to make good decisions then start seeking out the right match based on your needs. Google Analytics is not for everyone, but shouldn’t you make that judgment based on what your needs are and not on some marketing brochure or whitepaper? […]
