You need: The perfect web analytics vendor.
Process so far……
1. Read blogs: check.
2. Picked six vendors: check.
3. Sent requests for proposals (RFPs): check.
4. Survived six exhausting vendor pitches: check.
5. Made client reference calls to get the real truth: check.
6. Picked two or three vendors to run live pilots: check.
Steve Medcraft and Avinash Kaushik to the rescue!
In this post you'll learn all the things you need to cover to run a flawless web analytics tool pilot on your website and make the best possible choice. A bit from Steve and a bit from me.
It is important to realize at the outset that the average time from implementing a vendor, to realizing you made a mistake, to making the decision to switch, to getting the new vendor implemented is approximately two years. You are making a critical choice for your company and you could lose a lot of time if you make the wrong one.
With that out of the way……..
Vendor pilots are usually staged for success, yes staged. It is not that there are any sub-optimal intentions at play. Every sales person wants a deal, they are most likely compensated on a quota, and each vendor wants to look good. It does not matter if you are selling the most expensive tool or the cheapest one or a free one (yes, even free ones have pilots!).
A couple of weeks ago Steve sent an email outlining how he was conducting a pilot between two vendors on his live site for six weeks. He wanted to know if all his bases were covered. We exchanged a couple of long emails and at the end, at my request, he was kind enough to permit me to share the knowledge with you all here.
We'll start things with key areas from Steve (you'll agree he had indeed thought of everything!) and end with my feedback / advice on some subtle things to be aware of and consider (mine are all "hidden" type things!).
Steve's Web Analytics Tool Evaluation Criteria:
By way of context, two tools are in this pilot at an extremely large content publisher (huge number of page impressions per month).
At a high level, the key areas we want to be able to evaluate during the pilot are:
* Accessibility / Intuitiveness:
Understand whether our target audiences (business, data analyst and I.T.) will actually be able to use and customize the toolset and reporting themselves, versus whether we have to get dedicated resources to create the necessary reporting and dashboards on their behalf. Get a feel for the extent of training needed.
* Functionality:
Test the functionality in realistic business situations (does it really do what it says on the tin); evaluate the standard out-of-the-box reports / features and page tagging initially, versus customization and extension of the data collection to meet our needs (likely to be running a handful of scenarios with both vendors). Ascertain what is of actual value to the business etc.
* Implementation:
Understand the effort to implement, configure, customize – get a feel for the actual implementation plan. Determine any unexpected overheads on our environment. Test potential interoperability with our other systems/data sources. Attempt to identify any limitations with each solution. Understand where tags can be expanded/customized/integrated etc.
* Performance & Support:
Evaluate both of the ASP software solutions (performance, ability to handle the volumes, availability of the reports/data, benchmarking exercise) and each vendor themselves (1st line support, ability to step up to our specific needs, documentation and customization – we are hoping to put together some scenarios for each vendor to test this area).
* Total Cost of Ownership:
Identify any additional costs we would incur implementing for our business not obvious in the vendor's proposal (additional administration, licenses etc.).
Any other areas we should try to focus on?
Agree that the above is extremely thorough? Print it and save it for your future web analytics tool pilots.
AK's Web Analytics Tool Evaluation "Tips From A Tough Life":
In completely random order (with a bit of clean up and some additions)….
* Pilot start date.
Make sure you tell them that the six weeks for the pilot start after you confirm that the solution (tag) is implemented on the site (and not from when they send you the code). You need at least six weeks of it fully running to get a feel for whether it is right for you.
* Do the same things in each tool.
As much as possible try to do the exact same things in each vendor's tool. This seems obvious, but every tool is strong in its own unique way, hence it is easy to end up doing different things in each. That would not be fair to either vendor.
* Data Sampling:
You won't really get a feel for its ability to deal with massive amounts of data, because you'll only have six weeks' worth of data. But still ask each vendor what kind of data sampling they do (there is a good kind and there is a bad kind) to make queries go faster. See if the two vendors do it in the same way (if they say no sampling is required, don't believe it; at your size it will be needed sooner rather than later).
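To make the good-kind-of-sampling point concrete, here is a minimal Python sketch (not any vendor's actual method, all numbers invented) of the sensible approach: sample whole sessions uniformly at random, answer the query from the sample, then scale the result back up by the sampling rate. Even a 10% sample lands close to the true number, but note it is always an estimate:

```python
import random

# Hypothetical sessions; a real tool would read these from its own
# collected data. The 3% conversion rate is an invented illustration.
random.seed(42)
sessions = [{"id": i, "converted": random.random() < 0.03} for i in range(100_000)]

# "Good" sampling: keep or drop entire sessions at random, then scale
# the query result back up by the sampling rate.
SAMPLE_RATE = 0.10
sample = [s for s in sessions if random.random() < SAMPLE_RATE]

true_conversions = sum(s["converted"] for s in sessions)
estimated = sum(s["converted"] for s in sample) / SAMPLE_RATE

print(f"true: {true_conversions}, estimated from 10% sample: {estimated:.0f}")
```

The estimate is usually within a few percent of the truth here; the "bad" kind of sampling (e.g. dropping data at collection time, so you can never query the full set) cannot be undone later.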
* Test the segmentation capabilities.
This is not as easy as one might imagine in any tool. Segment by customer behavior (x pages, or y amount of time, or saw these pages but not that, or let's carve out everyone who only sees this and then what do they do, etc.) and by source (referring urls, direct marketing campaigns, affiliates etc.). Segmentation will show you the true colors of any tool.
Oh, and remember to ask what you have to do upfront to be able to segment the data later (and what happens if you forget to do the upfront work).
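For illustration, the behavior and source segments described above boil down to filters like these. This is a toy Python sketch over hypothetical visit records with invented thresholds; real tools run the same logic over their own collected data:

```python
# Hypothetical visit records (pages viewed, time on site, traffic source).
visits = [
    {"pages": 1, "seconds": 20,  "source": "google / organic"},
    {"pages": 7, "seconds": 420, "source": "email campaign"},
    {"pages": 3, "seconds": 95,  "source": "direct"},
    {"pages": 1, "seconds": 5,   "source": "affiliate-x"},
    {"pages": 9, "seconds": 610, "source": "google / organic"},
]

# Behavior segment: "engaged" visits (>= 3 pages OR >= 2 minutes on site).
engaged = [v for v in visits if v["pages"] >= 3 or v["seconds"] >= 120]

# Source segment: carve out only organic search traffic.
organic = [v for v in visits if "organic" in v["source"]]

# Combined segment: engaged visits that arrived via organic search.
engaged_organic = [v for v in engaged if "organic" in v["source"]]

print(len(engaged), len(organic), len(engaged_organic))  # → 3 2 1
```

The "upfront work" question matters precisely because some tools can only build segments like these if you flagged the right data at collection time.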
* Search analytics questions.
Ask each vendor how it identifies organic traffic from search engines (this is a trick question). Ask each what would be required to track your Pay Per Click / Search Engine Marketing campaigns (this in and of itself can be a huge pain, with all the work required, so go with the option of least pain), or to be able to import your keyword bidding and search spend data.
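Part of why the organic question is tricky: tools typically infer "organic" from the HTTP referrer, and paid clicks from the same engine only stand apart if you tag your landing URLs. A hedged Python sketch of that logic follows; the engine list, query parameters and the `cid` campaign tag are made-up simplifications, not any vendor's implementation:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical map of search engine domain fragments to their
# keyword query parameter. Real lists are far longer and change often.
SEARCH_ENGINES = {"google": "q", "bing": "q", "yahoo": "p"}

def classify(referrer: str, landing_url: str) -> str:
    host = urlparse(referrer).hostname or ""
    landing_query = parse_qs(urlparse(landing_url).query)
    # Paid/campaign traffic only stands out if the landing URL was tagged;
    # "cid" is an invented campaign-tag convention for this sketch.
    if "cid" in landing_query:
        return "paid search / campaign"
    for engine, param in SEARCH_ENGINES.items():
        if engine in host and param in parse_qs(urlparse(referrer).query):
            return f"organic search ({engine})"
    return "referral" if host else "direct"

print(classify("https://www.google.com/search?q=analytics", "https://example.com/"))
print(classify("https://www.google.com/search?q=analytics",
               "https://example.com/?cid=ppc-brand"))
print(classify("", "https://example.com/"))
```

Notice the second visit is paid only because of the landing-page tag, which is exactly the upfront campaign-tagging work the vendors will ask of you.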
* Site content grouping.
Test how easy it is for you to group the content of your site in each tool, and what happens when your pre-defined content groups change. A content group for the New York Times could be Editorials, Features, International News, Paid Content etc. How much work will that take? Can you go back and redo history (say if you forget, or want to create a different grouping in history to see how things might have been)?
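Under the hood, content grouping is usually just a mapping from URL patterns to group names, and "redoing history" is only possible if the tool keeps raw page-level data to re-run a new mapping over. A toy Python sketch (the path prefixes are invented; the group names come from the New York Times example above):

```python
# Hypothetical grouping rules: URL path prefix -> content group.
GROUPS = [
    ("/editorials/", "Editorials"),
    ("/features/",   "Features"),
    ("/world/",      "International News"),
    ("/premium/",    "Paid Content"),
]

def content_group(path: str) -> str:
    for prefix, group in GROUPS:
        if path.startswith(prefix):
            return group
    return "Other"

# "Redoing history" means re-running a (possibly new) mapping like this
# over old raw page views, rather than over already-aggregated reports.
history = ["/world/asia/story1", "/features/food/story2", "/about"]
print([content_group(p) for p in history])
```

If a tool only stores the group label computed at collection time, changing your groups later leaves history stuck in the old scheme.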
* Bring on the Interns (or the VPs!).
Make sure you have at least a couple of complete and utter newbies in the user pool, and a few smarty pants analytics experts (if you have 'em); you want to ensure different personas are hitting the tool. The newbies (interns or VPs) are really good for knowing if you have a tool that will power data democracy or not.
* Test support quality.
When you first run into a problem or can't figure something out, resist the temptation to call your account rep. Try to find help in the tool, on the vendor's website, via email tech support or on user forums. During a pilot / trial you will get far superior levels of support. After you make the purchase, for some vendors, not all, this goes down quite a bit. Might as well test how the reality works for each, because you'll use help in the tool or forums or email tech support or the 800 number.
* Ask them to explain why the numbers don't tie.
Compare the numbers and then ask the vendors to explain the discrepancy. They won't tie at all and it drives people nuts (self included). But their reactions and how they explain the deltas will tell you a lot. Make sure you give them specific data, for specific time frames (this will be greatly appreciated by the vendors) and then ask for an explanation. (Remember I am on the record saying that data quality sucks on the web!)
[I apologize to all the vendors, and my friends at vendors. This is a dreaded question and there could be a million reasons for the delta. But we should be able to at least explain why, with some sensible reasons.]
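When you do the comparison, keep it simple: same metric, same dates, percentage delta per day. A small Python sketch with invented numbers (a double-digit gap between two tools is common; these figures are illustrative only, not real vendor data):

```python
# Hypothetical daily visit counts from two tools for the same days.
vendor_a = {"Mon": 10200, "Tue": 11050, "Wed": 9800}
vendor_b = {"Mon": 9100,  "Tue": 9900,  "Wed": 8650}

# Per-day percentage delta of tool A relative to tool B: this is the
# specific, time-framed data you hand each vendor to explain.
for day in vendor_a:
    a, b = vendor_a[day], vendor_b[day]
    delta_pct = (a - b) / b * 100
    print(f"{day}: A={a} B={b} delta={delta_pct:+.1f}%")
```

A table like this, rather than "your numbers look high", is what gets you a useful explanation back.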
* Check the daily / normal stuff.
I am sure you have this already, but check how easy it is to create customized dashboards and customized versions of the same reports for different business units, or add computed metrics. I don't think this will be an issue with either of the two tools you have selected, but nonetheless the process of doing each of these tasks will be interesting.
* Sweat the TCO.
I am glad you are doing the TCO analysis; this will highlight hidden costs.
What do you all think?
I am sure the first thought is, whoa! Too much!! I agree. Between the two lists there is a lot for you to think about. Two important things to realize:
1) If you are going through all the trouble of doing a pilot, you are probably going to fork over a ton of money. Perhaps a hundred thousand dollars just to the vendor on the low end (and a lot more on the highest end). That is not accounting for the multi-hundred-thousand-dollar investment from your company (into people, organization changes, processes etc.). Can't hurt to be a bit more diligent.
2) Pick what you like from the list above; now everything is on the table. And if you do choose everything above, and you probably won't, give each element a weight so that you can actually get through your most important criteria during the pilot!
And remember to have fun, it is a blast to do pilots. Especially if you are the one leading 'em!!
Please share your thoughts via comments. What have you learned in running a pilot for any tool? Did you forget to cover something you regretted? Did something work particularly well for you? Have horror stories to share (minus vendor names)? Would you like to boo something you read above?
PS: Am on vacation this week, sailing on Royal Caribbean's Liberty of the Seas. Normally I have contextual pictures in the post, but on this ship internet access is fifty cents a minute, and it would cost me a lot of money to find 'em for this post. Even without that you can imagine how much this post cost!
PPS: Yes, yes, vacation is not for blogging. Jennie does let me get away with a lot! :)