Got Surveys? Recommendations from the Trenches

Qualitative analysis is very important, especially in our world, where we Web Analysts lean perhaps a bit too heavily on clickstream data as the source of almost all reporting and analysis.

In my post on the importance of Qualitative data, I discussed why understanding the Why is critical (if you have not read that post, it contains foundational material worth reading before this one; please click here).

When you think Usability, perhaps the thought of Lab Usability Testing comes to mind. Or maybe Follow Me Homes, Eye Tracking or other fancier methodologies. I am sure Surveys don't. But surveys, done right, can be a powerful method of gleaning key learnings and a great complement to other traditional Usability methodologies.

Yet while surveys are great at collecting qualitative data, their impact on decision making is usually suboptimal. My hope in this post is to cover lessons learned over the course of a few years, lessons that might help you in your quest to move from a survey monkey to a survey homo sapiens (as most of you know, Survey Monkey is a product and this reference is just tongue in cheek).

My top five recommendations for surveying success:

  1. Partner with an expert: Surveys are extremely easy to do (just buy a $19.95 per month license) and even easier to get wrong. Surveying is not an art but a science; well, maybe 20% art and 80% science. If possible, partner with an outside company that can bring a full complement of expertise to the table. This has two great benefits:
    • You have to do less work; can't beat that. Seriously though, you don't have to create questions, you don't have to master complex computations (often advanced statistics), you can benefit from best practices, and all you have to bring to the table is your knowledge of your business.
    • You can stay focused on value-added analysis rather than on distracting and time-consuming tactical work.

  2. Benchmark against industry: Perhaps the single biggest reason that survey recommendations are not actioned by senior decision makers is that they lack context. We run our survey, we measure questions on a scale of 10 or 100, and we provide a rating. Question: Do you like our website? Answer: 7 out of 10. Now is that great? Does it suck?

    External benchmarks are great because they give context; they shame us (when we are below) or give us joy and pride (when we beat the benchmark), but most of all they drive action.

    The ACSI, which ForeSee Results (see Disclaimers & Disclosures) has licensed from the University of Michigan, is one such benchmark. The American Customer Satisfaction Index measures half of the US economy, and its various indices form an awesome global benchmark (available for free at www.theacsi.org; for example, check out this one and see how your industry is doing). You can compare yourself not just on aggregate Customer Satisfaction but also on predicted future behavior (for example: likelihood to recommend the website).

    Another company that is developing its own methodology and benchmarks by industry is iperceptions. They have developed benchmarks for Hospitality, Auto, Media, Financial Services, Retail and B2B. iperceptions also has a great built-in slice-and-dice framework for "playing" with the data to find insights. I find this latter feature quite innovative.

  3. The real golden insights are in raw VOC: Any good survey consists mostly of questions that respondents rate on a scale, plus a question or two that is open ended. This leads to a proportional amount of attention being paid during analysis to computing Averages, Medians and Totals. But the greatest nuggets of insight are in the open-ended questions, because there it is the Voice of the Customer speaking directly to you (not cookies and shopper_ids, but customers).

    Questions such as: What task were you not able to complete today on our website? If you came to purchase but did not, why not?

    Use the quantitative analysis to find pockets of "customer discontent", but read the open-ended responses to add color to the numbers. Remember, your Directors and VPs can argue with numbers and brush them aside, but few can ignore the actual words of your customers. Deploy this weapon.
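    As an illustration only (the field names here are hypothetical, not from any particular vendor's export), a minimal sketch of that workflow: use the scored questions to isolate the unhappy segment, then pull their verbatim comments for reading.

    ```typescript
    // Hypothetical shape of one survey response: a scored question
    // plus an open-ended verbatim (field names are made up).
    interface SurveyResponse {
      shopperId: string;
      overallSatisfaction: number; // rated 1-10
      openEndedComment: string; // e.g. "What task could you not complete?"
    }

    // Pull the verbatims for one pocket of "customer discontent":
    // respondents who rated us 4 or below.
    function discontentVerbatims(responses: SurveyResponse[]): string[] {
      return responses
        .filter((r) => r.overallSatisfaction <= 4)
        .map((r) => r.openEndedComment)
        .filter((c) => c.trim().length > 0);
    }
    ```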

  4. Target survey participants carefully: Randomness is perhaps the most commonly used tactic in inviting customers to take the survey. I, controversially, believe that this is suboptimal in many cases. When trying to understand what people think, I have come to believe that surveying people who have had a chance to engage with the site is optimal. It shows a commitment on their part, and we will get better quality answers.

    Look at your clickstream data, find your Average Page Views Per Visitor, and then set a trigger, a Loyalty Factor if you will, just under that number. The aim is to capture most survey respondents before an "average" visitor exits. You'll get some who stay longer; that's a bonus. (A sketch of this trigger logic follows at the end of this item.)

    You also don't have to spam everyone with a survey. Usually around 1,300 survey responses are statistically significant. So for most high traffic websites, if the survey is shown to around 3 to 5% of visitors who meet our criteria and we get a 5% response rate, then we are cooking. You can tweak the percentages to get to 1,300, but the point is that you will only "bother" a very small percentage of your site visitors. Trust me, if you give them a chance to talk, they will talk. :~}
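    To make the Loyalty Factor and the sampling math concrete, here is a minimal sketch; the threshold, invite rate and response rate are illustrative assumptions, not recommendations, and the function names are made up.

    ```typescript
    // Illustrative survey-invitation trigger; all numbers are assumptions.
    const LOYALTY_THRESHOLD = 3; // just under a hypothetical 3.4 avg page views/visitor
    const INVITE_RATE = 0.04; // show the invite to ~4% of qualifying visitors

    function shouldInviteToSurvey(pageViewsThisVisit: number): boolean {
      // Only visitors who have engaged with the site qualify...
      if (pageViewsThisVisit < LOYALTY_THRESHOLD) return false;
      // ...and only a small random slice of them sees the invitation.
      return Math.random() < INVITE_RATE;
    }

    // Sanity-check the sampling math: invitations needed for ~1,300
    // responses at an assumed 5% response rate.
    const invitationsNeeded = 1300 / 0.05; // 26,000 invitations
    console.log(invitationsNeeded);
    ```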

  5. Integrate with clickstream data: You can turbocharge your survey analysis if you can figure out how to tie your survey responses to your clickstream data. If your survey technology vendor is a good one, they will accept external variables that you can pass into the survey. Simply pass the 100% anonymous shopper_id and session_id (or equivalent values from your website platform). These two values are also captured in your clickstream data. (A sketch of both steps follows at the end of this item.)

    When you do find "interesting" groups of responses in your survey data, you can go back and put those anonymous shopper_ids and session_ids into your clickstream tool and see what the referring URLs of these unhappy people were, where they clicked, what pages they saw, where they exited, etc. (I can see that you, dear blog reader, are hyperventilating at this point just thinking of the awesome power of such analysis. Just imagine the insights that will fall magically into your lap. Huge salary bonus, here I come!!)
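    A minimal sketch of both steps, assuming a vendor that accepts external variables as query-string parameters (the parameter names and data shapes are illustrative, not any real vendor's API):

    ```typescript
    // Step 1: when launching the survey, pass the anonymous IDs along
    // as external variables (here, simple query-string parameters).
    function surveyUrl(base: string, shopperId: string, sessionId: string): string {
      const u = new URL(base);
      u.searchParams.set("shopper_id", shopperId);
      u.searchParams.set("session_id", sessionId);
      return u.toString();
    }

    // Step 2: later, join survey responses back to clickstream sessions
    // on the shared session_id to see what unhappy respondents did.
    interface ClickstreamSession {
      sessionId: string;
      referrerUrl: string;
      pagesViewed: string[];
      exitPage: string;
    }

    function sessionsForRespondents(
      respondentSessionIds: string[],
      sessions: ClickstreamSession[],
    ): ClickstreamSession[] {
      const wanted = new Set(respondentSessionIds);
      return sessions.filter((s) => wanted.has(s.sessionId));
    }
    ```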

    This was supposed to be a top five, but here is one bonus recommendation…

  6. Continuous, not discrete: Most surveys are done as a pulse read: we have a thought in mind, a question we want answered, a project we are doing, and we do a survey. That is great. But surveys can be hugely powerful as a continuous and ongoing measurement system. Having surveys continuously deployed on your website means you will always have your finger on the pulse of your visitors. More importantly, it means you can account for seasonality, external influences (like press releases or company events), marketing campaigns, acts of God, etc. All things that can mean a discrete measure in time might not be the cleanest read.

    Advanced surveying technologies are now quite affordable, especially when you consider how much revenue your website is making, or how many customers are being upset by a suboptimal experience. Other than clickstream data this is perhaps the cheapest continuous method; none of the other Usability methodologies even comes close (either in cost or in the number of actual customers you can hear from).

Surveys are not the be-all and end-all of qualitative data; like all methods they have their pros and cons. But they are one of the best ways to drag the customer's voice to the table to help us make better decisions.

If you don't do them at all, then all you are measuring on your website is the what and not the why, and actionable insights come from understanding the why.

Agree? Disagree? Am I going overboard with this? Please share your feedback via comments.

PS:
A couple of other related posts you might find interesting:

Comments

  1.

    When trying to understand what people think I have come to believe that surveying people who have had the chance to engage with the site is optimal.

    I think that the customers who leave early have valuable information on why they leave early, so why should we dismiss them? By doing so we would certainly get higher ratings, though… ;)

  2.

    Jussi: Yes, customers who leave early have valuable information, but there are other ways to get that information, at least initially: what campaigns lead to short visits, what referring URLs or search key phrases, what pages, etc. So clickstream gives clues; not optimal, but something.

    When doing a deep experience study like ForeSee or iPerceptions, some level of engagement is good so that visitors can give an informed opinion.

    Doing so we certainly would get higher ratings, though…

    That does make me smile. :)

    Thank you for leaving your comments, I appreciate the feedback very much.

  3.

    I am actually approaching this from the other end, as I work for an online survey company. Our feeling is that analytics are great, and we are looking to upgrade our web analytics tool now. However, analytics can't tell you everything; together with a correctly placed and well-written survey, you see the whole picture of your website traffic.

    Think of the data you would get if you asked people why they are leaving without purchasing the items in their shopping carts. Or if you asked your web traffic basic or detailed demographic questions. Also, people's general feelings about your website/product and their overall experiences.

    This is a great site – keep it up! :)

  4.

    Hi Avinash – I could not agree more… and am glad industry experts such as you are endorsing the ability to "listen" to customers rather than only "watch or observe" them, or, even better as you suggest, integrate the two to get a full overview of things. We do this as well, and strongly believe, and our normative dataset confirms, that "listening" to customers is key to driving relationships… and business. To this end, I also believe the gold is in open-ended responses, but through what I call an "active listening" process. I recently wrote an article in IMediaConnection that explains this further in another context, Word of Mouth (WOM)… but websites and WOM are related, as engaged consumers tend to naturally come to the brand… on its brand website, thus the importance of listening and engaging people back by leveraging their insights… I am sure that Intuit fully relates… :)

    Still, my question is: why is "listening" still not sufficiently used in the so-called "web analytics" space, with behavioral measurement (what I call "watching") dominating the stage? http://customerlistening.typepad.com/customer_listening/2006/05/when_will_web_a.html

    I hope your contribution and leadership will help change things!

    I hope to further exchange with you in the near future.

    Thanks!

    Laurent

  5.

    Laurent, I think your question was rhetorical, but I'll respond anyway. IMHO the reason we are still not listening is that it takes, again IMHO, fundamentally different "brain types" to analyze the What and the Why. At least thus far that has been the case. Maybe it is right-brain versus left-brain thinking that is required. Maybe, and this is more likely, we have been indoctrinated by the field, the vendors and our own decision makers into believing that sitting long enough under the waterfall of clickstream data will yield all the answers.

    But more and more people are realizing that Reporting is not Analysis and that Clickstream is only, as you wisely put it, watching. This is a hopeful sign for all of us, and we humble practitioners have a bright future ahead of us.

    My reason for coining the term Web Insights (and saying web analytics is dead) is a small step towards saying that the What is no longer enough: move towards understanding the Why, and don't give decision makers reports; give them the output of true analysis of the What and the Why. Give them Web Insights. :~)

    Thanks for taking the time to comment. I highly recommend your blog to my small merry band of readers.

  6.

    Hi,

    Thanks in advance for accepting a comment on a post that's over a year old! I have read your book twice in the last week, subscribed to the blog, and become a big fan of your work – insightful and practical at the same time. And, lest I suck up too much, I'm very impressed to see that you are donating all of the proceeds from the book to two excellent charities.

    Two quick questions:

    First, is there any point in surveying visitors who bounce? Are you just going to irritate them more, or will some of them actually give you useful feedback on why they are leaving so quickly? I wasn't sure if this was part of your suggestion to only survey visitors who stay on the site for longer than the median number of page views.

    Second, I'm curious if you have any recommendations for inexpensive software to run surveys. While I agree that it is generally preferable to partner with an outside expert, most smaller companies don't have the resources to pay a full-service survey company, but I still think they can get value from using exit surveys. Any thoughts?

    Thanks!
    Cameron

  7.

    Hi,
    My 1st grade daughter is doing a science fair project on surveys and wants to do surveys as research. I am helping her search for the "science" behind surveying and why we do it. Can you lead me to a good source that is easy to understand? So far your site has been very helpful!
    Thanks
    Suzanne

  8. david zotter says

    We currently use RelevantView.com. Does anyone know of any other good remote usability providers?

  9.

    David : I think there is one tool that is awesome at remote usability: Ethnio.

    One simple reason: Live Recruiting.

    All remote usability testing providers offer similar features; some are cheaper or more expensive, and others have cuter or not-so-cute icons. That does not matter. What is critical in getting relevant actionable insights is the ability to talk to your "real visitors".

    That is where live recruiting excels because rather than grabbing me off the street and telling me to imagine being a customer of yours, you can "grab" actual people who are on your site to actually solve their problems and then do remote testing with them.

    That to me is the key: being able to talk to your real customers about their real issues right when they have 'em.

    Hope this helps.

    -Avinash.

  10.

    I think I've got the challenge for the next generation. Let me go down memory lane for a minute. It all started with the local shop producing, say, golf balls. You got in your '57 Chevy and drove down to the "xyz Golf Balls" store, where you walked in, were greeted by the clerk, and began to look around the shop. Upon finding hundreds of colors, velocities, and manufacturers, you raised your hand for the clerk to come and help you. [there is a point, I promise] The clerk finished what he was doing, walked over, and gave you his/her expertise, and you made a decision. You chatted about the weather, maybe asked about the prices, but all in all, you communicated all the way through to the moment when you pulled money out of your wallet, got back in the car, and thought with excitement how fun it would be to try out your new golf balls.

    Then the internet comes.

    One after the next, hundreds of XYZgolfball.com shops open up. They all have product, they all have pictures, and they all have 800 numbers, but you don't know which to choose. You get a referral from a golfing buddy that abcgolfballs.com is a good place to go, and you feel like you're on your way, but guess what? Excitement depletes as the site is confusing, the price seems great but the checkout cart isn't working, some pictures aren't up, and you think: "looks like this would be a great deal: if their store/site worked!"
    Companies have done such a great job trying to figure out when their customers come in and what they like to buy, and being ready to sell the merchandise with just-in-time inventory. They forgot about one big thing – warm fuzzy feelings, customer attention, customer-centric feelings that drive our desire to go out and Purchase, Network, Get Involved.
    You can survey to gauge what customers "think", but your goal should be to create a portal of companies "listening". Communication in real time on every page, voluntarily chosen by the customer, brings the brick-and-mortar feeling back. And it still provides measurable tactics, just in a Customer Oriented fashion.
    It is about time that the "C" had a "V" in the VoC. Since VoC is part of the Web 2.0 innovation, should Web 2.1 bring the "Ear of the Customer"? What do you think… and by the way Avinash, great blog, enjoy it very much. Read it daily.
    mb

  11.

    Great stuff. The ACSI changed their URL structure and forgot to think about old URLs.

    "example check out this one and see how your industry is doing" should be linked to: http://www.theacsi.org/index.php?option=com_content&task=view&id=18&Itemid=33

  12.

    Excellent post. Quick question on revisiting it: with the current state of web analytics being very rich and controllable in gathering and displaying information, do you still think surveys are a good way to gather the why?

  13.

    Hello Avinash and everybody!

    Currently we are running a feedback survey at http://www.ilovephotos.com/help/feedback/survey/ that is powered through Google Docs.

    Using Google Analytics I can track how many people visit the page, how long they stay, etc. Using Google Docs I see how many people have actually finished the survey.

    However… I have no idea how to measure the people who start the survey but don't finish it. Any ideas? How can I track the number of people who start the survey versus those who finish it?

    I know that the first recommendation would be to create a "thank you" page, but Google Docs does that already. (If you take the survey, you will see that once you complete it.) How do I add a tracking tool to the message from Google Docs?

    Thank you everybody!

    Cheers,

    Damian

  14.

    Hi Avinash,

    I ran into this older article and found it interesting; I agree with a few points and disagree with others.

    However, what made me wonder the most was this:
    > Usually around 1,300 survey responses
    > are statistically significant.

    I was wondering how you arrived at those 1,300 responses and on what basis this is calculated. Statistical significance does not depend only on the raw number of respondents. Usually you apply it to a difference in results (e.g. among subgroups) and test whether the differences between, say, male and female respondents are significant (a high chance that the difference has a reason behind it), or whether we cannot tell and it was just a deviation due to chance…

    So I am wondering what you are basing the claim on that 1,300 responses are statistically significant, and especially for what?

  15. Marianne Conklin says

    Hi – WA and Market Research are a great mix for completely understanding a customer's experience. My question: how do you merge collected survey data with analytics data? Can this be done with GA?

  16.

    Marianne: The problem with integrating survey data with analytics data (from any tool) is that a single visitor session produces two different sources of data: your survey data (with your survey provider) and your clickstream data (with your WA tool).

    The challenge is to merge these two things together.

    Here are your choices:

    1. Data must come into the WA tool from the survey tool
    2. The reverse of the above.
    3. Come out of both tools and into a new place (spreadsheet, database etc).

    All of the above allow you to analyze the merged dataset of clickstream and survey data, IF you have a primary key.

    The most common ways to get a primary key are to

    1. Pass the unique survey id into the analytics (Google Analytics or whatever)
    2. Pass the session or persistent cookie id into your survey tool.

    All that is context.

    With the first set of choices you can do #2 and #3 with GA.

    For the primary key you can do #1 or #2 with GA.

    In either scenario there is some technical work you (or your hired consultant) have to do to make this work, with Google Analytics or any other tool.

    -Avinash.

  17. E. David Zotter says

    I ended up using RelevantView for several large projects with great success.

    It handles the merging of survey and web analytics data automatically.

  18. Scott Gilkeson says

    Interesting and long-lived discussion. But I haven't seen anyone talk about what I perceive as the biggest problem with surveys: that they draw from a highly self-selected sample. I know this is true of all sorts of surveys, not just online ones, but it seems like respondents are likely to be either fans, or have an ax to grind, or maybe be people with a lot of time. I like the idea of targeting regular visitors – that might help somewhat. I don't think any sort of incentive would help; it may make things worse. I'm especially interested in non-commercial sites, where typical (target) users are looking for information. Pop-up surveys tend to interfere most with visitors who are on-task, and those are the ones whose feedback is most desirable.

    I get so fed up with ForeSee Results surveys popping up when I use government web sites that I've taken to dismissing them all in protest. Any thoughts on ways to get to the core, task-centered, busy visitors?

Trackbacks

  1. […] Got Surveys? Recommendations from the Trenches — "…but surveys, done right, can be a powerful method of gleaning key learnings and a great complement to other traditional Usability methodologies." […]

  2. […] I have written about the importance of online surveys on this blog in the past; and Avinash provides some very interesting observations about surveys in his latest post, Got Surveys? Recommendations from the Trenches. […]

  3. […] Currently I am working with my main web developer at my new job to create event tags on main functions of one of our sites to find out if they are actually using certain features or not. This requires man-hours, testing, implementation, and this method is not scalable for a massive site (we’re only measuring clicks on the home page) – and that’s just to measure a behavior, not to add a new feature/functionality/content! In the townhall format, they ended up with a much more dynamic and democratic version of a user survey from real users. (extra reading: Kaushik.net – the importance of surveys) They also appear to be more open to the public (in addition to diggnation) which builds trust from the user base that they are actually doing it for “us”. […]

  4. […] On your blog and in your book, Web Analytics: An Hour a Day, you’ve talked about 6 recommendations for conducting surveys. 4Q does a great job on most of those. One thing I’m missing is how to tie this data to your regular clickstream or behavioral data. How can site owners integrate 4Q most effectively with their behavioral analysis to create a complete picture of their customers’ experience? […]

  5. […] Surveys and testing and experimentation are some of the best ways to understand visitor’s experience. In my mind, the third thing is most important. It’s nice to know that a student spent an hour looking at online learning content but we’d really love to know why they did it. Did the student spend the hour because they were studying for a test? Were they finishing an assignment? Is the material helpful? What material is most helpful? Do they enjoy learning with the material? […]

  6. […] Given this weakness of traditional web analytics, more and more web analysts are turning to online surveys as a supplement to behavioral data (see here and here). Such surveys enable you to ask the visitors directly instead of guessing from their clickstreams. If they are conducted continuously, you can even use them to include opinion scores in your dashboard alongside your conversion rate and other traditional web metrics. […]

  7. […]
    Here’s a useful resource about creating surveys: Got Surveys? Recommendations from the Trenches, May 2006, by Avinash Kaushik, an expert and author on Web analytics.
    […]

  8. […] Got Surveys? Recommendations from the Trenches – Occam's Razor by Avinash Kaushik (tags: analytics webdesign) […]

  9. […]
    An easy way to do that, if you can’t fully reform the question, is simply to ask the person to elaborate. “Did you find value in this process?” If so, please explain further. If not, tell us how we can improve.” Avinash Kaushik really summed it up well in this article:
    […]
