Eight Tips For Choosing An Online Survey Provider

Michael asked a great question the other day on the "best survey questions" post. I started to write a reply in the comments section, but it turned out to be too long, so here is the answer as a post.

Michael asked:

I'm interested in online survey capability. In terms of selecting a vendor for this, what are the important capabilities you should look for?

I assume that you need to be able to specify when and where the survey would be asked, and that this would need to be done as folks were leaving the task you want to measure? Is that right, and is there anything else that you feel is important?

I'll answer this question in two parts. First I'll set some context about surveying and the second part will be my humble attempt at providing tips to Michael about things to consider when buying a survey solution.

Part One: Pre-Pre-Survey Solution Considerations

Before you go anywhere I feel like it is my duty to share three important things that you should seriously think about / be aware of….

1) You can do surveys on your own using SurveyMonkey or Zoomerang: they are cheap and you can do things yourself. You don't get all the advanced features, controls, and analytical horsepower to parse out the numbers, but you can't beat the price, and you can be up and running for as little as twenty bucks a month.

2) You can engage an external partner to do this: iPerceptions, ForeSee, etc. They come with many advanced features and lots of different ways in which you can target the survey. They do cost more. You have to determine the fit.

Here's a tip: both these companies want to sell to you (why would they not?). Call 'em up and request that they present to you the glories of their technology (and ask each what they think about the other). Amazing learnings will follow. At the end you may or may not buy, but you'll be smarter after that hour's investment with each. Watch for specifics about their technology as much as their attitude.

3) There are two types of surveys: page level surveys and site level surveys. They are very different and solve radically different needs; they should not be confused with each other.

Page level surveys are user initiated ("tell me what you think of this page," "rate this page," etc.) and serve the purpose of measuring the experience of the page. You can ask "site" level experience questions, but the fact that users initiate them, and the local (page) level at which that initiation happens, makes them a sub-optimal choice for measuring site experience. They are good for measuring the effectiveness of a page.

Site level surveys are usually presented to the users ("please give us a few minutes to answer these questions"), typically on exit, and are a great measure of the site experience ("what got you here today," "what were you able to do," "how much did we suck," "did you find what you were looking for"). They are good at measuring the effectiveness of a site.


It is important to be aware of the difference and be smart about the choice you make. I often run into companies that deploy page level surveys to measure site experience and are then surprised that changes have no impact on sales or on improving satisfaction. Or vice versa. [If you want to get deeper into this, there is a lot more about it in my book.]

Part Two: Pre-Survey Solution Considerations

OK, so you have done some preliminary soul searching. You have also invited the survey companies to come pitch to you, at the end of which you are not much wiser. :)

From my personal experience here are things that you should think about / consider before you pick the vendor of choice….

#0] Just in case you skipped to this part of the post (!), the first thing to do is read the above and make those decisions!

#1] Mathematical Rigor: No matter what you choose, look for a partner that can apply mathematical rigor to your results. I find so many results misinterpreted because of poor math applied in the analysis. Measuring survey results is not simply taking the average of the answers (averages lie!); it is measuring distributions and doing regressions. You don't want to be the one bothering with the statistics and statistical significance; stress test to ensure your vendor does (then you can focus on analysis and not reporting).
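To make "averages lie" concrete, here is a tiny made-up illustration: two sets of 1-10 satisfaction ratings with the identical mean but radically different stories underneath.

```javascript
// Two made-up sets of 1-10 satisfaction ratings with the same mean.
const mean = xs => xs.reduce((a, b) => a + b, 0) / xs.length;

const lukewarm  = [5, 5, 5, 5, 5, 5];  // everyone mildly satisfied
const polarized = [1, 1, 1, 9, 9, 9];  // half hate you, half love you

console.log(mean(lukewarm));   // 5
console.log(mean(polarized));  // 5 (the average hides the split)

// A simple histogram surfaces what the mean buries.
const histogram = xs =>
  xs.reduce((h, x) => ((h[x] = (h[x] || 0) + 1), h), {});

console.log(histogram(lukewarm));   // { '5': 6 }
console.log(histogram(polarized)); // { '1': 3, '9': 3 }
```

The two sites need completely different fixes, which is exactly what a vendor reporting only the mean would never show you.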

#2] On The Fly Segmentation Capabilities: Look for the capabilities that are provided to do on-the-fly segmentation of your data (aggregates lie!). With some vendors you get no segmentation capabilities, with others you'll get segmentation "on demand" (you ask for it and they'll send it to you later), and with others you get, on day one, access to an online environment where you can slice and dice your data at will.

One way or the other you want to be able to segment the data quickly and efficiently.

There is gold in your ability to pick a particular segment of traffic that absolutely hates you, slice it off, and drill down into why they were on your site, what products they own, what they did not find, and… then go fix it fast. Segmentation rocks! You want to be able to do it efficiently, yourself if possible, to reduce the chances of vendor delays.
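As a sketch of that drill-down (the response records and field names below are hypothetical, not any vendor's schema), slicing off the unhappy segment is conceptually just a filter:

```javascript
// Hypothetical survey responses; real vendor exports will differ.
const responses = [
  { satisfaction: 2, source: 'search', foundIt: false, comment: 'no spec sheet' },
  { satisfaction: 9, source: 'email',  foundIt: true,  comment: '' },
  { satisfaction: 1, source: 'search', foundIt: false, comment: 'login broken'  },
  { satisfaction: 8, source: 'direct', foundIt: true,  comment: '' },
];

// Slice off the segment that hates you (say, 3 or below)...
const unhappy = responses.filter(r => r.satisfaction <= 3);

// ...and drill into why: where they came from, what they said.
const whyUnhappy = unhappy.map(r => ({ source: r.source, comment: r.comment }));

console.log(unhappy.length);        // 2
console.log(whyUnhappy[0].source);  // 'search'
```

The point is not the code, it is that your vendor's tool should let you do this kind of slicing yourself, on day one, without filing a request.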

#3] Benchmarks & Indexes: Few people in senior management will take action when you tell them "our score on satisfaction (or content or navigation) is 6.0" (or 45, depending on your vendor). But most of them will get off their butt and give you money to take action when you go to them and say "our score on satisfaction is 6.0 and amazon.com is 9.4 out of 10".

Your actual score does not drive action; the difference between it and an industry benchmark drives action (no one wants to look bad by comparison!). And here is another subtle human factor: no one wants your opinion about anything. Providing a comparison to an external benchmark depersonalizes the number, and it is more likely that it will be "heard".

iPerceptions uses the iPSI (developed by iPerceptions), ForeSee uses the ACSI (developed by the University of Michigan); both are wonderful benchmarks you can compare your scores to and use to help motivate your management to take action! There are other vendors; look to see if yours will provide you with a benchmark or an index to compare against.

#4] Open Text "Categorization": No vendor is really awesome at this because it is a really hard problem to solve. Ask your vendor what kind of capabilities they will offer you to parse and categorize the open text responses (where the real gold is in terms of actionable insights).

If they say they can do it all for free and instantly, take that with a grain of salt and ask them to demonstrate it on your website's data.

Remember this is hard to do, you will want it desperately, and you will have to go with the best amongst some less-than-optimal choices (through no fault of the vendors; they are all trying).

#5] Survey Invitation "Type": There are several ways to initiate the survey: when the customer arrives, on exit, during the session, pop-over, pop-under, etc.

In many cases initiating the survey on exit works optimally. A variation of that method also works well: invite people when they see the first page of your site, then show the survey on exit.

Stress test your vendor to see what methodology they use and whether you can try a couple of different ones. Remember, what works for your customers might not be the same as what works for mine. You want to be able to experiment.

#6] "Visitor Memory" / Cookie Sophistication: You want to look for companies that have the ability to set cookies so that once a visitor has been served a survey they won't see another one for, say, 90 days, regardless of whether they fill it out or not. The last thing you want is survey fatigue.

You can also partner with companies that allow you to survey only certain types of customers, say those that have seen x number of pages or those that came from zqi.com or those who…. you get the idea. Ask if your vendor provides you with such sophistication. Some do and others don't, depending on your need this may or may not be important.
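For what it's worth, the suppression and targeting logic itself is simple. Here is a minimal sketch (the cookie name, the plain-object "cookie jar," and the targeting rules are all made up for illustration; a real vendor tag would read and write document.cookie):

```javascript
const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

// Decide whether to show the survey invite, and remember that we did,
// whether or not the visitor actually fills it in.
function shouldInvite(cookieJar, now = Date.now()) {
  const last = cookieJar.survey_last_invited;
  if (last !== undefined && now - last < NINETY_DAYS_MS) return false;
  cookieJar.survey_last_invited = now;
  return true;
}

// Optional targeting: only invite visitors matching simple rules,
// e.g. "has seen x pages" or "came from zqi.com".
function qualifies(visitor) {
  return visitor.pagesSeen >= 3 || visitor.referrer === 'zqi.com';
}

const jar = {};
console.log(shouldInvite(jar));  // true  (first invite)
console.log(shouldInvite(jar));  // false (suppressed for 90 days)
```

Note that the timestamp is recorded on invite, not on completion, which is exactly what prevents survey fatigue for people who decline.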

#7] Integration With Web Analytics: Does your survey vendor allow you to integrate with your web analytics (clickstream) tool? Ask that question and ask them exactly how they do it (and get one of your smart IT folks to look at it with the FUD detector turned up high).

After a while of doing surveys you will want to know: "what did my most satisfied visitors see on the site" "what campaigns / keywords drove the most unhappy traffic" "how do people who see the recommender tool in my clickstream data feel about likelihood to recommend" etc etc.

Integrating your survey tool with your web analytics data is not very hard. You should do it. You just need a friendly vendor and a basket of muffins (to give your IT person).
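The simplest form of that integration is the "common key" trick: generate one anonymous id and hand the same value to both the survey and the analytics tool so the two datasets can be joined later. A minimal sketch (the id format and the buildSurveyUrl helper are made up for illustration, not any vendor's API):

```javascript
// Generate an anonymous id once per visitor (format is arbitrary).
function anonymousId() {
  return 'v-' + Math.random().toString(36).slice(2, 10);
}

// Pass the id to the survey as a URL parameter...
function buildSurveyUrl(baseUrl, id) {
  return baseUrl + '?respondent=' + encodeURIComponent(id);
}

// ...and hand the same id to the analytics tool as its user-defined
// value, so survey rows and clickstream rows share a join key.
const id = anonymousId();
const surveyUrl = buildSurveyUrl('https://example.com/survey', id);
console.log(surveyUrl.includes(id)); // true
```

Once both systems carry the id, "what did my most satisfied visitors see on the site" becomes a join rather than a guess.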

#8] Pilot Friendly: When you ask the above questions and make a choice, it is quite likely that you'll be dealing with marketing and sales. The best way to learn about reality is to do a pilot. If your vendor is any good (and of course you are of a decent size!), the vendor will be willing to do a pilot with you, say for a month or six weeks.

If they refuse the pilot, that will tell you something about them. Even if you have to pay a tiny amount up front to do a pilot, it is actually a very good idea to do one. Not only to learn if the tool works for you but, perhaps more importantly, to see if the people from the vendor are people you can work with.

Remember you rarely buy tools, almost always you are buying a relationship.

That's it, eight very simple things to look for as you choose your surveying vendor.

It is important to point out that you typically won't apply the kind of rigor recommended above before signing up for a SurveyMonkey account. At $20 per month that is a significantly easier decision to make.

But if you are going to unlock all the power of surveying for learning about your customers, and really have a solid "listening post," then you will want to stress test the above eight criteria.

[Disclosure: I am on the advisory council of iPerceptions, one of the companies I have mentioned in this post.]

What do you all think? Please share your tips, tricks, war stories, critique, brickbats via comments.

Couple other related posts you might find interesting:


  1.

    Nice post, a good sum up of the things you need to look out for. I would just offer two comments:

    1. Mathematical Rigor – yes, averages lie. But so do regressions. And so do significances. Statistics is about interpretation with common sense. Do apply these techniques, but don't fool yourself that the more sophisticated the technique the better the result.

    2. "no one wants to hear your opinion" – I've found the opposite to be true. You need to offer an interpretation of industry norms. Comparing yourself to Amazon's scores requires an understanding of how Amazon got there, when and why you are probably different.

    Good post though, agree with all the rest.

  2.

    When I needed to do an overhaul of my online boutique, I used the marketing research of Anderson Analytics.

    They did an amazing job with data mining and segmentation. It was very fascinating. I use the data they put together all the time. I have 3 very different target markets that all get different messaging. I helped as they developed the survey, etc. They had all the contacts for executing it.

    I am still new to analytics (on the deeper level anyway). I would prefer to be able to pay someone to explain what information is valuable, what is not, as well as what actions I should take based on the data. But that is a bit unrealistic. I just visit this blog regularly to try and grab a few tips here and there.

  3.

    I find it pretty cool that you can actually make sure people who've seen a survey before won't be shown another one during the next 90 days!

    When you said management won't take action if you simply tell them we have a satisfaction score of 6, but they'll be more likely to take action if you can tell them "we have a quality score of 6 and Amazon has a quality score of 9.4 out of 10," I think that's one of the handful or so major copywriting principles: "Be specific." (Self-interest, aka nobody wants to read your diary but they want to get something out of your blog, was another one, remember? ;)) If you mention precise numbers to back up your claims, it simply sounds more convincing!

  4.

    Great post (as always!)

    I completely agree with your points regarding aggregation and web analytics integration. Talking to a client recently: they were receiving comments that their search function wasn't working. They had tested this and thought everything was working. However, if they had integrated with their web analytics, they would have found that the users complaining about the search were all using Firefox, which was (at the time) not compatible with their internal search. They had discounted the additional development because only a few percent of users were using Firefox. However, the survey showed them that although overall Firefox usage was a few percent, those users were in fact highly desirable clients that were currently being completely excluded by the site.

    This shows the importance of segmentation, along with considered metrics such as visit/visitor value, to help identify causal relationships, rather than basic correlations.

    We have also found that the differing types of survey (basic link, layer based, self-select, etc.) are extremely important, as many of the current layer-based solutions break basic accessibility guidelines and so need careful planning.

  5.

    In a 'former life', ForeSee was abandoned because they didn't ask the right kinds of questions, iPerceptions is great (and has a fabulous interface) but requires a second round of investment to link their data to the transactions. The clear winner in this space (for my needs) is Usability Sciences WebIQ. Currently the service is 'pitched' to come with its own analyst (they offer some VERY heavy analysis), but they are in a market response mode right now and have considered alternative offerings. Their best technology is how they capture transactions along with the questions (which can be customized).

  6.


    Well, I do have to say that in all the work that you and Intuit did with ForeSee Results and the American Customer Satisfaction Index… you did learn a lot. However, your education is not yet done. : )

    I am going to narrow the topic from surveys in general to measuring satisfaction; in other words, does the site meet your needs… exceed your expectations? That is the best performance metric your website can have.

    Let's focus on mathematical rigor. You are on the right track, but it is more than applying mathematical rigor. It is applying a methodology of measurement that is accurate, precise, reliable and predictable. And has been proven to be so. How do we prove it? When talking about measuring satisfaction, satisfied customers will be more loyal customers, who recommend your site to others. So, satisfaction should be able to predict financial success. The American Customer Satisfaction Index is the only methodology that has been proven to work and that evidence has stood up to the academic and scientific community's scrutiny.

    I can go off and create the Larry Satisfaction Index, write a white paper on it, and call it a methodology. But if it has not been proven to work…it is of little value.

    Your metrics, your benchmarks and the insight you gather from your measurement are only as good as the quality of your measurement. A bad measurement methodology leads to bad conclusions. Garbage in, garbage out.

    I could go on…but I will save that for another day…and another blog (www.freedyourmind.com)

    -Larry Freed
    ForeSee Results

  7.

    Thanks Avinash, that's very useful.

    You've talked about using page level surveys for getting feedback on the effectiveness of pages, and using site level surveys for feedback at a site level.

    What about getting feedback, in terms of your three most important questions, at a task level? What are the right approaches and tools here?

  8.

    Hi Avinash – thanks for a good post. I agree with you that it is crucial to use surveys as part of the marketing mix. For example, one website had nearly a 50% response rate – admittedly on a confirmation page – but still a goldmine of (your words not mine) "voice of customer".
    A hard thing to do, in my experience, is organise free text responses effectively when there are a lot of them.

  9.

    Michael : For the three golden survey questions I would simply hack up a survey using my IT guy or one of the cheaper tools.

    I would recommend that it be an on-exit survey that is site level. Page level surveys are rather poor at collecting this type of "site experience" data, both because of the survey invite type and their "local level" presence.

    Hope this helps.


  10.

    Hi Avinash,

    Really wanted to thank you for this illuminating piece. Your reference to the iPSI as a “wonderful benchmark” is most kind. Moreover, you’ve concisely and persuasively exposed not just the advantages endemic to the medium we use to survey actual online customers, but also the opportunities that lie ahead.

    It’s a tremendously exciting time for our sector. The Internet has literally liberated so many voices that had previously been silent within the stricter confines of offline market research. In a sense, we’re all still doing our best to grapple with this wealth of new information. Online research is a wide-open space, and we are all learning that online customers are wonderfully complex. It makes the workday both challenging and stimulating for all divisions of our company.

    What we stress, however, and what the iPSI leverages is our company-wide commitment to 100% transparency. Data updates every 30 days are nice, but if you can’t track movements day-by-day, even hour-by-hour, you’re being robbed of critical decision support material. We are not spin-doctors; we are conduits for the voices of our clients’ customers. This manifests itself in our unobtrusive sampling method and our wealth of real-time data housed on our online research portals. As such, we obtain high survey completion rates on the visitor end, and clients perceive us as honest, rigorous, and trustworthy messengers of their customers’ needs and wants.

    Without getting into too many specifics, I’d like to quickly address your point #4, where you speak about the problems plaguing open-text categorization. You are right on target in saying that the “real gold” is buried here. Time and again, we see our clients reacting more viscerally to a well-written comment than they ever do to dry numbers. I firmly believe that this will be one of the next big areas of competitive differentiation between survey vendors. To that end, iPerceptions has launched an ambitious project to tie open-text commentary back to our perceptual framework, adding another layer of credibility to the iPSI. The results thus far have been extremely promising.

    As I mentioned above, our space is wide open and new best practices are emerging every day. To that end, collaboration and the sharing of intelligence can be highly important. iPerceptions is a proud member of Omniture’s Genesis network, along with many other web analytics vendors who are onboard with this initiative to tie together clickstream data and survey analytics.

  11.

    After reading your article with great interest, here are a few additions based on my experience in the business.

    One of our specialties is aimed at website profiling and satisfaction measurement, which our partners use to target online advertising. Thanks to these partnerships, here are some more points of concern that some of my clients have expressed, relating to research bias.

    Research bias will never be completely excluded, but vendors have to try their utmost to eliminate any kind of influence. Thanks to some R&D efforts in this field, the following tips should be best practice.

    Permanent measurement is almost a necessity, in order to track evolution, measure the impact of your changes / updates, and be able to compare periods and perform historical lookups. So whenever you get the chance, aim for a permanent setup. And moreover, you don't want any seasonal patterns influencing your dataset, do you?

    Don't just use links, motivating users through a news post to participate and provide some feedback. R&D clearly proved that this will indeed deliver some sample to work with, but a closer look at the results will immediately display a certain level of polarization. Why, you ask? Because this will only motivate the extremely satisfied or dissatisfied visitors to finally express their thoughts and let you know what they think.

    So don't offer a simple link; try to acquire a representative sample by inviting, for example, every 5th or 10th visitor. Create an eye-catching invite and incentivize their efforts by letting them enter a sweepstakes, or donate something to charity for each respondent. Also make sure that the incentive is as neutral as possible across all possible socio-demographic segments, and fits every gender, age, income, education level, etc.

    And a last tip would be to definitely include a question about site visit frequency, and use this variable to segment the sample. Because you really don't want to overhaul your approach based on the opinion of a sample that consists 80% of first-time visitors :)

    I work at InSites Consulting, a next-generation European market research agency. I specialize in online marketing, advertising and research, work together with 50+ leading online advertising networks, and more than 50,000 websites rely on our consultancy and research. Get in touch with me through shahways.romani[at]insites.eu!

    Cheers, Shahways Romani

  12. Matt Butler says


    We are looking for reliable code that can execute the pop-up on exit survey invite. Surveymonkey provides only pop-up on site entrance. Specifically, we'd like to target those that exit during the checkout procedure. Do you have any reliable examples of the pop-up on exit? This seems to be a challenge given the built-in pop-up blocker in most browsers.



  13.

    We have been looking for a survey tool that is more customizable in a design sense. We would like to use a tool where the feedback collection mechanisms are friendly and conversational. Has anyone found one out there?

  14.

    Great Post. I learned some new tips from the original post and readers comments. Web Analytics is my weakest link and your post enlightened me with new concepts that I can practice. Thanks

  15.

    This is really a very interesting post. However,I am wondering, is there any website, blog, article or book that reviews all the different survey tools available in the market and offers a fair comparison of the different features, advantages and flaws?

    One particular feature I am looking for is the Web Analytics integration. Is there any tool that integrates with Google Analytics?


  16.

    Jose : There isn't an authoritative post / site that I can recall. I know that some analysts at Gartner and Jupiter have written papers from time to time. I would seek them out.

    This is a very new space; iPerceptions and ForeSee are the two bigger girls (ok maybe boys!) in the space. Then there are a number of smaller providers like SurveyMonkey, Zoomerang etc. Finally there are bigger DIY options like Confirmit.

    The nice thing is that it is easy to test out the free / affordable services. It will take you just two calls to reach out to the two big girls and see how good they are. If it will help, I recommend glancing at this post and getting your questions ready. :)

    Hope this helps a bit.


  17.

    Do you know whether Survey Monkey supports Google Analytics? Thanks.

  18.

    Or does Survey Monkey, or any other survey tool support integration with Google Analytics, so as to tie the results of our survey to clickstream data and do some segmentation work?

  19.

    Jose D. / Zig : I asked surveymonkey about this and here is their answer (from Karen Wilson):

    Google Analytics is a function that we pay for to monitor what our respondents and clients react to in our services. This is not a feature that can be used on the client side and you will have to research it independently if you wish to use it in this fashion.

    My question was: "Could you please let me know what you mean that you integrate with GA and how can I make use of it on my survey monkey account?" (! ! !)

    If you want to integrate your survey results with GA then the best option is to pass a common parameter between both. So, for example, you can pass the anonymous cookie id to the survey and to GA and then use that to tie the two together.

    Another alternative is if your survey has a unique value that identifies it then pass it as a "user defined value" into GA, which also allows you to tie the data and slice and dice.

    Hope this helps.


  20.

    This has been extremely helpful, thank you. Have you seen any good examples of incremental surveys – one or two questions at a time in a pop-up?

    Thanks again.

  21.

    No one seems to have mentioned LimeSurvey as an option… this is the successor to PHP Surveyor and is free, easy to deploy, and easy to modify. If you have special survey needs, you might look into it.

  22.

    What are some of the survey metrics that we should measure? We are implementing a survey but are not sure what we can achieve from it as far as actionable items are concerned besides usability.
    Can you list some survey metrics?

  23.

    Does 4Q integrate with SiteCatalyst or GA?

  24.

    What would be the best online survey provider that contains all of these features:

    - Ability to see individual results and know who completed the survey
    - Dashboard of results (tracking feedback by question/carrier/leasing associate)
    - Automatic reminder to go out if a survey has not been completed
    - Ability to see who has and has not completed the survey
    - Compatible with Salesforce

  25.


    I highly recommend Rational Survey. It gives you the ability to view responses on an individual and group basis. You can also see which of your study participants have completed your survey and the number of responses they've submitted. You can also reuse existing questionnaires and questions in new surveys.

  26.

    Very well written post and very helpful.

    It's funny because I disagree with the first comment when Paul says he disagrees with you about expressing your opinion. That was the main highlight of your post for me.

    I agree with you that people are most likely to hear you when they think they have come up with the decision themselves. You tell somebody "I think you should…" and they instantly tune out because they're too busy listening to the voice of greatness emanating from their own heads about their own precious thoughts.

    If you give them a reason to come to the same decision you have, by asking them to compare to a benchmark so that they think they thought of it themselves, you win, because when someone is listening to their own inner voice, they're much more likely to do something.

    People are narcissistic this way and you gotta do whatever works.

  27. Patricia Adams says

    Great article.

    I especially liked the section on Benchmarks and Indexes.


  1. […] Avinash Kaushik wrote a great post this week on how you should choose an online survey provider. Avinash worked extensively with ForeSee Results, so he’s very familiar with the ACSI methodology and how we apply it online. Since he’s been both a customer of web metrics and a partner of web metrics companies, he has a lot of knowledge and has given a lot of thought to what kinds of things people should consider when choosing a tool. I just want to expand on a few of Avinash’s points:[…]

  2. Surveys…

    Surveys help gain broad knowledge of user needs and responses. In addition to targeted surveys, we recommend regular customer satisfaction surveys to gauge the overall trends in customer experience. If shorter,…

  3. […]
    You could probably answer that by looking at Time on Page, but the best way would really be to look at utilising surveys. Exit polls are actually a great way to assess user satisfaction with your website, and I’m not going to go any further on this because Avinash Kaushik considers this in his excellent article “Eight Tips For Choosing An Online Survey Provider”.

  4. […]
    This seems to be the key to it: determining exactly what type of feedback is needed and the extras you need (e.g. displaying visual identity), then weighing up the options. Once again I turn to Kaushik, who shares some good tips when considering the best way to run a survey (see “Eight Tips For Choosing An Online Survey Provider”).

    I suspect that you need a fair few visitors to generate feedback in any appreciable quantity. I haven’t seen anything about a rule of thumb, but I suspect the ratio of all visitors to those that give feedback is quite high.

