Qualitative Web Analytics: Heuristic Evaluations Rock!

Every believer in Web Analytics 2.0 knows that awesomeness comes not from answering just the "What" question but from also answering the "Why" question.

What comes from Google Analytics, Adobe SiteCatalyst, WebTrends, Coremetrics, NetInsight and more.

Why comes from lab usability studies, website surveys, "follow me home" exercises, experimentation & testing, and other such delightful endeavors.

Why gives context to the What, and delightfully helps you not have to overlay your biases when you try to infer visitor intent from all the What (clickstream) data.

I know that you agree Why is important.

I know that you even realize Why is easier than ever to accomplish (usability studies are economical, surveys and testing platforms start at the sweet price of free!).

Yet your site stinks like a skunk.

The reasons are complicated.

You are smart, so that is not it. Maybe it is internal politics. Maybe it is the agency you have outsourced the site to, the agency whose only competence seems to be gratuitous use of Flash. Maybe it is that it is not your job; you are the "quant" guy or "GA girl". Maybe even after someone on the team has gone out on three dates with the IT Dude, he still refuses to put Website Optimizer tags on the site. Maybe the well-meaning but "never met our real customers" HiPPO dictates site design.

Bottom-line: Your site stinks and you need to fix it.

Allow me to introduce you to a User Centric Design methodology that is, I think, the solution you have been waiting for: Heuristic Evaluations.

I love heuristic evaluations because they are cheap, fast, and you probably already have the resources you need in your company. A large part of my adoration also comes from the fact that heuristic evaluations are us going back to the basics in an attempt to create un-stinky websites.

What Are Heuristic Evaluations?

A heuristic is a rule of thumb. As such, heuristic evaluations follow a set of well-established rules (best practices) about web design and about how website visitors experience and interact with websites.

When conducting a heuristic evaluation a user researcher (or an HCI expert) acts as a website customer and attempts to complete a set of predetermined tasks related to the website's reason for existence. For example: trying to place an order, looking up the status of an order, finding the solution to an error code, or deciding which of the many products on a site is optimal for a specific customer persona.

But here is the lovely part, and why almost anyone can perform heuristic evaluations: in addition to best practices the researcher (or you!) will draw from their own experience, knowledge and common sense.

Heuristic evaluations are best when they are used to identify what parts of the customer experience are most broken on your website. They can be very beneficial if you have not yet conducted any usability tests or when you would like to have a quick review of prototypes that the designers might be considering.

In either case, you can quickly determine the lowest hanging fruit in terms of "broken" parts of the customer experience. With this feedback there can be iterative improvements to the customer experience. You have probably already connected the dots and realized that this is a fantastic way to identify ideas for A/B or Multivariate experiments on the live website.

[I Heart] Group Heuristic Evaluations

There is one more thing, a way to amplify the impact and get even better results.

Get everyone involved!

Heuristic evaluations can also be done in groups!!

Invite people around you with key skills, such as designers, information architects, web analytics professionals, that girl in accounting you really like, other analysts in the company (and their quantitative understanding of site data), search experts, the intern who only communicates via Posterous, and so on and so forth.

The goal is simple: identify flaws by attempting to mimic the customer experience (if possible under the stewardship of a User Researcher, if not then under the gaze of your haunting brown eyes) by completing the tasks on the website as a customer would.

The great benefit of the group heuristic evaluation method is that you can tap into the "wisdom of crowds". This is especially powerful because the web is such an intensely personal medium, and the group members can offer different points of view that highlight issues quickly.

The process that worked best for me: send an email to 10 or so folks (a diverse set!). Invite them to a noon meeting in a largish conference room and order lunch for them (best $50 I ever spent). Once everyone is settled (by 12:05!) project the website on the screen and try to complete the most common customer tasks.

I (or you) have to do a good job of moderating the discussion and ensuring everyone participates. There is no such thing as a bad opinion; diversity is good. Collect all pertinent feedback.

Heuristic evaluations can provide valuable feedback at a low cost ($50 in my case) in a very short amount of time (an hour in my case) and identify obvious usability problems. Hence they are best for optimizing work flows, improving user interface design and for understanding the overall stinkiness (or lack thereof) of the website.

Here is another subtle benefit of the group evaluations: improved communication and, dare I say, camaraderie between different groups in your company (big or small).

There is nothing that quite brings people together like a,  pardon the expression, bitchfest. Everyone contributes, everyone commiserates, everyone loves it. Next time they are doing something they'll know what you do. Next time you need help, you'll know who to call (and they'll pick up the phone!). It is a great way to bring a sense of common purpose and a sense of ownership to improving the website experience.

Conducting A Heuristic Evaluation (The Glorious Process!).

You're excited, right?

Here are six steps to conducting a successful evaluation, whether you are doing it alone or as a group:

1. Write down the tasks that customers are expected to complete on the website. If you are using surveys on your site (even a simple site-level survey like 4Q from iPerceptions or a page-level survey like KISSinsights) then that is a fantastic source of this information. You should also, if possible, talk to the site owner. Here is what you might end up with on your list:

  • Find information about the top-selling product on the website.

  • Locate a store closest to where the customer lives.

  • Place an order on the website by using PayPal. (If the website doesn't accept PayPal, how easily and quickly can a customer find that out?)

  • Sign up to show up at a protest march against taxes on the richest Americans.

  • Successfully contact tech support via email.

  • Pick the right product for customer profile x (where x can be a small business owner or a family of four or someone who is allergic to peanuts).

2. This is sometimes hard, but try to establish success benchmarks for each task. For example: success rate for placing an order for the best-selling product = 80%, signing up for the protest march = 99%, contacting tech support = 90%, etc.
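If you track results in a small script rather than a spreadsheet, the comparison against benchmarks can be sketched as follows. All task names, benchmark values and completion counts below are hypothetical, purely for illustration:

```python
def evaluate_tasks(benchmarks, observed):
    """Compare observed task success rates against benchmarks.

    Returns {task: (success_rate, met_benchmark)}.
    """
    report = {}
    for task, target in benchmarks.items():
        wins, tries = observed[task]
        rate = wins / tries
        report[task] = (rate, rate >= target)
    return report

# Benchmarks from step 2 (hypothetical).
benchmarks = {
    "place order for best seller": 0.80,
    "sign up for protest march": 0.99,
    "contact tech support": 0.90,
}

# (successful completions, attempts) observed during the evaluation.
observed = {
    "place order for best seller": (9, 10),
    "sign up for protest march": (10, 10),
    "contact tech support": (7, 10),
}

for task, (rate, ok) in evaluate_tasks(benchmarks, observed).items():
    flag = "meets benchmark" if ok else "below benchmark"
    print(f"{task}: {rate:.0%} ({flag})")
```

The output immediately flags which tasks fall short, which is exactly the list you carry into step 3.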

3. The fun part. Walk through each task as a customer would and make note of the key findings from the experience – everything from how long it takes to complete the task, to how many steps it takes, to the hurdles in accomplishing the task, to how profound your embarrassment is that this is your own company's website.

4. If you were using a best practices checklist (more on this below) then make a note of the specific rule violations.

5. The hard part. Create a report of the findings. You can use PowerPoint with a screenshot of the webpage and clear call-outs for the issues found. Or you can use Camtasia or other screen-recording software to capture the session (and the group discussion). This can be distilled into a "best of the bitchfest" collection for your superiors.

6. The hardest part. Categorize the recommendations into Urgent, Important and Nice to Have. We all get swept up in emotional fervor. It is also possible that when you present your findings to your Sr. Management they might be a bit HiPPOish (not that there's anything wrong with that).

You want to go in with the Urgent, Important and Nice to Have categories based on impact on the customer experience and the company bottom-line. This helps drive a "what we should do" discussion rather than an "I think we should do that" discussion.
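One simple way to keep the categorization honest is to score each finding on the two impact dimensions and bucket mechanically. A minimal sketch; the findings, scores and thresholds are made-up assumptions, not a standard:

```python
def categorize(finding):
    """Bucket a finding by combined customer-experience and
    bottom-line impact (each scored 1-10)."""
    score = finding["cx_impact"] + finding["revenue_impact"]
    if score >= 14:
        return "Urgent"
    if score >= 8:
        return "Important"
    return "Nice to Have"

# Hypothetical findings from a heuristic evaluation.
findings = [
    {"issue": "checkout button hidden below the fold", "cx_impact": 9, "revenue_impact": 9},
    {"issue": "404 page has no navigation",            "cx_impact": 6, "revenue_impact": 4},
    {"issue": "logo not linked to home page",          "cx_impact": 3, "revenue_impact": 2},
]

# Present highest-impact items first.
for f in sorted(findings, key=lambda f: f["cx_impact"] + f["revenue_impact"], reverse=True):
    print(f"{categorize(f):<13} {f['issue']}")
```

Even a crude scoring rubric like this moves the meeting from opinions about favorites to a discussion of the scores themselves.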

That's it.

The process, as always in web analytics, is important and I hope the above six steps help you create a process in your company that is repeatable and yields impactful results.

I want to stress again that this is a great way for you to get into the customer's shoes, build camaraderie, involve a cross-functional team of people, and finally find the lowest hanging fruit for sure, and perhaps even some big ones.


The Cheapest Heuristic Evaluation Exercise.

I have hinted about the cheapest possible heuristic evaluation exercise a couple of times in this post.

It is: You sitting down with your common sense and a list of "best practices" and checking how well, or badly, your website does against that list. Then do steps 5 & 6 above.

This is fast and impactful. Even in the worst case you identify the most broken things / annoyances.

I have used lots of website usability best-practices checklists over time and have settled on using Dr. Peter Meyers's 25-point checklist. It is simple, effective and quite expansive. You can download it here. [If you have Web Analytics 2.0 then you'll also find an "extended edition" on the CD that is attached to the back cover of the book.]

The usability checklist has four sections. Here's a brief summary:


    Section 1: Accessibility

    1. Site Load-time Is Reasonable
    2. Adequate Text-to-Background Contrast
    3. Font Size/Spacing Is Easy to Read
    4. Flash & Add-ons Are Used Sparingly
    5. Images Have Appropriate ALT Tags
    6. Site Has Custom Not-found/404 Page

    Section 2: Identity

    7. Company Logo Is Prominently Placed
    8. Tagline Makes Company's Purpose Clear
    9. Home-page Is Digestible In 5 Seconds
    10. Clear Path to Company Information
    11. Clear Path to Contact Information

    Section 3: Navigation

    12. Main Navigation Is Easily Identifiable
    13. Navigation Labels Are Clear & Concise
    14. Number of Buttons/Links Is Reasonable
    15. Company Logo Is Linked to Home-page
    16. Links Are Consistent & Easy to Identify
    17. Site Search Is Easy to Access

    Section 4: Content

    18. Major Headings Are Clear & Descriptive
    19. Critical Content Is Above The Fold
    20. Styles & Colors Are Consistent
    21. Emphasis (bold, etc.) Is Used Sparingly
    22. Ads & Pop-ups Are Unobtrusive
    23. Main Copy Is Concise & Explanatory
    24. URLs Are Meaningful & User-friendly
    25. HTML Page Titles Are Explanatory

Seems simple, right?

I bet your website currently breaks 10 of the rules above. Hard to believe? Set a quiet hour aside and go through the checklist. But first, go download the detailed version at Dr. Pete's website.

When you are done remember to do steps 5 and 6 of the recommended heuristic evaluation process outlined above.
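If it helps, the checklist pass can be tallied with a few lines of code. The rule names below are abbreviated from the 25-point list and the pass/fail values are hypothetical:

```python
# Mark each rule True (pass) or False (violation) as you walk the site.
checklist = {
    "site load-time is reasonable": True,
    "adequate text-to-background contrast": False,
    "font size/spacing is easy to read": True,
    "images have appropriate ALT tags": False,
    "site has custom not-found/404 page": True,
    # ...the remaining rules from the detailed checklist go here...
}

# Collect the rules the site breaks.
violations = [rule for rule, passed in checklist.items() if not passed]

print(f"{len(violations)} of {len(checklist)} rules violated:")
for rule in violations:
    print(" -", rule)
```

The resulting violations list is the raw input to steps 5 and 6 of the process.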

Benefits of Heuristic Evaluations.

In case you somehow made it all the way here and were still not convinced of the value of doing heuristic evaluations, here is a quick summary of the benefits:

  • Heuristic evaluations are extremely fast to perform, with a very short time to insights.

  • They leverage existing resources in your company (what could be awesomer?).

  • You'll identify the most egregious customer issues on your website (often all the low and medium hanging fruit, quickly).

  • They can be used very effectively early in the website development process to find potential customer hurdles / deal breakers.

  • If you have an existing UCD program or hire an external company/agency, heuristic evaluations can reduce the cost of full usability tests by helping fix the obvious problems first. The $900 an hour charged by the Agency can then be focused on hidden / really tough challenges.

Things to Watch For.

It should be clear that I am a fan of the heuristic evaluation process. And it is every fan's duty to also highlight things to watch out for. Here they are:

  • Both single-person and group evaluations involve company employees, and sometimes usability experts, but none of them are the actual customers. Despite using best practices and our wisdom we might miss some subtle problems, even as we identify the obvious ones. Be very aware of this.

  • [It follows from the above point.] The better you are at step #1 of the heuristic evaluation process, the better your outcomes will be.

  • When there is disagreement about recommendations from the heuristic evaluations there can be great value in doing live website tests or usability studies (whichever is faster, usually experiments).

  • Heuristic evaluations are best for optimizing work flows, larger more obvious parts of website design and the overall usability of the website.

In summary: Not quite God's gift to humanity, but rather the best thing you could do to identify the low and medium fixes to your site that will significantly improve the experience of your customers.

Don't spend your day immersed in Google Analytics and just the "What" analysis. Understanding "Why" is the key; use it to unlock actionable insights.

Ok your turn now.

Have you done heuristic evaluations for your website? Who leads them in your company? Do you have a list of usability best practices that you use on your website? What other methods of listening / collecting voice of customer / answering "Why" do you use in your company?

Please share your feedback / critique / questions / answers via comments.


Comments

  1.

    Very cool post Avinash. I like your books, articles and precise style. Everything is clear and has good usability.

  2.

    Just finished my thesis on the subject of using web analytics in website usability evaluation (using Web Analytics: An Hour A Day as one of the references). While I wholeheartedly agree that you should not spend all your time and brains on the "What" that web analytics tells you, I would personally tend to recommend skipping the heuristics and going with usability tests for the "Why" part right away.

    It seems that heuristic evaluation finds a lot of problems (which is one of the things Nielsen advertises it with), but I usually don't want to fix a lot of minor problems just because it's easy. If I'm evaluating on the cheap then I'd rather find easy-to-medium fixes for the most severe problems.

    If you want to test fast and cheap, go take a look at what Steve Krug has to say in his new book "Rocket Surgery Made Easy" (http://sensible.com/rocketsurgery/). It's not that much more expensive than the $50 you spent and, as you already mentioned, it takes away the arguments over what works and what doesn't.

  3. Laura Schulman says

    I am a User Researcher and a fan of this blog, and I am thrilled to see you cover Heuristic Evaluations. It is a rite of passage for most User Researchers, and it has been a pet peeve of mine that more people don't use them.

    Stressing the non-obvious benefits was insightful. Some people skip heuristic evaluations and move on to the more complex UCD methods. It is important to understand the subtle benefits that you have outlined: getting a cross-functional group in the company involved and taking just a couple of hours to find all the embarrassing issues on the website.

    Great post.


  4.

    So pleased to read this… Two of the great strengths of this approach seem to me to be

    1) Speed: you can shortcut through the agony of endless reporting and concentrate on what needs fixing and how to fix it.

    2) Involving the team in the process. 'Bitchfest' = highly engaged. Avoid loads of communication bottlenecks.

    I particularly like the team aspects. The lunchtime session seems a great way of doing it.


  5.

    Glad someone else is doing heuristic evaluations too! Don't forget Jakob Nielsen's heuristics as another option:


    It also helps to score each heuristic by impact to user, and ease of implementation to fix.


  6.

    Great topic.

    About 10 years ago I took a class on Heuristic evaluations and web usability design from a company called Human Factors International. I received a book with that class that I continue to reference today. Lots of great nuggets of information. For instance, while it's very popular to create dashboards with green and red lights as indicators of an item's status, a percentage of people are color blind and these hold no meaning to them. They suggest using some sort of symbol as well as the colors to convey status meaning.

    There is real value to using page mockups for heuristic evaluations prior to actually developing a site to weed out issues prior to starting development. Once a site is fully developed it is far more costly to go back and fix things. Try to catch & fix problems up front. Of course, we all don't have the luck of having been present for the 'birth' of our websites.

    Lots of great pieces of information and inspiration in your post, Avinash. Good job!


  7.

    Hey Avinash! Great post. I have seen something similar to this from different companies. I would say this was always the process I saw:

    1. Understand how the visitor sees the site. This means including data from analytics (assuming you have it) on top landing keywords, screen resolution, etc. to help place us in the visitor's shoes.

    2. Walk the walk. Go through the website with this information and ask yourself, if I were this person, what questions would I have? How would I try to find the answers? Would I be successful?

    3. Success is measured by both metrics and the "tester experience". We may feel that the client has a great average time on site, but if we feel the answers still aren't there, improvements must be made.

    4. Suggestions and data are presented to the powers that be. The analytics departments pray for changes.

    The main challenges I have seen with this are:

    1. Keeping it from being random pin-point opinions from the testers. i.e. colors they don't like, content that they believe doesn't read well, etc. This we try to avoid by having multiple testers and starting with user data from analytics.

    2. Finding the sweet spot in data to present both internally and to the client. (Obviously this is bigger than just this particular case!) Data that we need internally may seem irrelevant to the client, whereas data that we include for the sake of explanation to the client may seem long winded and less targeted internally. Internal data = instruction manual with a cause; external data = "you need to do this because of this" selling strategy with data. Finding who needs which data and presenting it without duplicating our efforts can be a challenge.

  8.

    Avinash – I'm glad to see the healthy dose of caveats in your post today. An unrestricted group heuristics session is just another name for a bunch of HiPPOs!

    That said, my one experience with heuristics came out of a previous employer. We identified a list of about ten significant issues that we wanted to fix and presented it to our higher-ups.

    The higher-ups agreed with the value, then hired a firm to perform the same study and produce the same results. It's funny how heuristic evaluations can yield much the same results as their much more expensive cousins (e.g., usability testing).

    Finally, I'm a big fan of "Dr. Pete." It's awesome to see how good people and resources always seem to come to the surface on the web.

  9.

    Avinash, another solid post – one that again seems so intuitive to me I just assume we all do these things.

    The last analyst I hired had zero experience in Digital Analytics. Before starting to teach him anything about our tool set, or about the #measure space, I literally told him I did not want to see him for the first two weeks. Instead I gave him a few different consumer scenarios around purchasing a vehicle and told him to go and try to research and buy a car. I told him to use whatever sites and resources, go to car lots, visit dealers, whatever a normal consumer would do (i.e. do NOT limit it to just your own sites); only then would we start his new job.

    That was a key lesson and source of insight that paid off repeatedly i.e. Don't just look at your site – your consumers definitely are not – so how can you really understand them, their time spent, frustrations, what works/does not, etc. with their full experience & not just when dealing with us?!

    Only then, with that mindset, did I teach him to be a digital analytics pro. He has since become well recognized for his knowledge base and strength in this field. We often looked back at how that initial mindset really set him off in the right direction, with the perspective of thinking like the customer and speaking up on behalf of the customer through the insights we have.

    Thanks again, Mark

  10.

    Stefan: Congratulations on completing your thesis! And on web analytics no less, I can only imagine how tough it was.

    I agree with you that the cost and effort required to do usability tests is coming down with every single day. Just recently I started to use http://www.bounceapp.com and http://www.pickfu.com as sweet ways to get VOC and use innovative technology to do user centric research.

    But I'll disagree a bit on skipping heuristic evaluations altogether. People in the company have the best context for what the company is trying to accomplish on the site, and the perspective they can give is unique and valuable. I have also consistently found that heuristic evaluations are 5x faster than even the fastest usability studies. Of course, as I mentioned at the end of the post, you also only identify low and medium issues.

    What I love the most about these evaluations is they get company employees to actually use their own website, this is rarer than it should be, and build a little passion for it across a larger group of people.

    As with all things in our life it is a prudent "And" strategy and not an "Or" choice.

    Thank you again for your kind perspective and for recommending Mr. Krug's book, it is wonderful.

    Tim: In your comment you have hit on two of my top reasons for loving heuristic evaluations!

    Evaluations led by experts and done by just one person (an expert or a novice) are fine. But it is the two reasons you mention that are often underappreciated.

    Tara: The four-step process you have outlined is indeed very desirable; sadly it rarely occurs. There seem to be invisible boundaries between quant and qual people. The first does not feel their job is to ever look at the site, but rather to spend days in Google Analytics and Omniture. The latter feels their job is to conduct usability studies (or other UCD approaches) and present recommendations / slides with no clickstream context. We all know that should not be the case, and yet it is.

    Hence the stress on collaboration and the group heuristic evaluations.

    Thanks for adding the two challenges, they are most definitely things to watch out for (hence the super magnificent importance of step #6 in the process). I love the distinction between "instruction manual with a cause" and "you need to do this because of this". Awesome!

    Josh: Thanks. Step #1 is key. If the tasks and experience are based on what 1. customers are expected to do (company perspective) and 2. customers are actually doing (customer perspective from surveys etc) then it reduces the chances that this becomes a runaway train.

    Post step #1, if people want to focus on colors and buttons then so be it. I think in groups, led right, important matters come up pretty quickly and silly ones are relegated. Steps 5 and 6 then ensure everything (including button fixes) is prioritized optimally. :)

    Totally with you on group heuristic evals highlighting similar stuff to expensive agency-led exercises. Low and medium fruit are obvious; we internal folks can find them just as easily as they can! Hence make them, Agencies / Usability Firms, earn their keep by identifying and solving the subtle non-obvious issues.

    Mark: Excellent point! I hope that others will find inspiration from your story and help others start down a similar path (to hopefully similar stellar results!).


  11.

    These are great points. It's kind of what I call the Annoyance Factor. Does a website tick you off? If it does, whatever is ticking you off has got to stop. Like … on job search websites, when I am asked for my middle name.

    I don't use it for anything. My name is unique anyway. And even if it wasn't, I'm being identified on the site by either my login or my email address (often the email address fulfills both purposes). So, what the heck does anyone care what my middle name is? Even if it's not required, it's still one more thing for me to scroll or click past. That may not seem to be a lot but it is a small annoyance factor.

    Another one — why not ask for my ZIP code first, and get my state and city via ZIP code lookup? ZIP codes are decent database keys. Why not use them? Brighton, MA is 02135. Even if you call it Boston, I will still get your letter. And it's a lot faster for me to type 5 #s than to type 8 city letters plus the state abbreviation. Again, a small timesaver, but a real one.

    Lower the annoyance levels!

  12.

    @Mark great idea. Customer-centric analysis. It's one more reason why analytics really should be owned by the marketing department.

  13.

    Great post Avinash. We have been trying to firm up our usability evaluations and your post provides a nice framework for doing just that. I've used Dr. Peter Meyers's 25-point website usability checklist before and it's nice to know, as Josh B. stated above, that the resources we choose have the seal of approval from our peers.

    Thanks again!

  14.

    Hi Avinash,

    An interesting point, and probably the first I haven't agreed with wholeheartedly – it had to happen at some point :)

    My arguments against this form of usability testing are twofold. Firstly, it can become a 'bitchfest' where people try to settle scores with one another. Given that every department wants some representation on the home page, this sounds like a wonderful opportunity to make the case just one more time.

    Secondly, internal folks are not representative of real, walking, talking customers. Internal people often look at the website day in, day out. Few of your customers will do the same. I think what you want, from a usability perspective, is a fresh pair of eyes, and I am not convinced you get that here.

    Anyway, thought I would add my two-pence worth.

    Keep on keeping on.


  15.

    Dan: Both your points are very valid, and excellent observations about the process.

    If the culture in the company is so corrosive that people turn on each other then this is not a great process to follow. Though I am uncertain how, in that case, even formal usability testing would help; the culture will still promote score settling. Perhaps going directly to live A/B or MVT testing is the path (though even then the culture might wreck things).

    The second one I address in the blog post in the "Things to Watch For" section. It is not truly walking in the customer's shoes. My hypothesis (and that of others who promote this agile method) is that most low and medium hanging fruit will be caught by anyone, including a non-customer (i.e. you and me!).

    Some nuance might be lost, certainly the "high hanging fruit" will be missed, but then that was not the goal. You and I were going for "fixing the most simple broken things as fast as possible while building company camaraderie".

    Thank you so much for highlighting these two issues, caution is certainly advised!


  16.

    Excellent post. I regularly do heuristic evaluations in my current company; in fact, I consider this an integral part of any deep-dive analysis. Without it, you're just spotlighting issues without truly identifying "why" these are issues, and what can be done to remedy them (which is where analytics creates an ROI).

    The 25 point checklist in the post is good… but I prefer Jakob Nielsen's 10 heuristics for user interface design; it's simpler and right to the point! (http://www.useit.com/papers/heuristic/heuristic_list.html)

  17.

    Great post! The bitch-fest is a nice twist I haven't heard of before.

    One thing to keep in mind when working with heuristic methods, though, is that they tend to produce more usability problems than traditional user tests such as the think-aloud test. So when working with heuristics, the test leader should be aware that there is a danger you'll end up using the money you saved from not doing a full test to fix problems that aren't really there.

    If you would like more info on this, check out Niels Ebbe Jacobsen's Ph.D. thesis, where he compares different testing methods.

  18.

    Thanks for the great post!
    I have been learning many things by reading your blog. Heuristic evaluation is an excellent concept for better analysis.
    I have just bought Web Analytics 2.0 and finished the 1st chapter :-)


  19.

    Great post. I just finished reading your book (WA 2.0) and this is an add-on to it with exact steps defined, something I can start using with my clients immediately, along with the checklist.

  20.

    Very timely post for me…just preparing a usability training deck for an analytics team.

    In the field of usability it is often said, and proven, that 95% of design problems can be found with only 10 users.

    Thanks Avinash for another great post!

  21. Tyler Greathouse says

    Great post Avinash. I agree that the communication of findings to HiPPOs can be the most difficult part of the evaluation.

    One approach I found useful was a framework similar to the Nielsen Company's Brand Association Map. Just used a simple 1-10 scale and ranked each suggestion on:

    1) time to complete and

    2) cost to complete and then just inserted them into the target framework

    This reduced the amount of digestion the HiPPO had to do. The ranking could also be based on increase in Net Income or Admin capability or whatever.

    Team heuristic evaluations have been hugely successful in my experience – beyond simply pointing out the usability issues, they provided opportunities to put pro-usability people in the same room with anti-usability people and really start a dialogue about what the team could do to improve the user experience. And we even converted a couple of the naysayers!

  22.

    Hi, it is a Great Post!

    I run a similar heuristic evaluation, called an engagement evaluation, in my current company. I do agree with you about the difficulty of using this evaluation to report findings and generate recommendations.

  23.

    Hi Avinash,

    I remember you once said (and I agree) that your site's visitors don't all share one single behavior; a thousand visitors come to your site, each with their own intent, expectations and behavior, and that's why segmentation is important.

    My question is: how can a few individuals represent why our visitors do certain things on the site?

  24.

    Michael: You are absolutely right that our website traffic does not behave like a monolith. Hence segmentation is the key to any success, small or magnificent, that any decent Analyst hopes to achieve.

    But there is a difference between segmenting down to every single visitor and segmenting down to micro segments – i.e. small groups of individuals who share a common Acquisition Source, Behavior (intent) or Outcome. This helps us identify monetizable sizes of traffic, understand and meet their needs and achieve scalable success.

    The problem with segmenting down to individual people is that you (/your company) can't possibly find the time, resources and effort required to understand every single person who comes to your site, react to them at an individual level, and then achieve a desirable outcome (for the person and your company). Hence, for now, this individual stuff is a bit of an oversell / an unachievable goal (and hence inadvisable).

    Systems and artificial intelligence in the long run will solve that problem. But for now focus on micro-segments and you'll win big.


  25. 25

    Good article – a couple of typos if you drill down into the details.

    It's amazing to think how many teams in the world are building/running websites.

    Prototype vs playground.

    Guys, should we make something new here?
    Girls, should we build it like this page, e.g. ….

  26. 26

    Excellent post, thanks!

  27. 27

    Hi Avinash!
    First I want to thank you for writing the book Web Analytics: An Hour A Day. I read every word in the book in about 5 days.

    I have been out of the internet industry since 2001. I was a web marketing consultant for 6 years.

    In 2002 I opened a real estate company and did well until 2008. Since then I had to close down my business, went through all my savings, and had to get government help to provide for my wife, 3 kids, and myself. I was at a very low point in my life. I am 34 years old.

    After applying to over 200 jobs I realized there were not many jobs for very motivated ex-business owners with outdated internet skills.

    About 2 months ago I started to improve my internet marketing skills and came across your book. It has made a huge difference in my life, even though things have not changed financially for me yet.

    I want to ask some advice:

    I have developed a few products with some other people in the nursing field. They are training-specific products, and I want to know the best things to track for a small business website that may only have 10 web pages, where the main object of the site is to educate nurses and sell them an information product.

    Also any other advice would be helpful.

    Thank You again,


  28. 28

    Gonzalo: Use something like the 4Q survey to measure Task Completion Rate; if people are being educated by the content, then that rate should be high. See this post:


    Use a tool like Google Analytics or Yahoo Web Analytics to track where people come from (keywords, referring sites etc) to figure out what they are interested in and how to make the site better for them. See this post:


    Finally, identify a Macro conversion (a lead, a download, something) and two Micro conversions (repeat visits or downloads or phone calls or whatever). Set these as Goals in GA or YWA and track them. See this post:


    All the best!


  29. 29

    Thanks for one of the best posts on the subject. Jakob Nielsen's 10 golden rules ( http://www.useit.com/papers/heuristic/heuristic_list.html ) have been a great resource, but yours takes this further and gives us a solid approach.

  30. 30

    Great post!

    I'm evaluating a web application and was wondering if this same type of heuristic evaluation can be used. If not, is there a heuristic evaluation specific to web applications?

    • 31

      Terrance: The broad structure would be exactly the same, though some of the elements of the 25-point checklist might be different.

      So follow the spirit of the recommendations, and the exact steps of the process outlined in this post. :)


  31. 32
    David Larson says

    Avinash: As always, a thought provoking post. I'd like to use a more quantitative methodology to augment our cognitive walk-through approach – both to standardize the work, and to provide a metric for comparison and prioritization purposes. We never used a checklist for the reviews, but instead relied on the expert to recall the heuristics.

    As I searched for heuristic checklists, I found that the lists were either very short (10-25 items) or very long (200-250 items). For example, on your recommended list of 25 items, "search" has one heuristic ("Is it easy to access"), while on this list (http://www.userfocus.co.uk/resources/searchchecklist.html) there are 20 heuristics for search alone – the whole list has 247 heuristics. Is there a law of diminishing returns with this type of effort?

    I would feel remiss if I reported that a review of the site found no issues with search when I hadn't systematically evaluated the search results page.

    In the end, it's not that the 247 heuristics aren't good ones; but if the method is too time-consuming, or too expensive, it isn't going to be used.

    Any advice on striking the right balance?

    – David

    • 33

      David: Really great question. And is there ever an end to how much we can know? :)

      Part of knowing where to stop is the art of recognizing when you've reached the point of diminishing returns. Part of it is knowing what the big broken pieces are, or what the actual fixable reality is. Part of it is knowing what will make the customer (or your boss) happy.

      So my approach in these cases is to collect a comprehensive list, then distill it down to a small clump of 25 or so items that will help identify the big things, based on what the business/customer values. Then do those. Once that's taken care of, expand to more questions from the list. Then more. So you keep evolving.

      Please consider the list you see in this post as a good starting point for most people. I was not trying to be comprehensive.

      Hope this helps a smidgen.


Add your Perspective