Web Analytics Success Measurement For Government Websites

Prickly Problem: If you know what the desirable outcomes are for your website, it is not hard to measure the website's performance for you and your customers.

Measuring the top-line success of ecommerce websites is not very complicated: all the sweet revenue-based outcomes are there (at the least).

Measuring non-profit websites is a bit complicated, but not really all that hard because we can, with a small amount of love, figure out outcomes to focus on (donations, # of sign-ups for the protest in DC, # of petitions signed, volunteer applications, etc).

Measuring government websites is a bit more complicated, if for no other reason than that it takes a pinch of effort with a dash of imagination to figure out what one is solving for. What are the desirable outcomes one can focus on to measure success?

The above question came to mind from a kind note I got from Ines Jans, who is a part of the team that is responsible for www.belgium.be.


Ines and team were just starting to think about analytics (because they love their customers!) and asked for some thoughts.

My first question to Ines was, (surprise!):

Q: Tell me a bit more about what your site does, like what are the real goals (or give me some ideas about it) and what challenges you face, what do you expect people to get out of it?

[Best Practice: Always, always, always start any measurement conversation with the above inquiry. The answer will be key to insights, and without it you'll simply be a glorified Reporting Squirrel.]

The answer, which might fit most government websites was:

A: The goal of our site is to be a portal to all the official information there is about Belgium and make information easy to find. Visitors should be able to figure out which Ministry is responsible for what tasks.

[Best Practice: Don't be surprised in your Analysis Ninja quest if you get answers that just start the conversation, rather than give you a prescription for what you need. Squirrels will despair here, but Ninjas will take clues from what they hear, visit the site, and come up with a set of important measurable outcomes.]

Based on the answer above and some time spent on the English language site as well as those in other languages (Google Translate!), I came up with the following five questions I could ask of the data to measure success.

~ Are Visitors able to find the information they are looking for?

~ Are the Visitors satisfied with their experience?

~ What is the most popular content on the site? Which areas can we prioritize higher than they currently are?

~ How long does it take for someone to find key information they want?

~ Does the right information actually exist on the website? What major things might we be missing on our website?

Let's take each of these questions one at a time and figure out the best way to answer each using a true Web Analytics 2.0 strategy.

Q1. Are Visitors able to find the information they are looking for?

Given that the singular purpose in life of this government website is to be the one-stop shop for all the information one could possibly need, it should be pretty obvious that the very first, and magical, thing we would measure is whether Visitors to the site are able to find what they might be looking for.

So would you use a web analytics tool?

Here's the first surprise: No!

There are certainly tertiary ways in which you can answer this question using Omniture's Site Catalyst or Google Analytics or other wonderful web analytics tools.

[Image: website task completion rate]

But the best way to answer this question?

Ask the Visitors!

Using a simple survey that pops up on exit (when Visitors leave the website), you can ask your customers to tell you if they were able to complete their task. No interpretation required.

4Q from iPerceptions, available in 18 languages, is a free on-exit survey you can use. If you don't want to use an external survey, build your own: ask four questions and analyze the data for:

"Were you able to complete the purpose of your visit today?"

The answer to this question becomes the #1 Key Performance Indicator (KPI). You are going to watch it like a hawk, you'll post it on all your bulletin boards, you'll set up custom alerts to ensure that your team gets a small electric shock every time this number drops below 65%!
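To make the "watch it like a hawk" part concrete, here is a minimal sketch of the KPI-plus-alert logic. The survey responses and the 65% threshold are illustrative (use your own data and your own agreed threshold):

```python
# Hypothetical on-exit survey responses: True = "Yes, I completed
# the purpose of my visit", False = "No".
responses = [True, True, False, True, False, True, True, False, True, True]

# The #1 KPI: task completion rate.
task_completion_rate = sum(responses) / len(responses)

# Fire an alert (the "small electric shock") when the KPI drops
# below the agreed threshold.
ALERT_THRESHOLD = 0.65
needs_alert = task_completion_rate < ALERT_THRESHOLD

print(f"Task completion rate: {task_completion_rate:.0%}, alert: {needs_alert}")
```

In practice the alerting would live in your analytics tool's custom-alert feature; the point is simply that the KPI is one number with one threshold.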

The overall number is good enough, but the data that will be awesomely actionable, if you use 4Q, is Primary Purpose by Task Completion Rate. . .

[Image: primary purpose by task completion rate]

You see the second question in 4Q is "Which of the following best describes the primary purpose of your visit?" and a standard report in 4Q will paint the above picture.

Now you not only know if people find what they are looking for, but you also know which tasks are hard to complete.

You need to fix "Complain about the French" : ) because the Visitors are already upset and only 5% are able to complete their task, resulting in them becoming even more mad!

Remember: You don't need to show the survey to everyone who comes to your site. You can sample just a small percent of your Visitors. You only need 300 responses in a month to get a statistically significant sample of data, and 1,200 if you want to do segmented analysis.
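For the curious, the 300 and 1,200 figures can be sanity-checked with the standard margin-of-error formula for a proportion. This is a rough sketch, assuming simple random sampling and the worst-case p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95%-confidence margin of error for a proportion measured from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# Roughly +/- 5.7 points at 300 responses: fine for an overall KPI.
moe_300 = margin_of_error(300)

# Roughly +/- 2.8 points at 1,200 responses: tight enough to compare segments.
moe_1200 = margin_of_error(1200)
```

So at 300 responses your overall task completion rate is known to within about six points, which is plenty for trending a KPI month over month.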

Q2. Are the Visitors satisfied with their experience?

HiPPOs (the "highest paid person's opinion") in the organization, even in government, will love to have a more direct (than task completion rate) answer to the question: Are our Visitors happy with our website?

That's where it is prudent to measure Customer Satisfaction.

4Q and other surveys of course measure that quite easily: "Based on today's visit, how would you rate your site experience overall?"

Measure it. Trend it. Report it. Correlate the trend over time with changes you have made to the site and identify insights (any causal connection between site improvements / campaigns and customer satisfaction?).

An alternative, or additional, way to measure satisfaction is to count and analyze the Contact Us submissions. . . .


Start with the number of submissions. Trend over time.

Drill down into the type of complaints and do at least rudimentary sentiment analysis (i.e. read & categorize) of the actual messages to gauge customer satisfaction.
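A "read & categorize" pass can even be roughed out in code. The sketch below is deliberately crude: the categories, keywords, and messages are all hypothetical, and a real pass would still involve a human reading the hard cases:

```python
# Hypothetical complaint categories and the keywords that signal them.
CATEGORIES = {
    "broken link": ["404", "broken", "dead link"],
    "missing info": ["can't find", "cannot find", "missing"],
    "praise": ["thank", "great", "helpful"],
}

def categorize(message):
    """Assign a Contact Us message to the first matching category."""
    text = message.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"

# Hypothetical Contact Us submissions.
messages = [
    "The permit form gives a 404 error.",
    "I cannot find opening hours for the ministry.",
    "Thank you, very helpful site!",
]

counts = {}
for message in messages:
    category = categorize(message)
    counts[category] = counts.get(category, 0) + 1
```

Trend those category counts over time and you have a poor man's satisfaction signal, for free.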

Remember: When you do surveys you don't have to torture your Visitors with billions of questions! In researching this post I went to US government sites and got an ugly 34-question, single-page, looooong survey. 34 questions! Most were irrelevant. I would have answered a few, but this shows a fundamental disrespect for the customer. In the end your Visitors are upset and you suffer from a lack of data.

Only ask what you can action.

Q3. What is the most popular content on the site? Which areas can we prioritize higher than they currently are?

It is not unusual for content sites to produce content. It is even less unusual for them to produce content that they think potential visitors to the site might want.

What is rare is the analysis of what visitors to the site are actually consuming on the site.

Here's a simple analysis I learned from Tim Hart (who was with the J. Paul Getty Trust): Measure the distribution of content in each section of your website and the percentage of Visits to each section.

[Image: content vs visit distribution]

On the y-axis is each of the sections on the belgium.be website. In blue is the amount of content in each section. In red is the percentage of visits in which that content was consumed.

Is it not awesome! Insights galore!!

If this were their data, and it is not, it would be pretty obvious that there is huge interest in content about Housing and Economy, yet only a tiny fraction of the site's content is about Housing and Economy.

The balance for Family is a lot less lopsided.

While the government might love Justice, Mobility and Health (and boy do they love Environment!), Visitors to the site are a lot less interested in those pieces of content.

Action? You know what people want, how about giving them more of that content?

When it comes time to prioritize the next set of web pages or videos or podcasts, how about giving higher priority to those big red lines?

Sweet right?

You can also do a segmented version of this analysis: see what Visitors to the English, Dutch, French and German sites prefer, or which group of content people like within Family. Etc etc.
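The mismatch hunt itself is a one-liner once you have the two percentages per section. A minimal sketch, with made-up numbers (not belgium.be's data!) and an arbitrary "visits more than double the content share" flag:

```python
# Hypothetical per-section data: (share of site content, share of visits).
sections = {
    "Housing": (0.05, 0.25),
    "Economy": (0.04, 0.20),
    "Justice": (0.30, 0.08),
    "Family": (0.20, 0.22),
}

# Flag sections where the visit share far outstrips the content share.
# These are the "big red lines": prioritize new content here.
underserved = [
    name for name, (content_share, visit_share) in sections.items()
    if visit_share > 2 * content_share
]
```

The 2x threshold is my own assumption; the point is simply to surface the sections with the biggest gap between what you publish and what people consume.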

Two more ideas to get into your Visitor's head. . . .

Measure Downloads:

There are a ton of downloads (PDFs, mostly) on the belgium.be website. Forms, applications, useful guides (like how to marry a Belgian or how to prepare for your first job), etc.

It is a trivial cost, analytically, to track these downloads using your web analytics tool. Do it. Measure what your Visitors are most interested in.

[Image: tracking downloads]

Yes, yes, yes, I see my technical squirrel friends raising their hands and saying you can only track that someone clicked on the download link, not that the download was successful. I know.

For our analysis here just intent is fine.

In fact, unless a vast majority of your Visitors are connecting using dial-up, it is safe to assume the download of small files went through. I know that does not make the squirrels happy. I am sorry. You keep squirreling while we make decisions about how to improve the site.
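Once your tool is recording download clicks as events, the "what are Visitors most interested in" report is just a tally. A sketch with hypothetical event records (your tool's export format will differ):

```python
from collections import Counter

# Hypothetical click-event records exported from a web analytics tool.
events = [
    {"action": "download", "file": "/forms/marriage-guide.pdf"},
    {"action": "download", "file": "/forms/first-job-guide.pdf"},
    {"action": "download", "file": "/forms/marriage-guide.pdf"},
    {"action": "pageview", "file": "/health/h1n1"},
]

# Count download-link clicks per file. Remember: this measures intent
# (the click), not a confirmed successful download.
download_counts = Counter(
    event["file"] for event in events if event["action"] == "download"
)
```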

Outbound Link Tracking:

Another thing you'll notice about the website (see why it helps to surf a site you are supposed to analyze?) is that there are a ton of links on the site that point to other government websites.

Track 'em!

[Image: outbound link tracking]

Of course the above is not their data :), it's just an illustration of how absolutely easy it is to track this data.

From the report it is very easy to then figure out what links your Visitors click, which is a great, positive, indicator of the fact that they found what they wanted and also what they were interested in.
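The outbound report boils down to the same kind of tally, this time per destination domain. A minimal sketch with invented click records (not their data either!):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical outbound-click records: the URLs Visitors clicked to leave.
outbound_clicks = [
    "http://www.fgov.be/some/page",
    "http://ec.europa.eu/info",
    "http://www.fgov.be/other/page",
]

# Tally clicks per destination domain: which other government sites
# do Visitors actually want?
clicks_by_domain = Counter(urlparse(url).netloc for url in outbound_clicks)
```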

Remember: It is not very hard to do any of the above three types of analysis. All you need to do to get into your customer's head is move away from "Top Pages Viewed" and "Page Views Per Visitor" and think a bit more creatively.

Q4. How long does it take for someone to find key information they want?

There are some pieces of content that are so darn important that they are heavily linked (say, latest news in the case of belgium.be) right from the home page, or that you really do want people to find asap (in the Health section, for example, the PDF about how to deal with the H1N1 virus in Belgium).

For these important pieces of content measure Average Time To This Page.

[Image: average time to this page]

That's almost three minutes from the time that someone entered the website to the time they found this page (say the one about swine flu).

On average people give a page two and a half seconds before they click/leave. Consider how long three minutes is, and how many people might have given up in the process of finding this key information.
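If your tool does not report this metric, you can approximate it yourself from session data. A minimal sketch, with hypothetical numbers: for each session that eventually reached the key page, the seconds elapsed from site entry to the first view of that page:

```python
# Hypothetical per-session data: seconds from site entry to the first
# view of a key page (say, the H1N1 PDF page), for sessions that got there.
seconds_to_key_page = [150, 210, 95, 240]

# "Average Time To This Page", in seconds.
average_time_to_page = sum(seconds_to_key_page) / len(seconds_to_key_page)
```

Note the caveat baked into this number: it only covers Visitors who found the page at all, so the real pain (those who gave up) is even worse than the average suggests.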

Unfortunately not too many tools, Google Analytics and other paid solutions included, provide this as a standard metric. I use ClickTracks, which has this delightful metric as a standard offering.

I wish others would have it.

Remember: You can use this data to ensure that your best information is found by Visitors to your site quickly. Fix your top / left / right / bottom / whatever navigation you have on the site. Consider creating a prominent "box" on the top right where you "merchandize" these important links. More things like that.

Q5. Does the right information actually exist on the website? What major things might we be missing on our website?

I have consistently advocated my love for internal site search analysis. It is simply da bomb!

Like many other sites, belgium.be has an internal site search engine. Typically Visitors who have a harder time with normal navigation (or find limited data on a page) will make liberal use of this site search box.

Why not use that data?

[Image: internal site search analysis]

Again, this is not their data :).

I recommend looking at the top search terms typed by Visitors, but then also looking at the metric % Search Exits.

That's the bounce rate of your search results page. I.E. People come to belgium.be, search for the term hippo and the search results are so bad that 33.33% of the people exit from that page! They don't even bother to do anything. Just bail. Bounce. Kaput!

Now you know: 1. what information they were looking for, 2. which search results stink, and 3. that it is likely because you don't have the right, or enough, content about that keyword on your site.

Fix it!
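Computing % Search Exits per term is straightforward if your tool exposes the raw searches. A sketch with hypothetical records, each pairing a search term with whether the Visitor exited straight from the results page:

```python
# Hypothetical internal-search records:
# (search term, did the Visitor exit directly from the results page?)
searches = [
    ("hippo", True), ("hippo", False), ("hippo", False),
    ("tax form", False), ("tax form", False),
]

def search_exit_rate(records, term):
    """% Search Exits: the bounce rate of the search results page for a term."""
    exits = [exited for t, exited in records if t == term]
    return sum(exits) / len(exits)

hippo_exit_rate = search_exit_rate(searches, "hippo")
```

Sort your top terms by this rate and the "fix it!" list writes itself.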

I have one more idea to understand if you are missing information that your Visitors want on your government website.

Use Page Level surveys.

[Image: TurboTax page-level survey]

There are free page level surveys available or you can build your own (like the one above from a software vendor's website).

These can be an excellent way to understand what content is missing from your website. You can of course also use the open text voice of customer (VOC) from surveys like 4Q, look for Visits with Task Completion = No.

Remember: These surveys don't collect any personally identifiable information (PII), and that goes for your web analytics tools as well. Many government sites are extra concerned about privacy, as they should be. Please familiarize yourself with the privacy policies of the vendor.

Note: What was not tracked or emphasized. . .


Unique Visitors.

Page Views.

Time on Site.

And so many more mundane and perhaps more "famous" web metrics.

I am sure most government, or normal, websites jump to those first. And why not, they are all staring you in the face when you crack open any analytics tool.

The problem is that these aggregate metrics barely contain any insight. If you focus on them, you'll be left holding an empty bucket / cry a lot / get fired / not get your government pension / dread meetings with your boss.

I hope the above ideas inspire you to do more, go beyond the obvious and less than useful.

One last quick example. . .

You can use the same strategy for other sites. Though remember: the job the site is trying to do and the desired outcomes will decide which key performance indicators you end up using.


For example, for www.recovery.gov, in addition to some of the metrics above I would probably also measure Visitor Loyalty and Recency. That's because the government wants the data provided to be sticky; it is updated frequently, and they want you to come and check it again and again.
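As a sketch of what a Recency report boils down to: bucket each returning Visitor by days since their previous visit. The values and bucket boundaries below are my own illustrative assumptions, not any tool's standard:

```python
# Hypothetical days-since-previous-visit values for returning Visitors.
days_since_last_visit = [0, 1, 1, 3, 8, 30, 2, 0]

# Bucket recency the way a loyalty report would.
buckets = {"same day": 0, "1-7 days": 0, "8-30 days": 0, "30+ days": 0}
for days in days_since_last_visit:
    if days < 1:
        buckets["same day"] += 1
    elif days <= 7:
        buckets["1-7 days"] += 1
    elif days <= 30:
        buckets["8-30 days"] += 1
    else:
        buckets["30+ days"] += 1
```

For a frequently updated site like recovery.gov, you want that distribution skewing hard toward the left-hand buckets.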

In this case, perhaps more than downloads, I would also measure the # of customized graphs created. When I measure content consumed (#3 above) I'll probably focus on understanding which departments get looked at more on the site (are they the ones spending the most money?).

You can also bet I am going to be totally on top of reporting how many complaints we have received on the site for Fraud, Waste & Abuse! Getting a ton of those would be a key performance indicator! : )

Makes sense?

Don't despair just because you have a government site. Ignore the obvious. Focus on the site's jobs. Identify key outcomes. Do productive analysis.

Good luck.

Ok now it's your turn.

Are you responsible for a government website? What are your key performance indicators? What web metrics are important to you? Do you use any of the above strategies? If not, why not? Have you looked at www.belgium.be? What would you have recommended that I did not?

Please share your valuable advice / insights / feedback / critique.


PS: Like this post? Perhaps you'll consider ordering my * new * book: Web Analytics 2.0.

Couple other related posts you might find interesting:


  1.

    Hi Avinash,

    as usual I enjoyed reading your post very much. I have one question regarding the analysis in Q3 though:

    How to measure the distribution of content?

    The obvious answer is to count distinct URLs, but this does not work when some parts of the website offer long texts, tables and pictures on a page while others look more like a directory of sorts (short pages, little content).

    Measuring the number of words over all pages of a section also breaks down when the content is not in words but in numbers, images or multimedia.

    So why not measure [%visits] against [pageviews]*[avg. time on page]?


  2.

    You stated : "Measuring non-profit websites is a bit complicated, but not really all that hard because we can, with a small amount of love, figure out outcomes to focus on (donations, # of sign-ups for the protest in DC, # of petitions signed, volunteer applications, etc)."

    I strongly agree, and have designed a useful KPI related to visitors' loyalty. On e-commerce websites, RFM segmentation is used, where "M" relates to Monetary. When it comes to non-profit websites, RFM would change into RFV, where "V" stands for Volume of pages viewed. This is a relevant leading KPI to monitor loyalty.

    Best regards.
    Eric Hobein

  3.

    This is an incredible post Avinash. While it was written for Government websites, I found it useful for my work in ecommerce websites.

    We very rarely think of the non-ecommerce component of our website, and this shares a framework that we can use to judge the success of everything other than direct conversions.

    Keep up the great work.


  4.


    Very interesting, as always.

    I'm a 23 year old marketing manager at a company that runs 3 very large wiki sites, so analytics have become absolutely central to figuring out

    1) where visitors (I focus more-so on non-registered, new visits) are coming from and/or with what keyword search,

    2) where they are landing,

    3) if they convert (sign up to be a free user/contributor), and

    4) what they contribute to the site, be it in forums, comments on videos, or best yet, expanding our wiki.

    The one thing I've noticed so far is that there is, at least for my sites, an almost untamable amount of data to look at. Your entries definitely help me find some focus in my work. Much appreciated!

  5.

    Hi Avinash –

    As usual an excellent and thought provoking post.

    Excellent point about internal site search analysis. Some government sites leave so much to be desired in terms of site navigation that it is almost second nature to head to the search box.

    Same question as in one of the previous posts: how to measure content? I understand the intent of matching content vs. visits to see where the mismatch is. I am thinking that by content you mean something that reflects the effort of the website owner in creating the individual sections, but I cannot wrap my head around what this measure would be in a pragmatic sense.

    Thinking about the previous post again, I feel visits and duration (page views * time spent) are both forms of engagement that are reflective of outcome, and probably correlated to some extent as well.



  6.

    Nice post!!

    All the SEO lessons now need to be redesigned :)

  7.

    Solid information and easily understood. Q3, on measuring visits and content is the Bomb. Thanks!

  8.

    I'd just like to attest to the glory of 4Q. I witnessed an entire change in project scope because the results of our 4Q survey told us customers weren't interested in what we wanted to do. They just had a few basic needs that we weren't meeting. Phew, data to the rescue.

    Ask the customer. How easy is that? I'm so looking forward to your book coming out, Avinash. Thanks again for the great information!

  9.

    I wish I were still working at a government agency just to try all of this out! I would definitely try to use that Q3 tip in other areas as well.

    The Grok also posted on how to help non-profits know what to measure (http://www.grokdotcom.com/2009/07/29/turning-web-analytics-into-nonprofit-success/), combined with this post it's a lot of great resources!

  10.
    Alice Cooper's Stalker says

    Avinash, Good post and interesting topic. I'm glad you brought up internal site search. I would have suggested using internal site search earlier for:

    Q1. Are Visitors able to find the information they are looking for?

    You can have an internal site search report that offers up search terms that resulted in zero search results. This could give you insights into content that people are searching for and not finding. I think that the report you suggested with % Search Exits is a powerful one…one that I hadn't thought of before and that I will try to recreate with my tool.

  11.

    Eric: Great suggestion! I think RFM can be modified to non-profit websites. V (volume of pages viewed) can be a good alternative.

    In the past I have written about computing the "economic value" of our online activities. In that sense perhaps we can keep the M as "monetary", but we would compute Economic Value rather than direct online revenue. Food for thought.

    Ole: The easiest and most direct source of this would be your website's CMS (content management system) / platform. While most CMSs lack sophisticated reporting, they are able to provide a ton of "content count" reports.

    For example even WordPress, my blog's CMS, allows me to easily know exactly how many of each type of posts I have (how many about qualitative analysis, paid search, career advice etc etc). I can easily take that data and see % visits from my web analytics tool.

    Sometimes you'll have to get your IT team and Content Producers to start tagging the content they produce, that will get you the data.

    If all else fails just ask them to do a count of content pieces in each directory and use that (and remember you can also easily report on directories and % Visits or % Page Views in your web analytics tools).

    Remember you can't really measure % Content using your web analytics tools, because for your web analytics tool to know you have a piece of content it has to be viewed at least once in a web browser. Lots of your content might never be viewed by anyone, hence it will be missing from your web analytics tool and your analysis will be imprecise.

    Kailash: To your second point, page view and time spent have some marginal value (so they are not totally useless). The challenge is you'll have to overlay so much of your own opinions and interpretation into that data that any resulting insights will have, well, marginal value. When you have cooler and more impactful stuff you can measure, why not? : )

    Rick: I would agree with you at least a bit, in the sense that Site Search would be my first clickstream source.

    I have to admit I am still biased towards asking the Visitors first, before I overlay interpretation. Other than that small difference, we are in the same boat! :)


  12.

    Excellent post!

    As for the answer to Q3, I take the most popular content report with caution. Sometimes the most visited content is not the most interesting for the visitor, but the one that's more prominent on the homepage, the one that's easier to find, or the one that got very good rankings on Google.

    Also, just because a piece of content is visited a lot, does it mean you need to produce more of it? Maybe it's perfect just as it is, and that's precisely why it's so popular… I would rather place that content in a more prominent position on the webpage, for instance.

    Using top content reports without a contextual analysis or more qualitative metrics can lead to the wrong decisions…

    What do you think?

    See you tomorrow btw!


  13.

    What a surprise to see the Belgian governmental portal in your post! :-)
    Funny to see that you spent time analyzing it – you probably explored it more than I ever did. Hope you survived the experience. Belgian government organization is not known to be the simplest – even for us Belgians :-)

    This is a really interesting post – even for those not working in that area. Some KPIs & advice can easily be reused for other site types.

    Also, your idea of % of visits per content vs. % of site content is a very good one. I will certainly reuse it. While it sounds so obvious, I did not think of it.



  14.

    Hi Avinash

    You said "From the report it is very easy to then figure out what links your Visitors click, which is a great, positive, indicator of the fact that they found what they wanted and also what they were interested in."

    Not sure I follow this. I don't understand why the links a visitor clicks can indicate whether or not they were happy with your content. Could you elaborate?


  15.
    Richard Warzecha says

    Sorry, but I cringed when I saw you discuss Bounce Rate. Here again is what you said: "That’s the bounce rate of your search results page. I.E. People come to belgium.be, search for the term hippo and the search results are so bad that 33.33% of the people exit from that page! They don’t even bother to do anything. Just bail. Bounce. Kaput!"

    I think Bounce Rate is one of the most misused and misunderstood metrics. Say, for instance, that people found a very useful PDF on your site via a search engine, went directly to that page, and were completely satisfied with the experience. Isn't that a GOOD thing? Sure, it might be NICE to have them hang out for a while, but it's not the overly disappointing experience you seem to portray. Certainly more in-depth analysis is required.

    Also, good/bad bounce rates vary widely with the kind of page: high rates from your home page are generally bad; from more detailed pages, generally not as bad.

    Finally, Bounce Rate is a metric which really needs to be broken down by Source. Visitors coming in from organic search, paid search, referrals, and direct can have much different behavior on your site, and Bounce Rate is one of the metrics which could vary the most between these sources.

    All in all, I really liked your discussion, though. Very relevant and insightful, I'm just often sensitive to Bounce Rate discussions. Don't we all have our favorite/least favorite metric?

  16.

    I currently work at the top level of a very large U.S. government website. While we'd love to track visitor loyalty and things of that nature, we can't due to privacy issues. By law, we are not allowed to use persistent cookies, which prevents us from knowing unique visitors, loyalty, etc. If you've got an idea of how to measure loyalty some other way, I am all ears!

  17.
    Chad Henry says

    Hi Avinash,

    Maybe I'm misunderstanding you, but I think your Q3 – content vs. visits – and the insights you draw from it are slightly flawed. Just because "Justice" has high content and low visits doesn't mean visitors are not interested in it. It's possible that the content is just buried 10 levels deep, it's not well linked to, the search engines haven't indexed the content well yet, the website search function isn't picking up a category for some reason, or the topic is extremely competitive and the site doesn't have good SERP rankings, and hence a lower visit rate for that topic.

    Or it could be that potential visitors are interested in the Justice topic, but the titles displayed in Google's SERPs for the content are boring the potential visitor and they simply don't click through to it.

    My main point is: just because a category has high content volume but low traffic volume doesn't mean it's not what visitors want. It could mean there is something seriously wrong with the presentation or spiderability of the content. I think if people take action on "You know what people want, how about giving them more of that content?" without asking WHY (why does this category have high content but low traffic?), they could take their site content in the wrong direction without fixing the underlying problem.

  18.

    Pere: I encourage people to look for % numbers that have a huge mismatch (as in my graph), and to do the analysis with the goal of it being a thought starter. As an example, it is difficult to imagine that 80% of your content is consumed by 25% of your site traffic.

    Michael: I am encouraged that Ines reached out to me from the Belgian government; it shows that they care and that they are starting to be a lot more data driven.

    It was also fun for me to learn about Belgium. :)

    Paul: The thought is… if on my website I have provided a whole host of links, the only condition under which they will get clicks is if they are relevant to the people who come to the site.

    Looking at that report, in the post, will help me know: what are the links that get clicked a lot, are they the ones we want clicked, if not why not, if yes hurray, etc etc.

    If people do not find relevant links (or find most links on your site to be useless) they simply won't click.

    Richard: I think the context of my comment is key here. I said: "That's the bounce rate of your search results page." All other misunderstanding of bounce rate aside, you'll agree that if I come to your site, search, and leave immediately after I see the search results page, then that is not a good thing.

    We can work hard to find a scenario where that might be ok: maybe I search for your phone number, it is listed #1 on your search results page, I don't need to click anything, and that is counted as a "search results page bounce" even though it is a good thing.

    I doubt we can find a lot of those scenarios, hence my assertion that "the bounce rate of your search results page is of value".

    For the website or webpage bounce rates here are two posts that might be of value:


    Chad Henry: If the analysis gives you pause and makes you rethink, then my job is done. :) Each person will react to the data differently; I would really really ask myself why we are producing all this content no one seems to want, and not the content people do want. But as you say, there might be some redeeming value to all that content.

    Mel: For now, no cookies, no web analytics loyalty reports. Consider using surveys and asking the loyalty question.


  19.

    Hey Avinash – sorry for the late comment but I was out all last week :)

    I really like this post because it focuses on the non-traditional type of website, which includes a .org or a Government website. A "normal" site's goal would be to sell something or to collect leads, so the challenge is tougher with these two: determining the reasons these websites exist in the first place… and if I may add to it, don't accept "To get more visits" as an answer (I've been given that answer – it doesn't help and shows that your client / customer needs to be schooled fast!)

    I also REALLY REALLY like "Average Time to This Page" from ClickTracks. This can also show how effective or ineffective a landing page is for you, based on whether it takes a short / long amount of time for someone to reach that key page of your site (if it's taking too long, something's wrong!)

    Finally – if you can read this comment, please go and buy Avinash's new book Web Analytics 2.0. Buy his "Web Analytics: An Hour A Day" book also, why not. It's a rite of passage to read both books on the way to becoming a Web Analytics Ninja. :)

  20.

    Hi Avinash

    Very interesting points. I however cannot resist making a case for the good old page view in relation to content engagement.

    Now, our work and clients are focused on the Nordic region, where government websites have a very strong presence.

    Here we often find there is a huge difference between page views that are "just" navigation through the content and visits where the visitor actually stops to engage with it.

    The key factor to add to the equation is time.

    To get an indication of this we normally recommend that our clients with information objectives apply a filter that excludes page views of less than 6 seconds (or more… depending).

    This helps exclude most of the navigational pages and brings focus to the content where the users actually engage with the page.

    The interesting bit is that this list of qualified page views is normally radically different from the standard list of top page views.
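    The filter described above can be sketched in a few lines, assuming an export of (page path, seconds on page) records from your analytics tool; the field names and the 6-second cutoff below are illustrative, not a specific product's API:

```python
# Sketch: separate "qualified" (engaged) page views from navigational ones,
# given a list of (page_path, seconds_on_page) records. The cutoff and
# field names are illustrative assumptions.
from collections import Counter

ENGAGEMENT_CUTOFF_SECONDS = 6

def top_pages(page_views, qualified_only=False, n=10):
    """Rank pages by view count, optionally keeping only engaged views."""
    counts = Counter(
        path for path, seconds in page_views
        if not qualified_only or seconds >= ENGAGEMENT_CUTOFF_SECONDS
    )
    return counts.most_common(n)

page_views = [
    ("/home", 2), ("/ministries", 3), ("/tax-guide", 45),
    ("/home", 1), ("/tax-guide", 90), ("/ministries", 4),
]

print(top_pages(page_views))                      # raw top pages
print(top_pages(page_views, qualified_only=True)) # engaged pages only
```

    Comparing the two lists side by side is what surfaces the "radically different" ranking the comment describes.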

  21. 22
    Gina Campanella says

    I worry that the inverse might also be true. If you have a high number of downloads for some PDFs and low for others – is that necessarily because of low interest? Or could it be the placement or title of the PDF that didn't help visitors find it or want to click? I worry about that with a lot of our content too. We don't have search, but we have a help center. The bounce rates might mean they found what they needed, or they didn't like the answer they saw, or, worse, they didn't think the layout of the help center was useful so they opted to call instead of searching for the info. Without adding the surveys, the analytics still leave us wondering what the user's experience was.

    Any thoughts on that?

  22. 23


    I suggest that you do some A/B testing with your critical PDFs or content. Try moving them, changing their titles, or changing the content and see if your data changes. Then you can determine if it's due to low interest or a bad title/placement. I suggest testing the current version and the new version at the same time, since timing can affect results. Then compare version A to version B.
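    Once both versions have run side by side, the comparison suggested above comes down to a standard two-proportion test. A minimal sketch, with illustrative numbers (the counts and variable names are assumptions, not real data):

```python
# Sketch: two-proportion z-test comparing download rates of version A
# (original title/placement) and version B (the new one).
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative: 120 downloads out of 2,400 views of A vs 165 of 2,500 for B.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2500)
print(f"z = {z:.2f}, p = {p:.4f}")
```

    A p-value below the usual 0.05 threshold would suggest the title/placement change, not random noise, moved the download rate.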

  23. 24

    Thanks for a great post, Avinash.

    A lot of people have already said a lot of things that I wanted to say, so I'll just stress the usability part.

    Government websites are known to be extremely difficult to navigate (thanks to how little usability testing goes into them), which goes back to the whole goal-setting exercise.

    Do you just want to disseminate information?
    Do you want to foster participation?
    Do you want to drive citizen governance?

    The clearer your goals, the easier it will be to get there.

    Cheers! Great post again!

  24. 25

    Steen: The challenge with page views, as you highlight in your comment, is that a heavy layer of our interpretation is required, and while I am confident you will get it right, I think most people will not. Hence my elevation of some qualitative measures above the quantitative. The idea is to reduce our interpretation and directly ask.

    But certainly as a sub measure your wonderful suggestion might make sense for lots of people.

    Gina: Your worry was the primary reason that the post starts by recommending qualitative analysis, primary purpose and task completion rate. With that in your back pocket you can get the context you need to make more sense of downloads – good or bad – and bounces.


  25. 26

    Great posting! I worked for an agency that developed the State of Colorado's website. And boy was that a tough one! While I wasn't directly involved I overheard many of the conversations with the client before and after the launch. Afterwards, the users reported back disappointment. Looking back, we should have done much more stringent usability testing before the launch.

    Here is an article you may find interesting, it reviews usability tools http://bit.ly/32mqlQ

    I am affiliated with usertesting.com but the article covers several tools.


  26. 27

    Hey Avinash,

    Here in the UK it is not quite as simple as all that (although that doesn't mean it shouldn't be). Our big three government portals are outsourced to third parties whose job it is not only to report what is going on, but also to report back to Government on whether they are doing a good job of running the website on its behalf.

    With that in mind not only do we report on the recommendations of the COI website (http://coi.gov.uk/guidance.php?page=188) on metrics, but also create our own agreement with controlling departments.

    Therefore we go one step above asking the people on the website what they think of it. We find out what our target audience is, hire a research company and go and ask a large percentage (well, about 1%) of them lots of questions. This gives us some more nice measures:

    Market penetration
    Propensity to reuse
    Propensity to recommend

    We (because we're a Business facing website) also ask them if we have saved them any money and how much. That's an amazing thing to go back with – the information, tools and transactions that we give the users on behalf of the Government saved Businesses £X.

    It also gives us the opportunity to ask the companies what is important to them at the time, what they see as areas of opportunity, why they don't want to use us, etc, etc – lots and lots of insight!

    We also have a nice metric which works the other way around and is about how satisfied the Government departments are with us given how we engage them in the projects (a bit like you asking Google if they're happy with your work!).

    And of course we do all those other things that you mentioned up there, in various different ways and to various different levels. The key is getting metrics focused, not too content focused.


  27. 28

    Alec: It looks like the UK government has done a marvelous job by outsourcing the creation, running, and measurement of success! Perhaps a model others can emulate.

    You are collecting a lot of data and reporting back a lot across a couple of hundred UK government sites. My minor thought, perhaps for others, would be to ensure they don't get caught in the data trap; rather, from the buffet of metrics, pick the three or four (max) that are the Critical Few and hammer each organization with those.

    It is so darn hard to move people from consuming data to doing something with it; focusing on the Critical Few, I have found, is a good way to do it – if the Critical Few are *custom* to each site's purpose (desired outcome – which hopefully matches up with why people come there!).

    I commend you on the awesome job you are doing there, impressive!


  28. 29

    Hi Avinash

    Love Q3 also. It could be interesting to colour the content bars by the percentage of the content that was visited and the percentage that wasn't. I will be using this, so thanks very much!

    Best wishes


  29. 30

    Let me echo the same concerns as Mel, as I run a public US gov website. Per privacy rules we can't store a persistent cookie, let alone a third-party persistent cookie. For some exceptional cases we can save our own domain cookie. My question is whether this functionality is ever coming to Urchin, which we can download and install on our own servers. Also, is it possible to have a hybrid solution specific to US gov clients to "relay" these cookie writes :-) OK, I will stop dreaming now.


  30. 31

    Sri: I am not sure what you mean by "this functionality". None of the analysis in this post used Unique Visitors, which means you can track it all without using cookies.

    But to answer your question about Urchin: yes, you can track all of the things in the post, and more, with Urchin and host it in-house inside your systems and firewalls. There are also log file parsers, some free, that will also get you some wonderful data without having to use cookies (they'll use available info in your logs like IP and user-agent strings, etc.).

    I do want to stress, perhaps again, that there is a lot of analysis you can do without using cookies. The metrics you can't compute are unique visitors and, say, new vs. returning visitors. The rest of your data is still fine: campaigns, keywords, leads, page views, etc.
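    The log-parser approach mentioned above can be sketched like this: hash the IP and user-agent string into a pseudo-visitor id, with no cookie set. This is explicitly an approximation (shared IPs undercount visitors, changing IPs overcount), and the log fields are illustrative:

```python
# Sketch: approximate, cookieless visitor identification from server logs
# by hashing IP + user-agent, in the style of log-file parsers. This is an
# approximation only; field values below are illustrative.
import hashlib

def visitor_id(ip, user_agent):
    """Derive a stable pseudo-visitor id without setting any cookie."""
    return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()[:16]

log_lines = [
    ("192.0.2.10", "Mozilla/5.0 (Windows NT 10.0)"),
    ("192.0.2.10", "Mozilla/5.0 (Windows NT 10.0)"),  # same visitor, repeat view
    ("198.51.100.7", "Mozilla/5.0 (Macintosh)"),
]

unique_visitors = {visitor_id(ip, ua) for ip, ua in log_lines}
print(len(unique_visitors))  # 2
```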

    I wish you the best.


  31. 32

    I would encourage anyone running any website, but particularly a government website, to monitor whatever feedback/support channels they have (email, phone, chat, etc.). If the number one piece of information people are emailing about is not on your website, and can be, put it there. Worse, but easier to fix, is if that information IS on your website and they're not finding it; fix how that content is indexed in your search, or test placing it somewhere else. Think about how rarely you email a site about not being able to find something. We usually just move on! If someone emails/calls after visiting the site, they _really_ want that info. -Michael

  32. 33

    Excellent post, as always.

    We're just starting to analyze our municipal website, and even in these early stages it seems we will try to connect what we do on the web with what is happening in the municipal call center.

    If we can take some load off the call center with better web content, and measure it as it happens, that would be something.

  33. 34

    Daniel: Here is a blog post on multichannel tracking that might have some ideas you can use to track the offline impact of your website:

    Multi-channel Analytics: Tracking Offline Conversions

    All the best!


  34. 35

    Where do you get the numbers for "you only need 300 responses in a month to get a statistically significant sample of data, and 1,200 if you want to do segmented analysis."?

    A statistics student here tells me that "this is a non-probability survey if we are trying to generalize results to all visitors to our site at any time, therefore none of our results are technically 'statistically significant'".

    Would appreciate your advice on this.
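    For reference, the 300 and 1,200 figures are consistent with the standard worst-case margin-of-error formula for a surveyed proportion at 95% confidence, n = z²·p·(1−p)/e² – though this does assume a simple random sample, which is exactly the assumption the statistics student is questioning. A quick sketch:

```python
# Sketch: margin of error for a surveyed proportion at 95% confidence,
# using the worst case p = 0.5. Assumes a simple random sample.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a sample of size n."""
    return z * sqrt(p * (1 - p) / n)

print(f"{margin_of_error(300):.1%}")   # roughly +/- 5.7%
print(f"{margin_of_error(1200):.1%}")  # roughly +/- 2.8%
```

    Quadrupling the sample (300 to 1,200) halves the margin of error, which is why segmented analysis – several smaller sub-samples – needs the larger total.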

  35. 36

    Hi Avinash,

    Excellent post, very helpful for my new role.

    I do have a few questions about this particular website, whose goal is to direct people to the right content even though it is on an external site.

    How do we track whether people found the right content on exit if they are exiting through a link to another website?

    Also, on Q5 you mention tracking the exit % of search terms, and of course on this site they are all exits, as they are outbound links – the site barely has any content of its own.

    I'm tracking outbound links in GA but cannot cross that report with the Search reports in GA.

    Any recommendations?

    Thanks in advance.

    • 37

      Adrian: If you are doing out-bound link tracking then that tells you where people are going.

      If you work with a GACP (list here: http://www.bit.ly) then you could possibly do more advanced (custom) analysis that would measure: I came to the site. I clicked on an outbound link. But I came back to the site in less than two minutes (because the site I went to was awful / did not answer my question / 404 / whatever). Clicked on another outbound link.

      Depending on how you do outbound link tracking (if you use "fake page views", clearly marked, it might work better in this case), you can even see this possibly in the GA User Flow report.

      Beyond that, you can implement the Google Consumer Surveys on your site and get additional qualitative data for people who do take a little while to find what they are looking for on your site.
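      The "came back in less than two minutes" analysis described above can be sketched as a pass over a visitor's time-ordered page views, assuming outbound clicks are recorded as virtual page views with a "/outbound/" prefix (the prefix, threshold, and data shape here are illustrative conventions, not a specific GA setup):

```python
# Sketch: flag "quick returns" after outbound clicks, assuming outbound
# links are tracked as virtual page views prefixed "/outbound/". The
# prefix, field names, and 120-second threshold are illustrative.
QUICK_RETURN_SECONDS = 120

def quick_returns(events):
    """events: time-ordered (timestamp_seconds, path) for one visitor.
    Returns outbound destinations the visitor bounced back from quickly."""
    flagged = []
    for (t1, path1), (t2, path2) in zip(events, events[1:]):
        is_outbound = path1.startswith("/outbound/")
        came_back = not path2.startswith("/outbound/")
        if is_outbound and came_back and (t2 - t1) <= QUICK_RETURN_SECONDS:
            flagged.append(path1)
    return flagged

session = [
    (0, "/search"),
    (30, "/outbound/ministry-a.example"),
    (90, "/search"),  # back within 60s: the destination likely disappointed
    (120, "/outbound/ministry-b.example"),
]
print(quick_returns(session))  # ['/outbound/ministry-a.example']
```

      Aggregated across visitors, a destination with a high quick-return rate is a strong signal that the link's target page is not answering the question that brought people there.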


  36. 38

    It's old but comes exactly when I need it.

    I think Q3 is the most important report for a content-centric site, and it is really mind-blowing for me.

    Thanks Avinash!

  37. 39

    Even 5 years later this post still resonates!

    I am currently working on a contract with Accenture and was just assigned to do some web analytics work for healthcare.gov. It blows my mind how awful the analytics are and how poorly the current team is doing at getting actionable insights from the data!

  38. 41
    David Folkerson says

    Hi Avinash,

    It would be amazing if you could go one step further with this post and suggest methods for quantifying the goal value of government websites. We have goals, we have KPIs, we have targets, but to calculate proper business outcomes it would be nice to be able to assign an actual value to performance, even though our activities are public services and not designed to earn $$. Any advice on this front, or recommended reading? How do we assign relative value if there is no revenue or sales component to our work?



    • 42

      David: You can use strategy #4 outlined here:

      ~ Excellent Analytics Tips #19: Identify Website Goal [Economic] Values

      When I've worked with various government agencies, I've identified what digital can help drive when it comes to the overall mission of the agency. Sometimes it is recruiting new folks, other times it is shifting the brand perception, at other times it is reducing wait times, and on other occasions still, it is measuring the impact on the environment through digital initiatives. It takes a little bit of hard work to do this, but it is possible. You can hire a consultant who can help you.

      The only other thing you need is a little bit of patience as some of these things happen over time. So, going in, give yourself three months, six months to measure these complex things. During that time, you can definitely measure the normal Users, Page Views, Raw Micro-Outcomes etc.



  1. Tweets that mention Web Analytics Success Measurement For Government Websites | Occam's Razor by Avinash Kaushik -- Topsy.com says:

    […] This post was mentioned on Twitter by AllThingsM. AllThingsM said: Web Analytics Success Measurement For Government Websites http://bit.ly/KVUDy […]

  2. […]
    Building on the theme of this post and the one from this morning on social networking strategy, here’s a way to construct metrics to measure whether a college’s website is effective. The post focuses on governments, but is really about non-profit organizations, and it’s easy to see how this strategy could be applied to a college’s online presence (and a good way to measure whether the things described in the post below are working).

  3. […]
    Web Analytics Success Measurement For Government Websites – Occam's Razor by Avinash Kaushik

  4. Performance of a web site « Ljgww's Blog says:

    Interesting article about how to measure success of the web site and how to get answers on questions about site performance.


  5. […] Web Analytics Success Measurement For Government Websites | Occam's Razor by Avinash Kaushik (tags: webdesign government analytics) Socio-Encapsule this: […]

  6. […]
    This doesn’t have to be a big thing (I just offended all my content strategist friends). There’s no need for a public dialog to enhance these articles. Rather, invite users to complete a brief survey specifically about the content they’ve encountered, and make sure someone is tasked to analyze and act upon the results of the survey. We could start piloting this in a month (as long as we dutifully ignore the Paperwork Reduction Act and existing cookie policy). If we’re successful, we could add more articles.

    For a more in depth look at this approach, read Avinash Kaushik’s Web Analytics Success Measurement For Government Websites.

  7. […]
    To remedy this confusion, Avinash Kaushik tells us about the "content / page views" distribution. Unbeatable, in my view, for making clients understand that the race for more content is probably not the best solution. What you save in content production will let you put your energy into planning and measurement.

    The idea behind the "content / page views" distribution is to express, as percentages, the number of physical pages on your website alongside the page views for a given period.

  8. […]
    Number of views is great for little else other than bragging rights. It’s one of the “famous” metrics (web analytics guru Avinash Kaushik‘s term) that “are staring you in the face when you crack open any analytics tool” but “barely contain any insight.”

    Yep, for anyone in the content business, number of views is right up there with hall of famers number of page views and monthly unique visitors.

  9. […] but Avinash Kaushik is widely recognized as the leading authority. Be sure to read his article Web Analytics Success Measurement For Government Websites to learn about the challenge presented by a government web site. This entry was posted in […]

  10. […]
    Analytics is equally important for non-profit organizations and government websites. If you have a website, you have one for a reason. For non-profit organizations and government, the objectives may be different than for a business, but analytics are still important.
    I just re-read this blog posting about analysis of data for a government website, and wanted to share it:

  11. […]
    Likewise, when it comes to web analytics for public-sector sites, the indicators are poorly adapted to the needs and obligations of public service and focus on generic metrics, with a few honorable exceptions and methodological contributions, such as this (highly recommended) post by Avinash Kaushik. In other words, not knowing what we want makes anything seem valid, when in reality none of it is actually any good.

  12. […]
    Ever wonder what people are looking for on their government websites? Check this infographic out.

Add your Perspective