Web Analytics Technical Implementation Best Practices. (JavaScript Tags)

For some reason there is so much on this blog about the “business side” of things but not much technical stuff. So, to make up for that, this post covers some of the “technical” best practices around tracking with your web analytics tools, especially if you are using JavaScript tags.

Often the job of implementation is left to the friendly neighborhood IT person and your web analytics vendor. Yet there are numerous business decisions that need to be made during the course of implementation, many of which have huge data implications. Hence it is imperative that Web Analysts and Website Owners and Decision Makers are actively involved during the implementation process. It bears repeating that a successful web analytics implementation is not just the IT guy / Vendor’s job.

It is important to point out that this post is written with the business side user in mind and not the deep technical expert God (to those of you in that category perhaps the nine recommendations below will be rather obvious). My hope is to make our business brethren just a smidgen smarter about implementation issues (javascript tags or otherwise) and attempt to share with the technical amongst you a few critical decisions you should expect your business folk to make (and how perhaps sometimes technical sexiness can be a data collection hurdle!).

As always your individual vendor will be the best source for unique implementation guidelines. Please do not overlook that important resource, while keeping in mind that they don't know your site or your business; hence you'll have to bring that "expertise" to the table.

Summary: Here are some implementation best practices, in no particular order of importance….

    # 1: Tag all your pages.
    # 2: Tags go last (customers come first :).
    # 3: Tags should be inline.
    # 4: What’s your unique page definition?
    # 5: Use cookies intelligently (they are delicious).
    # 6: Javascript wrapped links might be an issue.
    # 7: Redirects, be aware of them.
    # 8: Validate data is being captured correctly.
    # 9: Don’t forget Flash, Flex, RIA, RSS, Videos etc.

Here are the best practices in all their gory details……

# 1: Tag all your pages.

    Seems fairly straightforward. :) You should tag all your pages simply because with javascript tags, more than with other methodologies, if your page is not tagged you have no data and you have no way of going back and finding it (short of looking in your web log files, which can be a non-trivial challenge).

    In the past I have recommended Web Link Validator from REL Software as something we have had a positive experience with. It can do a lot more than check for missing tags (see the website for all the features), so it is all around a good piece of software to have. It runs between $95 – $795. I am not affiliated in any way with this company and do not benefit in any way from recommending them.

    Best practice is to run this nice little program, or your own equivalent, once a week and send a report to your web development team with a list of pages missing the tags.
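    If you want to roll your own equivalent, the core of it is just scanning each page's HTML for your vendor's tag. A minimal sketch in JavaScript (the marker strings and the page list are hypothetical; substitute a string your own vendor's tag is guaranteed to contain, and feed it pages from your own crawler):

```javascript
// Hypothetical marker strings: substitute a string that your own
// vendor's tag actually contains.
const ANALYTICS_MARKERS = ['google-analytics.com/ga.js', '/s_code.js'];

// Does this page's HTML contain at least one known analytics tag?
function isPageTagged(html) {
  return ANALYTICS_MARKERS.some(function (marker) {
    return html.indexOf(marker) !== -1;
  });
}

// Given [{url, html}, ...] from your crawler, list the untagged URLs
// for the weekly report to the web development team.
function findUntaggedPages(pages) {
  return pages
    .filter(function (p) { return !isPageTagged(p.html); })
    .map(function (p) { return p.url; });
}
```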

# 2: Tags go last (customers come first :).

    In many web analytics implementations on the web you’ll see the tag right at the top, or in the header, or before the <body> tag. This is suboptimal. Your javascript tag should go as close to the </body> tag as possible. The simple reason is that the tag should be the last thing to load on the page. In case your analytics server is slow in responding, or has simply died (less likely), at least the webpage and the content will load quickly.

    Our websites are there for customers and secondarily for us to collect data from. : )

    Update: The only exception to the above rule is if your web analytics vendor has an async version of the JavaScript tag. If so, the tag will be placed at the bottom of the <head> section of your page. The async code loads, well, asynchronously, which means that it does not impact your user experience (even if the vendor's servers are dead). Another cool benefit is that the async code collects more data, more accurately.

    At the moment Google Analytics is the only vendor to provide this type of improved tracking. You can learn more about it here: What is the Google Analytics Asynchronous Tracking Code? If you use GA, run and switch to async now!
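    For reference, a placement sketch. The UA-XXXXX-X account id is a placeholder, and this is written from memory of the GA async snippet of this era; copy the exact snippet from your own Google Analytics account rather than from here:

```html
<head>
  <!-- ... meta tags, title, CSS ... -->
  <!-- The async tag goes at the bottom of the HEAD; it loads ga.js
       asynchronously, so a slow or dead analytics server does not
       block the rest of the page. -->
  <script type="text/javascript">
    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder account id
    _gaq.push(['_trackPageview']);
    (function() {
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ?
                'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();
  </script>
</head>
```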

# 3: Tags should be inline.

    This one often comes back to bite many implementations. Golden rule: javascript tags should be inline. They should not be placed in delightful places such as inside tables or frames and such things. Doing so will greatly impact your ability to collect data accurately. Enough said.

# 4: What’s your unique page definition?

    Increasingly websites are becoming dynamic in how they react to customers, how they personalize content, and how they “re-leverage” the same .html (or .jhtml or .asp or .jsp etc) page to do different things. What this means is that you can no longer rely on product-name.html to define a unique page identity.

    Javascript tags, and perhaps all other methods, collect the entire URL along with all the parameters in the query string. During implementation (and indeed as you evolve your site) you will have to make sure that you “teach” your web analytics tool which combination of file name and parameters identifies a page.

    For example for this blog, which is a static site, here is a random url:


    the .html simply identifies a unique page.

    But for the Best Buy website:


    it is olspage.jsp AND the parameter skuId that possibly define a unique page. If in this case you plonked a web analytics tool on the site without identifying what makes a page unique, you would obviously get wrong numbers.

    As a cute example of how hard this can be, here’s a real URL from a website; can you guess what identifies the unique page? :)
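    The “teach your tool” step boils down to a mapping from file name to the parameters that make it unique. A sketch (the olspage.jsp / skuId entry mirrors the Best Buy example above; everything else, including the exact parameter names, is illustrative):

```javascript
// Which query parameters make a page unique, per file name.
// This mapping is illustrative; build your own from your site's logic.
const IDENTIFYING_PARAMS = { 'olspage.jsp': ['skuId'] };

function uniquePageId(urlString) {
  const url = new URL(urlString);
  const file = url.pathname.split('/').pop();
  const keep = IDENTIFYING_PARAMS[file] || [];
  const parts = keep
    .filter(function (p) { return url.searchParams.has(p); })
    .map(function (p) { return p + '=' + url.searchParams.get(p); });
  // Parameters not in the mapping (session ids, tracking junk) are dropped.
  return url.pathname + (parts.length ? '?' + parts.join('&') : '');
}
```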


# 5: Use cookies intelligently (they are delicious).

    Use first party cookies as much as possible and not third party. There are a billion words about this topic so I’ll stop here. Ok one last time, use first party cookies! :)

    There are three types of information you will collect, source attributes, page attributes and “user” attributes (non-PII please).

    Source attributes: where do people come from (websites, campaigns, search engines etc etc).

    Page attributes: what do they see, how often, where, from where to where, page grouping in all your content etc.

    User attributes: who is this “person” (persistent anonymous id’s, has a login or not, are they in a test, and more).

    Usually, stress usually, Source and Page attributes are best captured via URLs and parameters. User attributes, again stress non-PII (please be careful about this and disclose explicitly in your Privacy Policy what you collect), are best stored in cookies.

    User attributes, those identifiers (non-PII) that are persistent across sessions, are best stored in cookies. They will stay in the browser and can easily be read by your tags without having to stuff your URLs and make them fat.

    Sometimes User Attributes, like an anonymous cookie value or, say, your Times Select login on the nytimes.com website, tend to be held on the server after session initiation. If this is the case then please be aware that your javascript tags are blind to that data.

    One last thing on cookies: please be aware that IE 6 limits the number of cookies to 20 per domain. After that it starts blowing away your first cookie, and then the next, etc. Not nice. There are ways to get around this, like consolidating cookies or using sub domains. Please check how many cookies you are setting in total from all the things that are on your website and work with your developers to address the issues, if you have any.
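    To see where you stand, you can count the cookies your pages set from the browser itself. A sketch (the helper just parses the document.cookie string; run the commented line in your browser's console on your own site):

```javascript
// document.cookie is one "name=value; name2=value2; ..." string;
// count the non-empty entries.
function countCookies(cookieString) {
  if (!cookieString) return 0;
  return cookieString.split(';').filter(function (c) {
    return c.trim().length > 0;
  }).length;
}

// In a browser console on your own site:
// countCookies(document.cookie)
```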

# 6: Javascript wrapped links might be an issue.

    There are often links on the website that are wrapped in javascript. Usually pop-ups, but could be for other purposes. For example this one:

    javascript:var x=window.open('http://qqq.links.ppp.com/pages/prices.asp?')

    Be aware that if you are going to be using reports like Site Overlay (Click Density), these links might not show the number of clicks in that report due to the javascript “wrapper”. This is not an issue with all vendors, but with enough of them that you should be aware of it.

    The recommendation is to use javascript “wrappers” on links only when you absolutely need them. Remember this is not just a problem for web analytics but also for search engine Robots / Spiders. They don’t follow javascript links (or execute javascript) so they will also not reach or index that valuable piece of content you have wrapped in javascript (bad for SEO).

# 7: Redirects, be aware of them. 

    Redirects are nifty little things: they can direct traffic gracefully in case links change, they let your SEM / Ad / Whatever agency track data, or, in the good old days of web logs, they let you capture clicks when you send visitors off to other websites (domains). But redirects can also mess up your web analytics data collection in a big way if not done right (and, like above, they will possibly also mess up the quality of your indexing by search robots). Let’s cover two instances to outline the data collection challenge:

    If you have “internal redirects” (redirects that simply lead from one page of your site to another) then this can be suboptimal. Look at this link from microsoft.com…..

    http://g.msn.com/mh_mshp/98765?09769308&http://www.microsoft.com/downloads/search.aspx&&HL=Downloads&CM=Navigation&CE=Resources

    This goes from the home page to a subdirectory on the same site. I am not sure if MSFT is doing this to overcome challenges with their web analytics tool, but it is quite unnecessary. It is an extra hop for the data and it can cause an issue: you have to make your tool smarter, so that it knows people are not going from the home page, www.microsoft.com, to g.msn.com, but rather to www.microsoft.com/downloads. That is logic you have to maintain over time (which can get complex as the site changes), you also have to capture and store extra data, and this can potentially cause problems when you are deep into segmenting the data.

    It is important to stress that it is quite likely that microsoft.com is doing this because they have figured all this out and it works for them. The core point is that with this illustration you should be aware of the complexity it can pose in measuring and you should go into it with your eyes open (and support of your website IT folks who might have implemented these things).

    Another instance of using redirects is while linking to other websites, outside your own….

    http://www.removed.eliminated.com/Navigate.asp?Category=238

     A final example of using redirects is as a part of campaigns (banner ads, search marketing, affiliates etc).


    goes to:


    ends up at:


    So one click causes two hops (where data is collected by someone else outside your company at each hop) and the customer ends up at your site. Does your web analytics application have any knowledge that the user came from an Overture ad? If this was not proactively thought through the answer is usually No.

    There are two important things to consider here to ensure you can report and analyze data accurately:

    1) Work with your agency (or internal resource) to ensure that at least one parameter gets passed from one hop to the next, all the way to you, so that you can track the campaign accurately. It could be the parameter “sourceid” above.

    2) Please ensure that everyone is using 301 Permanent Redirects where possible. This will help ensure that the original referrer is passed on to your website and your web analytics tool (else your referrers report, your search engine and keywords report, and a bunch of others will be completely wrong).
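    To make both recommendations concrete, here is a sketch of the logic a redirector should implement: respond with a 301 and carry the campaign parameter through to the next hop. The URLs and the “sourceid” parameter name are illustrative, standing in for the removed examples above:

```javascript
// Compute the 301 redirect target, carrying the campaign parameter
// from the incoming request through to the destination.
function buildRedirect(requestUrl, destination, campaignParam) {
  const incoming = new URL(requestUrl);
  const dest = new URL(destination);
  const value = incoming.searchParams.get(campaignParam);
  if (value !== null) {
    dest.searchParams.set(campaignParam, value); // pass the id to the next hop
  }
  // 301 (permanent) rather than 302, so browsers and search engines
  // treat the destination as canonical and referrer data survives.
  return { status: 301, location: dest.toString() };
}
```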

# 8: Validate data is being captured correctly.

    Some web analytics tools use one standard tag to collect data. Other vendors have custom tags all over the place (so your website could be tagged with 25 different tags on different pages, because your vendor needs lots of data to be placed in customised variables up front for post-data-collection analysis, or they are capturing various pieces of data like order / lead information).

    I won’t pontificate on which approach is better (there is no such thing, both have pros and cons). But it is important that you validate in QA and production that your 25 customised tags are each capturing exactly what they are supposed to.

    I know that Omniture has a nifty utility that you can use to validate and review that data is being collected by Omniture tags as it should be. This is really nice and helpful and I do like it very much. Please ask your vendor if they have something like this (and they probably do).

    It is recommended that you check your tags and data collection once a month to validate that normal site releases have not messed something up.
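    If you prefer to script part of this check, the heart of any tag debugger is simply decoding the query string of the beacon request your tag fires. A sketch (the beacon URL in the comment is made up; capture real ones with a proxy or your browser's network tools):

```javascript
// Decode a captured beacon URL into a name -> value table so each
// variable can be checked against your tagging spec.
function decodeBeacon(beaconUrl) {
  const url = new URL(beaconUrl);
  const out = {};
  for (const [key, value] of url.searchParams) {
    out[key] = value;
  }
  return out;
}
```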

# 9: Don’t forget Flash, Flex, RIA, RSS, Videos etc.

    “Normal” javascript tags won’t work with these rich media experiences. If you have lots of rich experiences you’ll have to have a completely different and deeper (and more painful) strategy to collect data. You will have to use custom tags, or standard tags in custom ways, or different data capture mechanisms like “event logs” to collect this data.

    I won’t go into lots of detail except to say that during implementation you should think way up front in terms of tracking rich media. For some recommendations on how you can track and what to track please check out the podcasts I recently did with the Web Analytics Association.

    Tracking rich web experiences requires a lot of deliberate planning and implementation up front, before anything gets released, to ensure that via your web analytics tool or via a custom solution you are able to track some semblance of success.
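    In spirit, “event logs” just means the player calls a tracking function at meaningful moments instead of relying on page loads. A sketch (trackMediaEvent and the queue are hypothetical; your vendor's actual event call, for example _trackEvent in Google Analytics of this era, goes where the comment indicates):

```javascript
// Queue of rich-media events; in production each one would also be
// sent to your vendor's event-tracking call.
const eventQueue = [];

function trackMediaEvent(category, action, label) {
  eventQueue.push({ category: category, action: action, label: label });
  // e.g. for Google Analytics of this era:
  // _gaq.push(['_trackEvent', category, action, label]);
}

// A video player might call:
// trackMediaEvent('Video', 'play', 'homepage-promo');
// trackMediaEvent('Video', '75-percent-complete', 'homepage-promo');
```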

I am positive that all the technical complexity in this section causes heads to hurt, especially if you are on the business side. It is complicated, it seems really hard and in your spiffy vendor presentations many of these things were not highlighted. But there is simply no other way of ensuring that you are collecting data accurately other than looking at each of these items (and others that you might be doing that are unique to your website) in a very deliberate manner and doing so up front.

It is important for you to personally understand these issues so that you can ask the technical folks (yours and your vendors) the right questions and after you get your first data check to make sure some of these things have been done right so that you can have confidence in the data.

It would be wonderful to get some of your own tips and best practices from your experiences. Please share your best practices, or just your war stories, via comments. Thanks.

[Like this post? For more posts like this please click here.]


  1. 1

    Nice post Avinash. Couple others I would add:

    1) Sort out before you implement whether you want to capture pages in your analytics tool by the value of or the page path.

    2) Consider whether you'll be organizing content groups by Path or by some other method (like Eric's Web Analytics Demystified example of site goals).

  2. 2

    Two great tips from the Digitari Blog:

    Avinash, references the debugger tool from Omniture which can be used to see what information a particular page is sending to SiteCatalyst.

    To check the integrity of our implementations, we also use a tool called “IE Watch Professional” which functions much the same as the Omniture debugger but is analytics vendor agnostic; it exposes page status (404 errors) and more. There is a tab called Query String under HTTP Analysis which displays all of the information being sent back to WSS, Omniture etc. in a neat table.

    # 2

    Another challenge Avinash mentions is links that use javascript.
    One method I like, and which also benefits SEO, is executing the javascript and returning false before the href value is called.
    For example (you can disregard the line breaks):


    onclick="winPop('http://www.digitaria.com');return false;"

    A link like this allows javascript enabled browsers to open the digitaria site in a popup window. Browsers without javascript will be directed to the digitaria site in their current window. (Search engines don’t have javascript – and it looks like a normal link to analytics solutions). Adding return false at the end of the javascript link prevents the visitor from seeing the digitaria site in both the popup AND in their current browser window.




  3. 3

    For item #8, there are various techniques to check if the tags are correct. Fiddler and IEWatch are great on MSIE and there are a number of extensions for Firefox.

    I've been working on WASP, the Web Analytics Solution Profiler, a Firefox extension aimed at web analytics implementation specialists, web analysts and savvy web surfers who want to understand how their behavior is being analyzed.

    Still in early beta, I'm close to releasing a new version that fixes a couple of minor issues and offers new features based on the feedback I got.

    Stephane Hamel

  4. 4
    Chad Parizman says

    # 8: Validate data is being captured correctly.

    We've tested a lot of debugging tools and found Charles (http://www.xk72.com/charles/) to be the best product to date. It groups tags by domain to easily see what exactly is being called on the page. One of the best parts is that I've successfully been able to run it off a USB thumbdrive, so I can debug on any machine.

  5. 5

    Hi Avinash, great post as always ;)

    Allow me to provide extra feedback on item #2:
    apart from script loading considerations, the reason why you'd want to put your script call(s) as close to </BODY> as possible is because most Javascript-based metrics tools work in Web beacon mode and return a pixel which, as transparent as it may be, can and will mess up your page layout if placed anywhere else ;-)



  6. 6

    Hi Avinash. I am probably the demographic this post is written for — a marketing person who knows just enough about the technology to be dangerous. (So thank you, and with your help, I will be less dangerous, and certainly my own blog works better because of you.)

    Anyway, what do you and Julien think of the onclick problem that Google Analytics poses? In order to tag onclick events or much of the stuff you note in #9, the main part of the analytics has to be called first. So if the click happens high on the page, the urchintracker code probably needs to be right after the tag.


  7. 7

    Sorry, I forgot that your blog would execute the html. Anyway, the urchintracker code probably needs to be right after the open body tag.

  8. 8

    Robbin, that's a non-issue. Look at the example code below:
    (hope this comes out right)



    <html>
    <head>
    <!-- header stuff goes here: META tags, title, JS, CSS, RSS calls, you name it -->
    </head>
    <body>
    <a href="javascript:;" onClick="foo('stuff');">Clicky</a>
    <!-- Your content here -->
    <script type="text/javascript">
    function foo(bar){
    alert('omg the code for this behavior is declared in the footer\n\rand yet it works upper in the page!');
    }
    </script>
    </body>
    </html>





  9. 9

    Bleh, indents got block-quoted :P

    Oh well. I rest my case ;-)

  10. 10

    I don't think so but I know you guys at OX2 are incredibly sophisticated so I won't pretend to know something that you don't. (And like I said, I'm just technical enough to be dangerous.) I mostly can pull up the Google Analytics guide and quote it at you:

    Important: if your pages include a call to urchinTracker(), utmLinker(), utmSetTrans(), or utmLinkPost(), your Analytics tracking code must be placed in your HTML code above any of these calls. In these cases the tracking code can be placed anywhere between the opening body tag and the JavaScript call.

    I went through your example to see if I could find a special way that you got around the problem, but didn't see it. Tell me I'm wrong –


  11. 11

    Robbin, in this case you need to include the JS code earlier in the page because variables initialized in the JS include are then modified by subsequent calls to _utmFunctionNameHere functions.
    Only in this case does your tracking code need to be placed higher up in the code, most preferably within the HEAD tag, IMHO.

  12. 12

    Excellent post. I find that proper implementation is critical with web analytics. All too often we have users who have botched the installation process in one way or another (usually involving PHP or other sites) instead of taking it as a step by step approach.

  13. 13

    Here are some points to consider: –

    1. Use the TRACKING CODES appropriately in your Marketing Data Feeds. **VERY VERY IMP** (Keep in mind the % of MARKETING BUDGET as a part of SALES.)

    2. Send regular Data Feeds from your CRM system to your Analytical system so that you can view the TRAFFIC data for multiple Customer Segments. This strategy works for B2B sites.

    3. Send regular DATA FEEDS from your Order Management system to your Analytical Solution so that you can club your OFFLINE sales with your ONLINE COMMERCE data and enjoy the consolidated view of your business.

    4. If your company sells products on EBAY or OVERSTOCK then schedule a daily feed for your ANALYTICAL solution.

    5. PAGE NAMEs are the most important piece of information on the site and most companies don't use them consistently, so be cautious in this regard.

    6. Keep a special eye on the FALLOUT REPORT and the SINGLE PAGE VISIT report because the solution to most analytical problems lies in them.

    I can write 100 more points, but the above are some of the important ones.

  14. 14

    Hi Avinash,

    Regarding Item # 5:

    Use cookies intelligently (they are delicious)


    One last thing on cookies, please be aware that IE 6 limits the number of cookies to 20 per domain.

    I believe this is a browser issue, not just an IE issue. So the 20 cookie limit also applies to Firefox and, I would assume, Safari.


  15. 15

    When you refer to the nifty tool that Omniture has to validate tags are you referring to the Debugger?

    – – – – – –

    Avinash: Yes Zack. Please check with Omniture's Tech Support.

  16. 16

    Have you heard of anyone using the SiteCatalyst debugger like a sitemap to check an entire site instead of just one page at a time? In theory the debugger tool could be used in this fashion.
    I was also trying to customize a Visio web sitemap to capture page tag info but it will only return the meta tag titles and URLs.

  17. 17

    Hello Avinash,

    I've just read this section in your book and came to your blog to see if you had anything more about the issue of redirects. As you know, a large percentage of the traffic that arrives on your website comes with the referrer field blank. I know that there are many reasons behind this (bookmarks, server issues, etc.), but one of them is faulty redirects.

    Are there any ways at dealing with these to glean any useful information, to help segment the data better?

    The book is great btw!

  18. 18


    Related to recommendation #2, is bottom placement (near the tag) still a best practice? GA is currently recommending placement "just before the closing tag". Has asynchronous tracking changed the game in this case? GWO recommends similar placement.

    To my knowledge other enterprise solutions are asynchronous as well and I wonder how that impacts those implementations as well.

    Thanks for any thoughts you might lend.


  19. 19

    Nate: You are absolutely right. If people use GA they should immediately switch to async and the code does go on top of the page (in the HEAD) rather than at the end. Because it works asynchronously this means no impact on the customer or site experience, AND you get more data more accurately.

    I've also updated the article, thanks for highlighting this.


  20. 20
    Sara De Ceulaer says


    Very interesting all this, I still have one question though:
    For SEO purposes it is recommended for a website with www and non-www domain name to choose a preferred one and redirect the other to the preferred with a 301 redirect. We usually do this through IIS, not with a javascript redirect.

    How does this show up in the analytics, how do we have to handle this?

    Thanks in advance,

  21. 22


    Quick question: I just realised that an old blog of mine still uses the old standard version of GA tracking, so I guess I should upgrade to asynchronous tracking (as recommended in my Market Motive course). Question is, will I lose my old data when I switch over? Should I add the new script and just delete the old one? What is the best method to transition the change?

    Sorry for the noob question.



    • 23

      James: Switching from the old Google Analytics tag to the new (and much awesomer) async tag will not have any impact on your old data. That will all stay there and work just fine.

      The benefit of the Google Analytics async tag is that you'll collect more data, more accurately and you'll be able to use new features in GA that you can't with the old tag.


  22. 24

    Hey there,

    I came across this post as I'm looking for best practices on how a briefing (call it a Tagplan, for instance) for a developer who must place the tags into the code should look.

    For instance, using an excel doc which lists all the objects of the application which should be tracked and then allocate all the Site Catalyst eVars, props, events and so on could be one way to brief the developer.
    However, in my experience the form of the briefing is highly dependent on the tool used and on the framework parameters of the project context.

    What are your experiences with briefing technical colleagues who are in charge of implementing all the tags you have defined, so that your reporting requirements can be fulfilled?

    Best regards,


    • 25

      Seb: Increasingly the use of various tag management solutions makes this process significantly easier. Some of them include a process where the "normal people" (you!) can submit the requirements, they can then be QA'ed by the IT team, and if it passes all the checks it can be deployed.

      Check them out, some of them are:



      These solutions, and others, do a lot more than just manage the tags. You'll learn more on the site.


  23. 26

    We are about to roll out Adobe Analytics to replace the current Webtrends Implementation we have in place, as we are going to run CQ5, Target (new test and target), AA to allow better integration, we are also then augmenting this with VoC (Opinion lab or Foresee) as well as Click Tale. I was wondering if you had a quick guide to best practices when doing this sort of exercise.

    My boss is happy to let me lead the delivery to give me the experience. We are looking to run GA, WT and AA at the same time to triangulate; I am aware that each tool collects differently etc, but the reason we wanted to do this is to map trends and try to pick out any errors in our AA implementation.

    The directors want to empower our team to drive cultural changes in behavior moving forward using test on launch and constantly testing to build better products and experiences.

    However we will come up against heavy resistance as this will shift budget from other teams, therefore I want to make sure I don't mess up and that I deliver a brilliant implementation.

    Any help or pointers to videos other posts etc would be great.

    • 27

      Roman: What you are attempting to do is incredibly complex (and pieces of what you are planning I would aggressively encourage you to drop completely). I would recommend that you hire an external consulting company who can do this right the first time. Usually with these types of initiatives it takes 18 months to implement; in month 15 you realize 30% of it was wrong and 50% of the needs have changed. That leads to another 14 months of implementation. During all this time the business got nothing. :)

      And it does not matter if you are using Adobe Analytics or any other tool. My recommendation above stands. Reach out to the vendor, they'll recommend a ton of independent consultants you can pick from.

      I'll recommend two posts…

      Here's a post that might help in the triangulation you are planning to do:

      + The Ultimate Web Analytics Data Reconciliation Checklist

      I also recommend that you check out my DC – DR – DA framework and ensure that your macro strategy is solving for the recommendations in the post:

      + Web Analytics Consulting: A Simple Framework For Smarter Decisions


  24. 28

    Thanks a lot avinash.

    This is a really nice article for everyone who wants to understand the role of "implementation" in the web analytics industry.

    From my point of view, web analytics implementation is one of the most desirable areas of learning for all professionals.


  1. GoBloggit says:

    Web Analytics Technical Implementation Best Practices. (JavaScript ……

    Open-source products tend to be driven by what features the user base is actually asking for vs. those being determined by a commercial software company's profit motives. Another open-source advantage is freedom from vendor dependency, the seemi…

  2. JavaScript advice…

    There's a good post over on Avinash's blog with 9 tips for getting JavaScript tagging right on your website. There's not much for me to add except to say that you should click the link above and read his advice;…

  3. […] Web Analytics Technical Implementation Best Practices. (JavaScript Tags) » Occam’s Razor by Avinash Kaushik (tags: analytics webanalytics web article blog dev internet **) […]

  4. […] First, take a look at this quick list of implementation tips from our good friend Avinash. That post is almost 2 years old, but I think it still holds true today. If you’re implementing a proprietary tool like Omniture or WebTrends, make sure to talk to your vendor about your business and web site goals before implementation. […]

  5. […] That link leads nowhere when javascript is turned off. Now, most people do have javascript turned on, but search engines do not. That is why many sites with javascript menus can forget about a high ranking in search engines from the outset; there is little for a search engine to index. Besides that, there are always people who do not have javascript turned on, for example for corporate security reasons (about 5%). And it can even be a problem for your web statistics package. In short, make sure every link also leads somewhere without javascript. […]

  6. […]
    If this makes little sense, or if your thoughts are around "what’s the point?", go read a) Avinash’s technical implementation post (which by its breadth shows the pain in deployment) and b) Ian’s post about whence the universal tag? (which provides other great viewpoints on how to solve the current pain).

  7. […] Digital Marketing and Analytics Blog: Web Analytics Technical Implementation Best Practices (JavaScript Tags) by Avinash Kaushik https://www.kaushik.net/avinash/web-analytics-technical-implementation-best-practices-javascript-tags… […]

Add your Perspective