Implementing web analytics is a bit like buying a car. If you do your research and planning before you go to the dealer, you will walk away with what you need at the price you want to pay. Otherwise, you may come home with something you didn't want, and then some. For setting up web analytics correctly, preparation is just as important as the implementation itself.
Check to see if you are implementing these three key steps to get your web analytics right:
- Finding the right resources to set up web analytics
- Creating a plan for web analytics implementation
- Designing and implementing the web analytics solution
Now that we’ve gone over what you need, let’s take a look in detail at the best practices for each of these areas.
Identifying the Right Resources
Setting up web analytics is as much a business task as it is a technical one. The person setting up analytics needs to know technical analytics implementation so that data is accurately recorded and measured, and also needs to understand how the data will be used. Without the knowledge and ability to do both, that person will be ineffective at setting up web analytics. Having a single person implement web analytics is therefore rare, because of the mix of technical and business skills the job requires. This is also why small companies struggle to use web analytics effectively for business performance measurement: it isn't set up right to begin with.
So the ideal team for setting up web analytics includes a developer, a technical analyst, a business analyst, and a business owner or product manager. Here is how we demarcate the roles of each resource:
Product Manager/Business Owner: This person is responsible for identifying areas on the site that need to be tracked. We are not talking about simplistic page-level tracking; instead, think about a complex interaction such as signing up for a subscription on a site. The product manager needs to determine which visitor actions are worth measuring. For instance, is it worth knowing whether the person is a member or a visitor? Or which group of courses gets the most attention?
Business and Technical Analysts: Depending on the size of the organization, the business analyst and technical analyst roles may or may not need to be separate. If many business owners or product managers have input into what needs to be measured, then a dedicated business analyst will need to gather their requirements and consolidate them into a single set. For instance, one business owner might want to apply a monetary valuation to site actions while another might want to keep things simple; a business analyst can reconcile such requests across the organization. A business analyst has a strong handle on the inner workings of the business.
A technical analyst, on the other hand, is focused on converting the business-level requirements into a technical specification. Such a specification may include the naming conventions to use, recommendations for how on-page variables may need to be re-coded or translated, and details of how the implementation will need to work in concert with other tag-based services. In an ideal world, the technical analyst and business analyst work very closely together to define the web analytics solution.
So, setting aside the politics that may exist in your organization, first determine whether the right set of resources is in place to design and implement your web analytics solution. Only then proceed to the planning phase.
The Planning Phase
Once the right resources are in place for the web analytics implementation, the next step is to gather requirements and set the scope. Here are some of the key steps involved.
Interview Process: The first step in planning a web analytics implementation is the interview process. Go in with a script of questions. What you are looking for are business problems or challenges that need to be addressed. Avoid talking in terms of the web analytics solution itself; all questions should be agnostic of the technical solution. Questions should be of the following nature:
- What are your business unit’s/project’s/position’s goals, and how do you measure them?
- What are the actions you want to take using the information from web analytics?
- Do you have changes in mind that you want to get information to support/counter? What are they?
Naming Conventions: Just as an organization develops a business vocabulary that people who have worked there can understand, a web analytics implementation needs a vocabulary that every user of web analytics can understand and appreciate. Whether this means abbreviated words or full-length words for variables, there needs to be consistency in naming so that analysts can understand the scope of a dimension or metric without having to ask.
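To make a convention enforceable rather than aspirational, it helps to encode it as a simple check that can run against the list of defined variables. The sketch below assumes a hypothetical convention in which a dimension name starts with an agreed scope prefix (`pg_` for page, `usr_` for user, `evt_` for event) followed by lowercase snake_case; your organization's rules will differ.

```typescript
// Hypothetical naming convention: a dimension name must start with an
// agreed scope prefix and use lowercase snake_case after it.
const SCOPE_PREFIXES = ["pg_", "usr_", "evt_"]; // page, user, event scope (illustrative)

function isValidDimensionName(name: string): boolean {
  // Accept the name only if it carries one of the agreed prefixes and the
  // remainder is lowercase snake_case.
  return SCOPE_PREFIXES.some(
    (prefix) =>
      name.startsWith(prefix) &&
      /^[a-z][a-z0-9]*(_[a-z0-9]+)*$/.test(name.slice(prefix.length))
  );
}
```

Running such a check as part of a release review catches drift before analysts ever see an inconsistently named dimension.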
Building for Level of Skill: The extent to which analytics will be used depends on the level of expertise available within the organization. Care should be taken to design a setup that can be understood and used by the organization's analysts. If the setup is too complicated, it will at best be ignored and at worst be misinterpreted. For example, we see eVars and identical sProps defined all the time in Adobe Analytics implementations. While that in itself is a red flag, what's worse is that analysts then use these eVars and sProps interchangeably.
Setting up analytics should be done in a phased manner so the complexity increases as the skill level of analysts increases.
The Design & Implementation Phase
For web analytics implementations beyond the basic page tracking, there are several aspects to be considered. These range from understanding data sources to creating custom variables to meet business needs. Let’s take a closer look.
Data Gathering: Data can be fed into analytics packages in multiple ways. The most common way is through data elements in the analytics tag container itself. But there are other ways that data can be stored and passed into analytics. Here are the various ways:
1. Hard-coded variable values within the tags – typically used to pass static page-level information such as the page name, category of product, name of product, etc.
2. Query string parameters in URLs – used to send channel and campaign information as well as other dynamic pieces of information
3. Variables stored in the data layer of a tag manager – provides a highly flexible and versatile repository for information
4. Browser cookies – used to store user-level and session-level information in addition to event-level information
5. API feeds from other systems – used to send third-party information typically not available on web pages
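As an illustration of the data layer approach, here is a minimal sketch of pushing page-level information into a tag manager's data layer. The `dataLayer` array and `event` key mirror the common Google Tag Manager pattern, but the specific property names (`pageName`, `productCategory`) are assumptions for illustration; check your tag manager's documentation for the exact contract.

```typescript
// Minimal data layer sketch (Google Tag Manager style). In a browser this
// would be window.dataLayer; here it is a plain array so the sketch is
// self-contained.
type DataLayerEvent = Record<string, string | number>;

const dataLayer: DataLayerEvent[] = [];

// Push static page-level information. Values that were once hard-coded in
// tags (method 1) often move here once a tag manager is in place.
dataLayer.push({
  event: "pageView",          // hypothetical event name
  pageName: "product-detail", // page identifier used in reports
  productCategory: "courses", // example business dimension
});
```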
The level of data persistence generally increases from method 1 through method 4. So, as a rule of thumb, pass the least persistent elements via method 1 and the most persistent via method 4. It is not uncommon to find data elements passed inefficiently because of legacy decisions made for non-analytics purposes. For example, the session id and related information may be passed via a query string parameter because of browser and tab compatibility issues in the past. When designing an analytics setup, care should be taken to understand how the data came into being and why it is surfaced the way it is.
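Campaign information carried in the URL (method 2 above) can be read with the standard WHATWG URL API, available in both browsers and Node.js. The `utm_*` parameter names below follow the widely used convention; treat them as assumptions if your campaigns use different names.

```typescript
// Extract campaign parameters from a landing-page URL using the standard
// URL API; absent parameters come back as undefined.
function campaignInfo(url: string): {
  source?: string;
  medium?: string;
  campaign?: string;
} {
  const params = new URL(url).searchParams;
  return {
    source: params.get("utm_source") ?? undefined,
    medium: params.get("utm_medium") ?? undefined,
    campaign: params.get("utm_campaign") ?? undefined,
  };
}
```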
If using a tag manager, the preferred way of implementation is to use the data layer provided within the tag manager to store information. Once information is stored in a single repository, debugging of that data in the future becomes very easy.
Getting accurate, consistent data is the hardest part of a web analytics setup. It frequently requires negotiation with business as well as technology teams, and inefficient decisions get made in the interest of meeting project timelines. A business or technical analyst should document such instances and champion fixing them so the analytics setup becomes easier to maintain and customize in the future.
In the real world, budget and time constraints often drive analytics project teams to override data values present on web pages so they make sense in the analytics tool. For example, if the name of a product is not carried consistently from the product detail page all the way into checkout, the analytics tool will not be able to associate the flow of information with that product. When such instances occur, care should be taken to think through the best approach to the problem. The ideal solution is frequently to address it at the source, but that isn't always possible. Technical analysts then need to design a customization strategy at the various levels so that the data overrides are easy to maintain and debug in the future. Remember, every customization or override represents a point of failure that will be triggered if the underlying data changes. For instance, if an override is designed to change the name of a product when the product id is 999 and a business analyst changes that product id manually, the override will fail and the analytics data will be incorrect.
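A data override like the product-id example above can be sketched as a small lookup table applied just before a hit is sent. Everything here (the id `999`, the field names) is illustrative; the design point is that overrides live in one centralized place, so they are easy to find, maintain, and eventually remove.

```typescript
// Hypothetical override table: product id -> corrected product name.
// If the underlying ids ever change, this table is the single point of
// failure to check (and the single place to fix).
const PRODUCT_NAME_OVERRIDES: Record<string, string> = {
  "999": "Annual Subscription", // illustrative correction
};

interface Hit {
  productId: string;
  productName: string;
}

// Return a corrected copy of the hit, leaving unmatched hits untouched.
function applyOverrides(hit: Hit): Hit {
  const corrected = PRODUCT_NAME_OVERRIDES[hit.productId];
  return corrected ? { ...hit, productName: corrected } : hit;
}
```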
Data overrides can be done in data layers, in browser cookies, or in the analytics tool itself using filters or other mechanisms. Mindful use of these options results in an easy-to-manage analytics setup. Haphazard web analytics implementations cost more in lost data and corrective measures than those that are well thought through.
All enterprise-level analytics tools such as Google Analytics, Adobe Analytics, and Piwik provide ways to make performance measurement easier through dimensions, segments, custom metrics, eVars, sProps, etc. Depending on the tool being used, there are limits on the number of custom dimensions, events, etc. available for use. So care needs to be taken to ensure that the most common and important needs of business owners are addressed first through such custom variables.
In addition, some tools provide feedback on how frequently a custom dimension or metric is used. By monitoring such reports, web analytics administrators can effectively manage an organization's needs within the allowed quota of custom variables.
Once implemented, a web analytics setup needs to be actively managed. Here are a few best practices we recommend:
Annotation: No web analytics analysis can be complete without context. Frequently, a majority of ad-hoc analytics time is spent finding the external or internal events that caused analytics performance to change significantly. By implementing scalable annotation techniques, web analytics administrators can channel their time into more value-add activities. For more on this topic, see our article on web analytics annotations.
Documentation: Just as good source code is well commented, a good analytics setup has robust documentation associated with it. Documentation should include definitions of custom dimensions, custom metrics, data overrides, etc. And documentation is only as good as it is maintained, so web analytics administrators should take care to keep implementation documents regularly updated.
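One lightweight way to keep such documentation maintainable is to store it as data alongside the implementation, so it can be versioned, reviewed, and looked up programmatically. The structure below is a hypothetical sketch, not a format any analytics tool prescribes; the slot name, dimension name, and date are all illustrative.

```typescript
// Hypothetical machine-readable documentation for custom dimensions.
interface DimensionDoc {
  slot: string;         // the tool's dimension slot (e.g. an eVar number)
  name: string;         // agreed name per the naming convention
  definition: string;   // plain-language meaning for analysts
  lastReviewed: string; // ISO date of the last documentation review
}

const dimensionDocs: DimensionDoc[] = [
  {
    slot: "dimension1",
    name: "pg_product_category",
    definition: "Category of the product shown on the page.",
    lastReviewed: "2024-01-15",
  },
];

// Look up the documentation entry for a given dimension name.
function docFor(name: string): DimensionDoc | undefined {
  return dimensionDocs.find((d) => d.name === name);
}
```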
Constant Curation: A web analytics implementation needs to keep pace with changes on the website as well as with the changing priorities of the business. Whether the organization has a formal release cycle or implements changes in a 'rolling' fashion, web analytics administrators should collaborate with development and business teams to know how the web analytics setup needs to change on a regular basis.
Auditing: The organization's website is a constantly evolving entity, with changes being made to content, code, or even other marketing tags. It is not uncommon for such changes to impact analytics tracking of a site. Consequently, web analytics administrators are well served by regularly auditing the core and most important aspects of the web analytics implementation.
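A basic audit can be as simple as asserting that every captured hit carries the keys the implementation promises, and reporting which ones are missing. The required-key list below is an assumption for illustration; in practice it would be derived from the implementation documentation.

```typescript
// Audit a captured analytics payload for required keys. Returns the list
// of missing or empty keys so an audit report can name them explicitly.
const REQUIRED_KEYS = ["pageName", "productCategory"]; // illustrative

function missingKeys(payload: Record<string, unknown>): string[] {
  return REQUIRED_KEYS.filter(
    (key) => !(key in payload) || payload[key] === ""
  );
}
```

Run against a sample of real hits after each release, a check like this catches silently broken tracking before a month of bad data accumulates.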
A web analytics implementation should be treated like any other well-run technology or business project, with careful planning, design, collaboration, implementation, and testing. Once implemented, the setup needs to be regularly maintained to ensure accurate web performance measurement.