Welcome to another 5-minute guide to web analytics. Why 5 minutes? Because you have a lot on your plate, and you shouldn’t have to wade through pages upon pages of posts to get to your answer. So why not just snap to it?
The Meaning of Annotation in Web Analytics
Annotation in web analytics is the act of documenting events that occurred on a given date for a website. The actual annotation can be done in a variety of formats, which we will get into later on in this article. What gets documented can be broadly categorized into marketing, analytics, technology, optimization, and external buckets. The purpose of annotation in web analytics is to make analysts more efficient by providing qualitative context to metrics. Let’s understand that in a bit more detail.
The Need for Annotation
The reason annotation in web analytics is so important is this: when a change in metrics occurs, the first question on everyone’s mind is “Why did it happen?” That answer can be hard to find when more than one person is involved in the marketing and upkeep of the site, and even harder when there is no record of the internal and external changes that could have affected metrics.
Here is a sample of the changes that happen with fair regularity across a site managed by a small team:
- Marketing: Starting and stopping of campaigns
- Analytics: Changes to definitions of dimensions and metrics
- Technology: Software releases and break fixes
- Optimization: Launch of A/B tests
- External: Promotions on competing sites
While these changes may seem innocuous to each team, their impact on analytics can be significant. Keeping an ongoing list of changes lets web analysts focus on insights and reporting instead of chasing down change lists after the fact.
Now the effort required to keep track of such changes is a function of the number of teams involved in making changes to the site. Some organizations have a highly evolved communication strategy while others allow functional units to set their own protocols. Let’s take a look at a few effective ways to annotate site and environment changes.
Types of Annotation Processes
1. Tool Driven Annotation
When the responsibility of annotating is limited to a small team, people tend to use the annotation features of the tools they already know. For example, Google Analytics, Adobe Analytics, and Piwik all allow annotation within their respective interfaces. The advantage of this approach is that the web analytics team that needs this contextual information the most has it readily available without having to integrate with any other tool.
The inherent limitation of this approach is that it requires the documenter to know the analytics tool. It works best for small organizations, or where the web analytics annotation is a pointer to a much deeper and richer annotation process already set up across the enterprise.
2. Shared Document Annotation
As teams get larger, the need for cross-functional communication increases as well. Often the easiest way to share notes is literally to share documents that contain the changes. Several tools and services allow this level of sharing; popular options include Box, Evernote, and Google Docs. To be clear, though, these aren’t annotation tools or services, and they don’t provide any special features to make annotation easier for collaborators. In such situations, the quality of annotation depends on how widely the process is adopted across the organization.
3. Highly Collaborative Annotation
On the extreme end of collaboration is the organization that leverages private social networks to foster collaboration across the organization. Microsoft’s Yammer and Twoodo are examples of such private social networks that offer interesting ways to collaborate and annotate. In this approach, annotation can be done by leveraging hashtags. For example, an annotation message could be “release 2015.5 is now live. Notes can be found here https://bit.ly/rlnotes #20150525 #sitechanges.” Now the task of finding changes for a particular date becomes a trivial hashtag search on the social network’s news stream.
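To illustrate why hashtag annotation makes retrieval so easy, here is a minimal Python sketch of that “trivial hashtag search.” The messages and helper function below are invented for the example; a real social network would expose this through its own search feature or API.

```python
# Sketch: finding all annotations for a date via hashtag search.
# The messages below are invented examples, not real data.
messages = [
    "release 2015.5 is now live. Notes here https://bit.ly/rlnotes #20150525 #sitechanges",
    "Summer sale campaign started #20150525 #marketing",
    "Fixed checkout bug in mobile view #20150601 #sitechanges",
]

def find_by_hashtag(stream, tag):
    """Return every message whose whitespace-separated tokens include the tag."""
    return [m for m in stream if tag in m.split()]

# Everything that happened on May 25, 2015:
for msg in find_by_hashtag(messages, "#20150525"):
    print(msg)
```

Because each message carries both a date tag and a category tag, the same search works for questions like “show me all #sitechanges” without any extra tooling.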
Of course, the quality of annotation in #2 and #3 above depends on the participation of the various teams. Given that, and the investment needed to equip the organization, “Tool Driven Annotation” remains the most popular option for annotation in web analytics.
So let’s review some of the best practices in annotation regardless of the tool or service used.
Best Practices in Web Analytics Annotation
In many cases, there is a character limit imposed on the size of the annotation. In addition, the tools don’t always support color coding of text of the annotation. As a result, it helps to have a highly defined format for annotating site changes.
Here are the factors to be considered when creating the format for annotation messages:
- Duration of change
- Type of change
- Depth of information
- Team responsible for change
- Web analytics assets impacted
- Type of adjustment needed
- Duration of change: Marketing campaigns and promotions have a start date and an end date. So do A/B tests. Omitting the end date is a common mistake analysts make when annotating. Pick a date format and apply it consistently in the single string describing the change. While brevity is key, make sure both the start and end dates include the year, month, and day.
- Type of change: Identify the big buckets of changes that happen on your site and use those to identify the type of change. A few common buckets are marketing, analytics, technology, optimization, and external.
- Depth of information: If the change can be described briefly, describe it in the annotation itself. Software releases typically carry more information than can possibly fit in a single annotation. Hopefully, your technology team stores the release notes online; if not, set up a small internal WordPress site and post the release notes there. Chances are your technology team will pick up the ball and run with it. Either way, include a shortened URL pointing to the detailed changes.
- Team responsible for change: This one will save you a lot of time. Just because the change was an A/B test, there is no guarantee that it was actually run by the conversion rate optimization team.
- Web analytics assets impacted: By assets, we mean the type of data in web analytics. For example, if the change was an analytics change, then did the definition of a dimension or metric or evar change? Make a note of the key change in as few words as possible. Remember, you are communicating to web analysts so they will understand this one easily!
- Type of adjustment needed: Once the change has been identified, you may want to list out how you had to account for the change in your reports. For example, if it was a rash of robots that hit the site, you will want to explain whether you used a spam filter to remove them.
Now that we have looked at the key things to mention in your annotation log, it is time to pull them together into a single string. Document the format somewhere outside the analytics tool (or in an annotation of its own) and you’re all set. For example, we at Bay Leaf Digital are fans of using ‘|’ as the delimiter, so our annotations look like this:
MMDDYY-MMDDYY | Type of change | Brief description of change | Person responsible | web analytics assets affected | adjustments needed to reports if any.
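To make the template concrete, here is a small hypothetical Python sketch that composes and parses an annotation in this pipe-delimited shape. The field names and sample values are invented for illustration; the field order follows the template above.

```python
# Sketch: composing and parsing a pipe-delimited annotation string.
# Field order mirrors the template above; sample values are invented.
FIELDS = [
    "duration",          # MMDDYY-MMDDYY
    "change_type",       # marketing / analytics / technology / optimization / external
    "description",       # brief description of the change
    "owner",             # team or person responsible
    "assets_affected",   # web analytics assets impacted
    "adjustments",       # adjustments needed to reports, if any
]

def compose(values):
    """Join named field values into one annotation string."""
    return " | ".join(values[f] for f in FIELDS)

def parse(annotation):
    """Split an annotation string back into named fields."""
    parts = [p.strip() for p in annotation.split("|")]
    return dict(zip(FIELDS, parts))

note = compose({
    "duration": "052515-060115",
    "change_type": "Optimization",
    "description": "A/B test on checkout button color",
    "owner": "CRO team",
    "assets_affected": "checkout conversion rate",
    "adjustments": "segment out test-variant traffic",
})
print(note)
print(parse(note)["change_type"])  # Optimization
```

One design note: because ‘|’ is the delimiter, the description field must never contain a pipe character; a quick validation step when composing annotations can enforce that.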
Hopefully, this overview has helped identify the need for annotations. For more information on how to use annotations in specific tools, check out our articles for Piwik, Adobe Analytics (TBA), or Google Analytics.