Moving From Web Analytics To Web Evaluation

Measurement

With this post, we’re introducing a series focused on “tools and tips” that can help you navigate the many online options available to bolster your communications activities and ensure they support your overall strategies. In the first of a two-part series, Louie Herr, a consultant based in Portland, Ore., discusses the “trinity” approach to web evaluations.  

Guest Post: Louie Herr

Like many people, I used to think that web measurement primarily revolved around getting a fix on how much time people spend on a site, what they click to read, the last page they view before exiting, etc., and then using those findings to make improvements that hold on to visitors longer. Then I read Avinash Kaushik’s Web Analytics: An Hour A Day, and it completely changed my attitude about evaluating websites.

Kaushik, author of the leading web analytics and research blog Occam’s Razor, argues in An Hour A Day that clickstream-based web analytics aren’t very useful on their own. Sure, the web team can look at things like exit pages and time on page and theorize about ways to improve the site. However, limiting web evaluation to a clickstream product like Google Analytics starves you of critical information. Are visitors to the website accomplishing the tasks for which they accessed the site? What challenges are visitors encountering as they attempt to complete those tasks? How satisfied are visitors with their experience on your site? Answers to these critical questions can lead to far better-informed outreach.

The goal of Kaushik’s approach is actionable insights — information that spurs changes that help your organization meet its communications goals. These insights come from combining three families of reporting: Behavior, Outcomes and Experience – what Kaushik calls the “trinity” approach to web evaluation.

Behavior – Behavior reports ask the question “What can we infer about visitors based on data yielded during their visits?”

Behavior data is clickstream reporting that comes from trackers like Google Analytics. For it to be most useful, you need to get to know your tracker and the reports it can generate. Kaushik suggests drilling deep into reports and, for example, analyzing site search keywords to determine what visitors look for when they visit, or viewing site overlays to better understand how visitors navigate the site.

An important note: clickstream data can only generate inferences, and different inferences can be drawn from the same data set. This is important to consider when reviewing this data with your team.

While the first part of the trinity is safely within “traditional analytics” territory, exciting things happen when you combine clickstream reporting with data from other sources.

Outcomes – Outcomes analysis asks the question “Is my website getting it done?” “It,” of course, is the goal of your website’s existence.

The first step in Outcomes reporting is to determine the goals you have for visitors interacting with your website. If you haven’t codified those goals, don’t feel too bad; you are likely in good company with many other readers. This evaluation will not be very effective until those goals are defined, though. Undertaking a project to improve outcomes on your website can be a great launching point for this discussion.

Conversion tracking based on clickstream data is one way to look at specific goal pages on your website. Even though the nonprofit world doesn’t have easy-to-identify conversions like “add to shopping cart” or “make a purchase,” there will always be “sign up for our newsletter” or “support our petition” pages. Setting these as conversions makes it easier to find improvement opportunities in the funnel of interactions leading to those goals.
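To make the funnel idea concrete, here is a minimal sketch of how step-to-step and overall conversion rates are computed. The page names and visit counts are invented for illustration, not taken from any real analytics export:

```python
# Hypothetical funnel for a "sign up for our newsletter" goal.
# Each entry is (page name, number of visitors who reached it).
funnel = [
    ("Homepage", 10_000),
    ("Newsletter landing page", 2_400),
    ("Signup form", 900),
    ("Thank-you page (conversion)", 450),
]

def funnel_report(steps):
    """Return (step name, step-to-step rate, overall rate) for each step."""
    rows = []
    first = steps[0][1]
    prev = first
    for name, count in steps:
        rows.append((name, count / prev, count / first))
        prev = count
    return rows

for name, step_rate, overall in funnel_report(funnel):
    print(f"{name:30s} step: {step_rate:6.1%}  overall: {overall:6.1%}")
```

A report like this makes the weakest link visible at a glance: here, only half of the visitors who open the signup form finish it, which is exactly the kind of improvement opportunity conversion tracking is meant to surface.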

Pop-up surveys can be an even better source of information on outcomes for visitors to your website. With a survey, you can ask users directly whether they accomplished the purpose of their visit, and, as a bonus, ask open-ended questions about what might have tripped them up.
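Once survey responses come in, summarizing them is straightforward. This sketch uses invented responses and field names (`completed`, `purpose`, `blocker`) to show how a task completion rate and a list of reported obstacles might be tallied:

```python
# Hypothetical pop-up survey responses: did the visitor complete their task?
responses = [
    {"completed": True,  "purpose": "find event details"},
    {"completed": False, "purpose": "donate", "blocker": "form kept erroring"},
    {"completed": True,  "purpose": "sign petition"},
    {"completed": False, "purpose": "donate", "blocker": "couldn't find the page"},
]

# Share of respondents who accomplished the purpose of their visit.
completion_rate = sum(r["completed"] for r in responses) / len(responses)

# Open-ended answers from visitors who got stuck.
blockers = [r["blocker"] for r in responses if not r["completed"]]

print(f"Task completion: {completion_rate:.0%}")
print("Reported blockers:", blockers)
```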

There are many free and affordable tools to survey visitors to your website. Use of web surveys doesn’t have to be limited to evaluation of your website, and can also be employed to inform other decisions.

Experience – Experience analytics ask the question “Why are visitors behaving in this way?”

People on the web team can infer all they want, but your website is designed as it is because of their collective ideas about how a website should function. Those same ideas will inform their analysis of the site. It is impossible for them to adopt a user’s perspective. They have too much skin in the game.

Luckily, many methods to retrieve visitor perspectives are available. Surveys can offer experience data in addition to outcome data — further reason to use them in web evaluation. Heuristic usability reporting, where a usability expert reviews your website for deficiencies, can also be helpful. While it doesn’t draw on actual visitor perspectives, heuristic analysis can still catch many issues and spur a lot of positive improvements.

The ultimate in experience testing is participant-based usability analysis. Whether conducted in a laboratory setting or in the visitor’s natural use environment, usability testing provides data of unrivaled richness. In addition to paper-based reports, deliverables will typically include screen activity videos which reveal the way visitors actually use your website. Their behaviors are often surprising and will almost certainly depart from your expectations.

Many discount the idea of usability testing immediately due to assumptions about cost. In fact, a variety of tools are now available that make usability studies possible at many price points.

More Than The Sum – You probably have at least a high-level understanding of what is happening on your website from clickstream tracking. This is only the beginning of the possibilities for web evaluation. Drilling deep on behavior data and adding in outcomes and experience tracking can yield unexpected insights that can help your organization online and off.

Other Considerations for Your Web Evaluation Strategy

Kaushik makes many other useful, high-level recommendations for analytics strategies early in An Hour A Day. My favorites:

The 10/90 Rule – Spend 10 percent of your web evaluation budget on tools and 90 percent of it on people. Deriving meaningful insights from the mass of reporting noise requires a skilled mind, so make sure you find one (or cultivate one). The 10/90 rule is also a good check against spending too much money on a reporting solution. A standard hits/runs/errors report from even a paid analytics vendor may not be very useful. No piece of software exists that will deliver actionable suggestions on its own, without human involvement. Someone is going to have to do some work.

Understand the Reports – How does Google Analytics differentiate between a visitor and a unique visitor? How exactly does the online survey function? What determines how often it is displayed? You need to understand the methodology of each evaluative tool well enough to convince skeptical team members of its effectiveness. Be prepared to address concerns about data quality. Kaushik suggests that you “assume a level of comfort with the data” that is considerate of how it was collected. If your team decides that Google Analytics is an 80 percent honest representation of user behavior, then everyone will be on the same page.
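The visitor-versus-unique-visitor distinction comes down to sessionization: one person can generate several visits. Here is a simplified sketch of the common rule that a gap of more than 30 minutes between hits starts a new visit (the cookie IDs, timestamps, and 30-minute threshold are illustrative assumptions, not Google Analytics’ exact methodology):

```python
from datetime import datetime, timedelta

# Toy hit log: (visitor cookie ID, timestamp). Data is invented.
hits = [
    ("cookie-A", datetime(2024, 1, 1, 9, 0)),
    ("cookie-A", datetime(2024, 1, 1, 9, 10)),  # within 30 min: same visit
    ("cookie-A", datetime(2024, 1, 1, 14, 0)),  # long gap: new visit, same visitor
    ("cookie-B", datetime(2024, 1, 1, 9, 5)),
]

SESSION_GAP = timedelta(minutes=30)  # common sessionization rule

def count_visits_and_uniques(hit_log):
    """Sessionize hits per visitor: a gap over SESSION_GAP starts a new visit."""
    last_seen = {}
    visits = 0
    for visitor, ts in sorted(hit_log, key=lambda h: (h[0], h[1])):
        if visitor not in last_seen or ts - last_seen[visitor] > SESSION_GAP:
            visits += 1
        last_seen[visitor] = ts
    return visits, len(last_seen)

visits, uniques = count_visits_and_uniques(hits)
print(visits, "visits by", uniques, "unique visitors")
```

Walking a skeptical team member through a toy model like this can make the 80-percent-honest framing easier to accept: the numbers are defined by rules, and the rules have edge cases (cleared cookies, shared computers) that the team can acknowledge up front.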

Success will depend on communications goals – Goals for your website can vary widely depending on your organization’s focus. Look at Time on Site, for example. If you are an advocacy group engaging the public through video delivered by your website, a 10-minute average time on site is a fantastic stat. If your site aims to rapidly deliver information to reporters on deadline, it could be a terrible one.

It’s best to learn to fish – Or, at least, to learn to deal with the fish after they’ve arrived. Usability and web analytics professionals are expensive. The best strategy for small and medium-sized organizations is often to develop an understanding of the online tools available and apply those tools themselves with little outside support. This avoids the expense of an outside consultant and, with the many free and cheap tools available, follows the 10/90 rule. If a consultant is brought in, make sure the solution they provide is useful beyond the length of their contract. Don’t set up a single web evaluation. Instead, develop a rolling web evaluation protocol that delivers actionable recommendations — ones you interpret yourself — several times a year.

Additional Resources – Avinash Kaushik’s books Web Analytics: An Hour a Day and Web Analytics 2.0: The Art of Online Accountability and Science of Customer Centricity are both packed with guidance on how to get the most out of your web evaluation, and his Occam’s Razor blog is a great place to search if you’re stumped by something.

In my next column, I will review several free and affordable tools that can help your organization launch its own trinity evaluation strategy.


Louie Herr writes, consults, runs, and produces audio. He is based in Portland, Ore.
