Using Comparison Testing to Improve Website Designs
What It Is
Comparison Testing, also known as A/B or Split Testing, allows you to improve your website design incrementally without doing a full-scale redesign. Some commercial websites run comparison tests on multiple versions of their Web pages simultaneously. But the easiest, least expensive way to comparison test is to make small, incremental changes to a single page.
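To illustrate the "multiple versions simultaneously" approach, here is a minimal sketch in Python of how a site might split visitors between two page versions. The function name and visitor-ID format are hypothetical, not drawn from any particular product:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a test variant.

    Hashing a stable visitor ID (e.g., a cookie value) means the
    same visitor always sees the same version of the page, so the
    two groups stay consistent for the length of the test.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same ID always maps to the same variant.
print(assign_variant("visitor-12345"))
```

Sequential testing, as described below, sidesteps this bucketing entirely: every visitor sees the same page, and you compare metrics across two time periods instead.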
Why It's Important
Most of us have experienced a major redesign effort: years of working with in-house or vendor design companies, endless testing and refinement, gargantuan work plans, and multi-year implementation periods.
Mammoth rebuilds require significant time and money and should be done rarely—only when you have data that justifies a complete overhaul. Evolving websites through continuous refinements over time is cheaper, easier, and more likely to succeed. Doing a "night and day" redesign can be extraordinarily risky, and can exact quite a toll if done wrong.
Example: USA.gov Comparison Test
During recent comparison testing of the USA.gov home page, staff used heat-mapping software to gather baseline metrics on the Web pages and to test incremental changes to the page design. There are many types of heat-mapping software, including Crazy Egg, Attention Wizard, and ClickDensity; many offer a free trial option.
The USA.gov staff also used WebTrends data to identify top tasks by way of the top search terms and most visited links.
If you plan to use Crazy Egg regularly, you must ask the Crazy Egg staff to disable persistent cookies. As a result, the Crazy Egg test results will not distinguish between first-time and repeat visitors, but the data you receive on where users click on your website will still be tremendously useful. This holds true for many other heat-mapping companies as well.
How to Implement
The basics of comparison testing are simple:
- Use Web analytics data such as search terms and most visited links to identify a top task you'd like to improve.
- Don't spend time refining secondary tasks until you know your top tasks are fully optimized.
- Examples of improvements include repositioning a well-trafficked feature; improving a link title; redesigning the A-Z Index; or moving and enlarging the search box.
- Identify measurable goals such as "Traffic for the FAQ link will increase 15 percent."
- Coordinate your test dates with the rest of the Web team. Avoid making other changes on the Web page you're testing for the duration of the comparison test.
- Run a baseline test of the Web page to gather metrics (clicks, visits, etc.) of the page as it currently exists.
- Change one element on the page. You may be tempted to test two or three features at a time; don't do it. You won't know how to attribute the results.
- Run a second test of the same duration, controlling all the other variables except the one you want to test.
- Gather the same metrics to compare performance.
- Compare and analyze the results; assess against goals.
- Crazy Egg offers several views of the page traffic:
- Heatmap: First, we looked at the Heatmap view for an overall traffic picture. Heatmaps are graphical representations of your site traffic (and they're kind of beautiful, too).
- Confetti: Then we looked at Confetti, which allowed us to drill into additional data on search terms, types of browser, window size, etc.
- List: Finally, we looked at the List view, which displays the traffic to page elements in percentages, from most visited to least visited. We used List data to determine our results and compare them to our numerical goals.
- If the change has been successful, make it permanent.
- Identify the follow-up tests you want to conduct.
- Examples of follow-up tests include removing a redundant link, and baselining a related page deeper in the site.
- Conduct follow-up tests, and repeat the process.
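Assessing results against a measurable goal like "traffic for the FAQ link will increase 15 percent" is simple arithmetic. A minimal Python sketch, using hypothetical click counts rather than actual USA.gov data:

```python
def percent_change(baseline_clicks: int, test_clicks: int) -> float:
    """Percent change in clicks from the baseline test to the follow-up test."""
    return (test_clicks - baseline_clicks) / baseline_clicks * 100

# Hypothetical numbers: 800 FAQ-link clicks during the baseline run,
# 940 during the test run, measured over runs of equal duration.
GOAL = 15.0
change = percent_change(800, 940)
print(f"{change:.1f}% change; goal met: {change >= GOAL}")  # 17.5% change; goal met: True
```

Because the two runs must be comparable, keep them the same length and avoid other page changes during the test, as noted above.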