If running a website without analytics is like driving blind, then running a website without A/B testing is like cooking blind – you’re not going to drive off any cliffs, but you won’t know whether you’re about to make the best meal of your life or the worst.
A/B testing helps you help your visitors accomplish their goals.
It takes the guesswork out of design and copy decisions and helps insulate your work from the omnipresent confirmation bias. Your early tests are likely to become some of your highest-leverage growth activities.
Follow the guide below and you’ll have a list of experiments to run, the tools to build the test and validate your results, and a good method for documenting your results and engaging your team – in 2 hours.
This guide assumes a very basic understanding of A/B testing.
1. Build Your A/B Testing Spreadsheet
When it comes to website optimization, intuition is often wrong, which is why documenting the successes and failures of your A/B tests is important.
Managing your tests privately is like hoarding your favorite books. Don’t do that. Instead, lend them out, and your team can set better defaults for design and copy as they learn what works and what doesn’t. It will also get you and your team thinking about the actual goals of the website, the micro goals (newsletter signups) and macro goals (product purchases) of each page, and how they do – or do not – align with your business objectives.
To get started, download this A/B testing spreadsheet as a template for gathering and prioritizing experiments, recording the results, and finally evaluating your results with the chi-square test calculator. Simply input the number of trials and successes for each experiment, and it will tell you whether the observed effect (e.g. a change in conversion rate) of your treatment (e.g. a change in the size of a CTA button) is unlikely to be due to chance, at 95% confidence; in other words, it will tell you whether you should trust your findings. This is called rejecting the null hypothesis.
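If you’re curious what a calculator like this does under the hood, here is a minimal sketch of a chi-square test on a 2×2 table of successes and failures, using only the Python standard library. The function name and the example counts are hypothetical, and the spreadsheet’s own formulas may differ in detail.

```python
import math

def chi_square_2x2(trials_a, successes_a, trials_b, successes_b):
    """Pearson chi-square test on a 2x2 successes/failures table.

    Returns (chi2, p_value). A p_value below 0.05 means the difference
    in conversion rate is unlikely to be due to chance alone, i.e. you
    can reject the null hypothesis at 95% confidence.
    """
    fail_a = trials_a - successes_a
    fail_b = trials_b - successes_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pooled = (successes_a + successes_b) / (trials_a + trials_b)
    # (expected, observed) pairs for all four cells of the table
    cells = [
        (trials_a * p_pooled, successes_a),
        (trials_a * (1 - p_pooled), fail_a),
        (trials_b * p_pooled, successes_b),
        (trials_b * (1 - p_pooled), fail_b),
    ]
    chi2 = sum((obs - exp) ** 2 / exp for exp, obs in cells)
    # Survival function of the chi-square distribution with 1 degree
    # of freedom: P(X >= chi2) = erfc(sqrt(chi2 / 2))
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# Hypothetical experiment: 120/1000 conversions on the control,
# 160/1000 on the variation
chi2, p = chi_square_2x2(1000, 120, 1000, 160)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value comes out well under 0.05, so the lift would count as statistically significant.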
Next, host the spreadsheet on Drive and invite your team. Not only will it get people excited about A/B testing, it will keep ideas out of your inbox and on the spreadsheet.
Here are the values I find to work well for each column:
A. Status [Queued, Running, Completed]: Show people what has worked in the past and when their experiments are up and running.
B. Page [Ex: /company/about/webinars/product/checkout/]: The local path of the page where you will run the test.
C. Element [Ex: H1/Copy, Button/Color, Webform/Fields, etc.]: Briefly describe the element and the property to be manipulated. I structured this like a file tree so that it’s easy to sort and search by element and property.
D. Traffic (by day) [Ex: 10,000, 500, 90]: The number of visits the page receives per day.
E. Days to Complete [4, 21, 500]: Use this to prioritize your experiments.
Here’s how: using Traffic (by day), head over to ExperimentCalculator.com to estimate how long the experiment will take. You’ll need to estimate a conversion baseline (your conversion rate) for the page, and your expected improvement (from your baseline). You could use real data from your web analytics software or take a conservative guess.
F. Results [Ex: Winner – Variation #1, 95% lift in CTR]: List the result of your experiment as which goal was influenced and by what percentage. Documenting your experiments with screenshots is helpful.
2. Identify Your Top 5 Pages to Test
Identify your 5 most visited pages. You’ll use these for your first round of tests.
Why? Traffic matters. Visits per day is the largest factor in determining how long your experiment will need to run before you can know whether your change led to a significant difference.
For example, say you’re wondering whether a blue call-to-action would outperform your current grey call-to-action. You could test it on /about, which receives 150 visits per day; or, you could test it on /blog, which receives 900 visits per day.
Here’s why you run the test on /blog: with six times the traffic, /blog will reach the sample size you need in roughly one-sixth the time, so you get a trustworthy answer weeks sooner.
3. Calculate the Biggest Changes to Test
Determining the best test to run depends on your website and its goals. For example, a blog owner may want to increase time on site or page depth, whereas a consulting company may want to increase downloads of its industry report.
For your initial tests, prefer action goals such as clicks or pageviews over engagement metrics such as time on page or percent of page scrolled. Experiments affecting the latter need to be rigorously controlled, and the intent behind those metrics is often difficult to interpret.
For example, if the average visitor’s time on Page X increases 30 seconds, is that necessarily a good thing? What if your Control Page was more efficient at delivering information, and people find the Treatment more confusing, and thus spend more time trying to make sense of it? Did your bounce rate increase?
So, start with the low-hanging fruit of action-based A/B testing.
A few good places to start:
Calls to action:
a. Color: Is the color of your call-to-action the same color as an active item on the menu? Try making your CTA visually distinct from the other elements on the page.
b. Copy: Be specific. Instead of ‘Click Here,’ add context: ‘Register Now,’ ‘Download Whitepaper,’ or ‘Listen to the MP3.’ Almost never use ‘Submit.’
c. Image: Consider testing icons on your CTAs that indicate the action the visitor is about to perform or the type of offer they will receive (a whitepaper, an MP3, a newsletter, etc.).
Forms:
a. Label position: For familiar field types (first name, last name, address, etc.), labels placed above their fields typically outperform labels placed alongside them. If you run an eCommerce site, you need to check out this report.
b. Inline validation: Do users have to submit the form before getting feedback? If you require a phone number in a certain format, tell visitors as they complete the field; don’t make them submit the form only to receive an error message.
c. Number of fields: For each field you remove, you can expect a bump in your conversion rate. The downside: less information about your customers. Balance the fields you need for qualifying lead quality against your conversion rate goals.
Landing pages:
a. Social proof: Try adding testimonials and customer logos. You would be shocked by how many people fixate on these items.
b. Microcopy: Let users know exactly what will happen after they submit a form. Remove any anxieties.
c. Branding: Make sure your landing page, form, or microsite feels like your website. Reassure the visitor by using your logo and sticking to the styles (fonts, colors, etc.) that make them feel at home on the rest of your site.
4. Set Up the Test
You have two main options here: the free route, and the not-free route.
If you can easily create duplicate versions of the page you want to test, start with Google Content Experiments.
If you can’t easily create duplicate versions of the page you want to test, and you want to be able to launch tests quickly and easily, head straight for Optimizely and their $20/month bronze plan.
To avoid biasing your results by stopping early, select either a target end date or a target number of trials before you begin your experiment. It’s all too tempting to stop an experiment the moment you see a 200% lift that you’re afraid will plummet at any moment – but repeatedly checking for significance and stopping on a good result inflates your chances of declaring a false winner.
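To see why a fixed stopping rule matters, here is a hypothetical A/A simulation in Python: both variants convert at an identical 5%, so any “winner” is a false positive by construction. Stopping at the first significant daily peek declares a winner far more often than waiting for the planned end date. All numbers here are made up for illustration.

```python
import math
import random

def z_significant(n_a, s_a, n_b, s_b, z_crit=1.96):
    """Two-proportion z-test at roughly 95% confidence."""
    p_pool = (s_a + s_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return False
    z = abs(s_a / n_a - s_b / n_b) / se
    return z > z_crit

random.seed(42)
sims, peeked, waited = 300, 0, 0
for _ in range(sims):
    n = s_a = s_b = 0
    ever_significant = False
    for _ in range(20):  # 20 daily "peeks", 100 visits per arm per day
        n += 100
        s_a += sum(random.random() < 0.05 for _ in range(100))
        s_b += sum(random.random() < 0.05 for _ in range(100))  # identical!
        if z_significant(n, s_a, n, s_b):
            ever_significant = True
    peeked += ever_significant                  # stopped at any green peek
    waited += z_significant(n, s_a, n, s_b)     # judged only at the end

print(f"false positives if you stop at any significant peek: {peeked / sims:.0%}")
print(f"false positives at the planned end date:             {waited / sims:.0%}")
```

Waiting for the planned end keeps the false-positive rate near the 5% the test promises; peeking every day pushes it several times higher.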
Finally, test your page on different browsers. You can use a trial version of BrowserStack to check your page across browsers and operating systems. What looks great in one browser can look pretty terrible in another.
5. Analyze and Verify the Results
Whether you’re using Google Content Experiments or Optimizely, open the A/B testing spreadsheet, input your trials and successes, and the final row will tell you whether the effect is statistically significant.
Alternatively, you can use Evan’s Awesome A/B Tools, which are pretty awesome.
6. Implement and Share the Results
Record your results in the spreadsheet, send out a weekly recap, and implement your findings.
As a result, you’ll encourage people across your company to consider the purpose of their content and capture more value from each web visit.
Now that you have a few tests up and running, take the time to learn about the concepts and methodology behind A/B testing. Open Culture’s Math directory is a good place to start.