You only need to look at the graph below from Google Trends to realize how important and common A/B testing has become for businesses globally since 2007. Its popularity is expected to increase even further as more companies experience higher conversion rates through A/B testing.
But why is this so? Is A/B testing something new? The answer is “no”, but the reason behind this surge in popularity is likely to be found in the democratization of A/B testing.
Back in the day, A/B testing was very IT-intensive, and essentially only people with a developer background were able to do it. In recent years, A/B testing has become easier to run and therefore more accessible to the masses. It is by no means something only developers are capable of; today, mere marketing mortals can do it too. That is surely one reason the graph has developed the way it has.
Why should you run A/B tests?
No website is perfect. Also, no two websites are identical. One of the recommendations commonly given when people talk about testing a website is, “Do a usability test.”
Usability tests, in combination with an analysis of your website data, offer tremendous value. Knowing where the weaknesses of your site lie and where users struggle to take the action you want them to take (click a button, enter an e-mail address, or buy a product) is a great start, but your testing efforts shouldn’t stop there.
Let’s say you are running an e-commerce platform and you analyzed the click-through rate on your sign-up button. You had high hopes, but the result is disappointing: engagement on the page is pretty good and people do click around, but very few click to sign up. On top of that, when you run user tests, people seem to have a hard time signing up. Now you have not one but two indications that this part of your website is not optimized!
How can you improve this weak spot and lift your click through rate?
The results from usability tests or analyzing website data can help you create a hypothesis. Following with the example above, your hypothesis in this particular case could be:
“If I made the call to action (CTA) more clear, then more people would sign up for the service I am offering.”
If you read usability blogs or look at results from usability tests, you’ll know that even forming a hypothesis is already a win, since there are still too many websites out there that don’t do any of this.
But remember: No two sites are identical. What works for a big international e-commerce platform with 10 million visitors a month might not work for a startup that has merely 3,000 visitors each month.
Audiences are different, the set-up of your pages is different, your conversion funnel is different, the content on your site is different. Copying other sites might work for some aspects, but in general I would not recommend it. The only way to confirm or reject your hypothesis is by testing it. We’ve talked a lot about running A/B tests, but what exactly are they again?
What is an A/B Test? Remind me, please.
Most people will already be familiar with the general concept of A/B testing. An A/B test, in its simplest form, lets you test the original version of your page (version A) against a modified version (version B).
In version B of your website, you make a change depending on the hypothesis you would like to test. For example, if your call to action did not lead to sign-ups, you might want to change that element first.
To test one variation against the other, your traffic is split at random: 50% of your users see the original page when visiting your site, while the other half sees the modified version.
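One common way to implement such a split (a sketch, not necessarily how any particular tool does it) is to hash each visitor’s ID into a bucket, so the same visitor always sees the same variant across page loads. The function and experiment names below are purely illustrative:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-test") -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing the user ID (rather than flipping a coin on every visit)
    keeps each visitor in the same variant across page loads.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"  # 50/50 split

# The same user always lands in the same variant:
assert assign_variant("user-42") == assign_variant("user-42")
```

Because the hash is uniform, roughly half of all visitors land in each bucket, and changing the split (say, 90/10 for a cautious rollout) is just a matter of changing the threshold.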
Here is an example using the hypothesis above: A website wanted to test its “Start selling” section.
Original (version A):
The team working on this website had established a clear goal before running the test: it wanted to increase the number of people who started selling on the site. More sellers meant higher commissions, which meant more money.
The team then created a version B of the website with only one clear call to action instead of several. The new design was more visual, less cluttered, and less text-heavy, and thus more enticing to new users.
So guess which one performed better?
After setting the test live, the team saw a staggering result: version B attracted far more clicks, with a 606% increase in people clicking the “Start Selling” button. A huge success. The team had found a solution to its problem of low sign-ups and could now implement the new design and take full advantage of the increase in conversions.
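A figure like 606% is a *relative* lift: the change in conversion rate divided by the original rate. The source doesn’t give the raw numbers, so the rates below are purely hypothetical, chosen only to show the arithmetic:

```python
def lift(rate_a: float, rate_b: float) -> float:
    """Relative improvement of variant B over variant A, in percent."""
    return (rate_b - rate_a) / rate_a * 100

# Hypothetical example: a click-through rate jumping from 1% to 7.06%
# is a 606% relative lift.
print(round(lift(0.01, 0.0706)))  # 606
```

Note that a large relative lift on a small base rate can still mean a modest absolute change, which is one reason to look at both numbers when reading test results.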
What would have happened without an A/B test?
Without an A/B test, the website team would have run the risk of building a second version of the site that performed just as weakly as the original, or only slightly better. That was the norm back when A/B tests were too difficult for most teams to run.
There would most likely have been ten different individual opinions on which design would best improve sign-ups. People would have based their design decisions on gut feeling and experience rather than hard data, unlike the team above, which tested the two versions.
Peter Drucker is often quoted as saying:
“If you can’t measure it, you can’t improve it.”
A/B tests help you do exactly that! Testing your site helps you unlock its conversion potential by listening to the data rather than to individual opinions. By doing so, you automatically get closer to your primary goal: improving the usability of your website.
What about you? Have you already run A/B tests on your website? Let us know what results you got by leaving a comment.
Fabian Liebig is in charge of marketing at Optimizely DACH and enjoys telling stories about how to base decisions on data. Optimizely was founded with the vision of making A/B testing accessible to everybody, technical and non-technical people alike, and has become the most installed A/B testing software in the world.