How to Carry Out an A/B Test

A/B testing is a simple enough concept but one that doesn’t quite get the traction that it deserves. Knowledgeable marketers and designers will utilize this practice as it provides valuable insights into visitor behavior and it can lead to an increase in conversion rates.

However, even with these marked benefits, many businesses either avoid A/B testing or don't understand how to conduct it successfully. People aren't properly aware of the process and, more importantly, they simply don't know how to use it to their advantage.

In an effort to help you all and to galvanise your marketing efforts, here’s a quick guide and overview to A/B testing.

 

A/B testing is…

Let’s begin with a definition so that we’re all on the same page. Effectively, you take two versions of a page or element, along with some sort of metric to monitor success. Then you simply determine which version is better.

To gain results you rigorously experiment with both versions at the same time. Using your metric you then judge which one functioned better and then apply that version in the real world.

Online A/B testing isn’t that dissimilar to experiments that you most likely conducted in science class as a kid. You’re attempting to determine the validity of a specific approach, and you’re judging its success (or lack of it) in relation to another.

Effectively, A can be read as the control sample and B as the variant. You then determine a way of measuring both approaches and define which one works best. You will need a fairly large sample size in order to carry out a test effectively, and you will need to leave it running for at least a month.
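To make the control/variant split concrete, here’s a minimal sketch of how visitors might be assigned to the two versions. The function name and seeding scheme are illustrative assumptions, not part of any particular testing tool; the key idea is that each visitor is assigned once, at random, and keeps the same version on every return visit.

```python
import random

def assign_variant(visitor_id, seed=42):
    """Assign a visitor to the control (A) or the variant (B).

    Seeding a random generator with the visitor ID keeps the
    assignment stable, so a returning visitor always sees the
    same version of the page.
    """
    rng = random.Random(f"{seed}:{visitor_id}")
    return "A" if rng.random() < 0.5 else "B"

# Split a batch of visitors roughly 50/50 between the two versions
visitors = [f"user_{i}" for i in range(1000)]
groups = [assign_variant(v) for v in visitors]
print(groups.count("A"), groups.count("B"))
```

Keeping the assignment deterministic per visitor matters: if someone saw version A on Monday and version B on Tuesday, their behavior couldn’t be attributed to either one.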


What to test

An effective A/B test works best if you test one small variable at a time. This can be a variety of things on your home page, landing page or any other page you want to test.

Such as:

  • Color – test one color against another to see which encourages action. This may sound like it won’t be effective, but it’s surprising how color affects consumer decisions, so don’t dismiss it.
  • Buttons – the color, shape, or placement of an action button can be tested
  • Navigation elements
  • Forms
  • Images
  • Headlines and content
  • Layouts

These days, there’s no real need to get into the maths of it all. You can find software to help you carry out split testing (you can also try GetResponse A/B testing), which makes life much easier than it used to be for this sort of thing.

If you’re testing something based on an action – such as signups – then the number of signups can be your metric for success. So depending on which sample group signs up the most, you’ll find the more successful variant of your new website and have an informed reason for choosing it.
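The signup example above boils down to a simple bit of arithmetic: divide each group’s signups by its visitors and compare. The counts below are hypothetical, used only to illustrate the calculation.

```python
def conversion_rate(signups, visitors):
    """Fraction of visitors in a group who signed up."""
    return signups / visitors

# Hypothetical numbers, for illustration only
rate_a = conversion_rate(signups=120, visitors=5000)  # control (A)
rate_b = conversion_rate(signups=150, visitors=5000)  # variant (B)

winner = "B" if rate_b > rate_a else "A"
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  winner: {winner}")
```

Note that the comparison is between *rates*, not raw signup counts – if the traffic split isn’t exactly even, raw counts alone would mislead you.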

This is of course a relatively simplistic example, but it shows just how easy A/B testing can be. The outcome is a better website and better audience engagement and retention rates, and it’s not really a costly undertaking in terms of time or finances. Effectively it works like a test audience for a movie. You can learn what works and what doesn’t from the people that’ll be using your product or service before that product or service goes live.

 

The testing process

Now that you understand the basic principles of A/B testing it’s time to determine what part of your business you’d like to test. This is perhaps a harder task than it seems as not every business will have new approaches that it wants to put under the microscope.

However, now is as good a time as any to consider your business and challenge any areas that may have become stagnant. Whatever you decide to test, though, is contingent on your business and its goals.

An A/B test has a very specific procedure and a clear outcome. Whatever the business process you need to test, an A/B test functions as an advanced version of writing out a list of pros and cons. But it is only as good as the information and the context that you provide for the test. So be as clear as possible and define exactly what it is that you want to test if you want the process to be successful.

 

An example

Recently, a partner of mine, Power Admin, carried out an A/B test on their site navigation. This was done with the thought in mind that, when given too many choices, people often don’t choose at all and simply leave. This effect was demonstrated in the famous ‘jam test’, which you can check out here if you’re not familiar with it.

When it comes to navigation, even the smallest of tweaks can and will have far-reaching results. The menu is generally the first place we look when we arrive on a site, as we’re looking to get where we want to go immediately.

This is something that’s discussed widely in UX (user experience) design. Not only should navigation signals be positioned in the area that the user expects, but they should also make use of color and text in order to ensure the best experience. Something as small as changing the button text to something ‘friendlier’ can have an effect, as found by this test in which an alteration garnered a 47.7% rise in clicks – certainly not to be sniffed at.

In Power Admin’s case, the existing navigation included a whole lot of textual links, as you can see from the image below.

 

[Image: Power Admin’s original navigation, with numerous textual links]

In this design, there’s a lot of text, which could prove too much choice for site visitors, consequently prompting them to leave. With this in mind, the guys decided to change the menu and carry out A/B testing to see if it made any difference to conversions.

 

[Image: Power Admin’s redesigned single-line navigation menu]

So the new menu looks much cleaner and, of course, there is a lot less in the way of choice. OK, there are still drop-down menus, but that’s something that can be addressed if the test proves successful, or a further split test can be carried out.

 

The Results

After running the test for a month and a half, it was found that, out of more than 10,000 visits, the site gained 12.3% more conversions from those who were presented with the single-line menu. This is despite the fact that the new page actually received fewer visits than the old one.

This translates into around 70-80 more conversions each month – also not a number to be sniffed at. Let’s face it, most of us would be happy enough with improving our conversion rate by that much each month.
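Before acting on a lift like this, it’s worth checking that the difference is bigger than random noise. A standard way to do that is a two-proportion z-test. The article doesn’t publish the raw counts, so the figures below are hypothetical, chosen only to resemble the scale described (roughly 10,000 visits split between two versions).

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference between two
    conversion rates likely real, or just noise?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts shaped like the article's scale
z = two_proportion_z(conv_a=400, n_a=5200, conv_b=430, n_b=4800)
print(f"z = {z:.2f}  significant at 95%: {abs(z) > 1.96}")
```

A |z| above roughly 1.96 corresponds to 95% confidence that the variant really does convert differently – which is why leaving the test running long enough to accumulate a large sample matters so much.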

 

[Chart: control vs. new menu conversion rates]

These results are encouraging and demonstrate very well how one element on a page can make all of the difference to your site and more importantly, your business. Whilst many people are quite daunted by running a split test, this illustrates how even the simplest thing can be altered and measured in order to deliver actionable results.

Further to that, it’s also an excellent example of keeping it simple, which is one of the most important aspects of testing. Too many people think that since they’re going to the trouble of setting up a test, they may as well test everything in sight – don’t do it folks.

 

Next steps – Your A/B test

Hopefully now you’ve realised that A/B testing isn’t quite the chore that it may have seemed to be. In fact it’s a very useful process and one that can pay big dividends. Knowing your audience, its behaviors, and the way that it relates to your business is incredibly important.

Make sure that you’re testing the control at every point and that you know exactly what your metric for success is before beginning. If you get lost or stuck there’s a very detailed, long, and considered article from Paras Chopra of Smashing Magazine. It’s worth a read and should answer any burning questions you have and perhaps even ones you’ve yet to ask. GetResponse has also prepared a great guide to A/B testing.

A/B testing can have marked benefits for a business and its infrastructure. Determine what you want to test, then give it a go. Your audience knows what it wants, so give them a choice and remember to listen and act on their opinions.

 
