A/B testing is a method used to compare one design against another in order to maximize conversion rates. It was popularized in the marketing world, where emails, ads, and fliers are tested to see which yields the best returns. You can have more than two variations and can test almost anything, from font sizes to copy to background colors or images. It’s all up to you.
How Does It Work?
A/B testing is grounded in statistics. For example, an A/B test could direct 100 people to page A and another 100 to page B to see whether a red or a blue button is the better choice. If only 30 percent click the button on page A, whereas 45 percent click the button on page B, then B outperforms A.
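To make the comparison concrete, here is a minimal sketch of how you might check whether the gap between the two pages is statistically meaningful. It uses a standard two-proportion z-test built from the Python standard library; the numbers match the example above (100 visitors per page, 30 clicks on A, 45 on B), and the function names are my own, not part of any A/B testing tool.

```python
# Two-proportion z-test sketch for the A/B example above.
from math import sqrt, erf

def conversion_rate(clicks, visitors):
    return clicks / visitors

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a = clicks_a / n_a
    p_b = clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    return (p_b - p_a) / se

def p_value(z):
    """Two-sided p-value from the normal CDF."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

z = two_proportion_z(30, 100, 45, 100)
print(conversion_rate(30, 100))   # 0.3  (page A)
print(conversion_rate(45, 100))   # 0.45 (page B)
print(round(z, 2))                # 2.19
print(round(p_value(z), 3))       # 0.028 -- below 0.05, so significant
```

With these particular numbers the difference clears the conventional 0.05 significance threshold, but shrink the samples to 20 visitors per page and the same rates would not.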
Understanding results is pretty easy; getting significant results is the trick. For the conclusions to be valid, the sample size needs to be big enough to be statistically significant. What does significant mean? Simply that the sample size – the number of visitors – is large enough that the results are legitimate and not a product of chance.
To give you a basic idea: in rough terms, the higher the conversion rate you want to confirm, the more visitors you need. So if you want the conversion rate of a button to be significant at 10 percent, you need about 600 visitors; if you want it to be significant at a 30 percent rate, you will need roughly 1,400 visits. These numbers are by no means a norm, but they give you a general idea. The longer it takes you to get those visits and clicks, the longer it will take to get valid results. That sounds logical, but I’ve found it’s often overlooked. If your website gets a lot of traffic, like Amazon, results will come quickly. If your site gets maybe 50 views a day, it will take a little while.
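If you want to estimate your own required sample size rather than rely on rules of thumb, the standard two-proportion power formula gives a reasonable ballpark. The sketch below assumes the usual 95 percent confidence and 80 percent power; the example rates (10 percent baseline, 15 percent target) are made up for illustration and are not figures from this article.

```python
# Sketch: per-variant sample size needed to detect a conversion-rate lift,
# using the standard two-proportion power formula.
from math import sqrt, ceil

Z_ALPHA = 1.96  # 95 percent confidence, two-sided
Z_BETA = 0.84   # 80 percent power

def sample_size_per_variant(p1, p2):
    """Visitors needed per variant to detect a change from rate p1 to p2."""
    p_bar = (p1 + p2) / 2
    numerator = (Z_ALPHA * sqrt(2 * p_bar * (1 - p_bar))
                 + Z_BETA * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting a jump from a 10 percent to a 15 percent conversion rate:
print(sample_size_per_variant(0.10, 0.15))  # roughly 685 visitors per page
```

Notice how fast this grows for subtle changes: detecting a 1-point lift instead of a 5-point one pushes the requirement into the tens of thousands of visitors per variant, which is why small sites need patience.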
Taking It Easy
A/B testing is a godsend when it comes to optimizing campaigns, whether it’s a landing page or an email. However, there are a few things to be aware of. First, the more variations you have, the more time it will take to get results. It’s okay to try up to four variations of a page, but keep in mind that it could take four times as long to reach significant results. Additionally, you should not test big variations. Don’t test a big redesign; look at small changes, and only one element at a time. It’s going to be hard to determine what caused the conversion change if you are moving things all over.
How to A/B Test With Google Analytics
Now that you know how A/B testing works, it’s time to do some testing. There are a few services out there that allow you to perform these tests. There are paid options, but Google Analytics is free and not too complicated to set up. I’m going to assume you already have a Google Analytics account; if not, this is the time to set one up.
For the purposes of this tutorial, I will be redoing some copy on my portfolio’s home page and changing the background image.
Once you’re logged in to your Analytics account, select the website you’d like to subject to a test. From the website’s dashboard, find the Behavior tab in the left-side navigation and click to expand it. (It’s pretty far down, actually.) Then click the Experiments link.
You’ll then be taken to a new page, which shouldn’t have much in it. That’s okay; we’ll add a new experiment – an A/B test – now. To do so, click the “Create experiment” button.
Setting Up an Experiment
You will be taken to a new page where you should give your experiment a name.
There are various tests; you can see the impact changes have on bounce rate or on the number of page views, or see whether people click a specific link. It’s all up to you, and you can check multiple things at once. You do all of this in the objective section for the experiment. Create a custom objective to watch for unique data.
Going back to the content essentials, you can also set the amount of traffic that will be affected by your test. With 100 percent of your traffic selected, all of your incoming traffic will be subject to the A/B test, which gives you the quickest results. With only 50 percent, half of your traffic will be subject to the A/B test while the other half will see your original page at all times.
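Conceptually, the traffic split works like a coin flip at the door: first a visitor is either routed into the experiment or shown the original, then visitors inside the experiment are divided between the variations. The real assignment is handled by Google Analytics; this little sketch, with hypothetical function names, just illustrates the idea of a 50 percent split.

```python
# Conceptual sketch of a 50 percent traffic split (not GA's actual mechanism).
import random

def assign_page(experiment_share=0.5, variant_share=0.5):
    """Return which page a new visitor would see."""
    if random.random() >= experiment_share:
        return "original"  # visitor is outside the experiment entirely
    # Visitors inside the experiment are split between the variants.
    return "A" if random.random() < variant_share else "B"

random.seed(42)  # seeded only so the demo is repeatable
visitors = [assign_page() for _ in range(10000)]
print(visitors.count("original"))  # roughly half of all visitors
```

The trade-off described above falls out directly: the lower the experiment share, the fewer visitors land on the test pages, and the longer it takes to accumulate a significant sample.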
Configure the Experiment
Once you’ve got your objectives in order, it’s time to determine which pages will be tested. You’ll need to input the original webpage and provide the URL for each variation. To run a test via Google Analytics, you need separate live pages that are identical except for the changes being tested, since your visitors will be driven to them. Add as many variations as you need.
The last thing you need to do is add a Google Analytics code snippet to both pages so that GA can recognize them. Unfortunately, Google doesn’t know everything; just because you direct it to two pages doesn’t mean it will start recording analytics.
Each of these snippets is specific to your test. Follow the directions if you need help, but basically you just place it after the opening <head> tag. And you’re good to go.
Once you receive confirmation that Google Analytics is connecting with the code, you’ll be able to start the experiment.
Give your test a little time, but you’ll be able to see results almost right away as people visit the page. (You might have to wait 24 hours for Google Analytics to set itself up.) Check back to see how the test is doing; frequent monitoring can help you spot technical issues with the test. If one page is getting awesome results and the other isn’t, something might be wrong with the way the test has been implemented. Or if both pages are doing extremely well, the objectives may have been poorly set up. Keep monitoring the results and you’ll see how much longer you need to wait for significant results. Happy testing!