The biggest reason businesses run online campaigns is to earn conversions. The average conversion rate (the rate at which visitors turn into customers) today is around 1-2%, which means roughly 98% of users are likely leaving your site without converting. How do you make them stay and convert?
A/B testing is the popular solution.
Before we get to its definition, consider that Barack Obama raised $60 million in his presidential campaign by running simple experiments. Both Barack Obama and Mitt Romney had dedicated teams A/B testing during the 2012 presidential race. However, Obama's teams were running so many tests on their website that the Republicans were completely blindsided.
What is A/B Testing?
A/B testing, also known as split testing, is a method of comparing two versions of a web page to see which one performs better. You show page A and page B to comparable audiences, and the page with the higher conversion rate wins. Obama's 2012 campaign website is a classic example.
The campaign tested multiple variations of the 'media' section and the 'sign up' button. After showing these variations to 310,382 visitors, one combination clearly converted best.
The variation with the highest conversion rate featured the 'sign up' button. The winning combination kept changing throughout the course of the campaign, but Obama's teams were always ready to adapt. This testing discipline contributed to massive fundraising success for the campaign.
A/B testing can be just as useful for business as for politics, because researchers can segment the population and assign different treatments at random. When the number of cases (N) is very large, even very small differences can be detected. And in business as in politics, a small increment can make all the difference to success.
A/B testing on the web works the same way. You have two designs of a page: A and B. Page A is usually the existing page (the control), and page B is the new design you wish to test (the variation). You split your website traffic between the two versions and compare metrics such as conversion rate, bounce rate, and sales. In the end, you keep the page that performs best.
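The traffic split itself is usually deterministic rather than a coin flip on every page view, so a returning visitor always sees the same version. A minimal sketch of how a testing tool might bucket visitors (function and experiment names are illustrative, not from any particular product):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("A", "B")):
    """Deterministically bucket a visitor: the same ID always gets the same page."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# A returning visitor lands in the same bucket on every visit,
# while the population as a whole splits roughly 50/50.
print(assign_variant("user-42", "homepage-test"))
```

Hashing the visitor ID together with the experiment name keeps assignments independent across experiments, so being in group A of one test says nothing about your group in another.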
What do you test?
Almost anything on your website that affects user behavior or engagement can be tested. Your choice of what to test depends on your goals: if you are receiving complaints about part of the site, test a fix for it; if you want to increase sign-ups, try a new form or button. Questions like these can all be answered by running tests.
In the past, marketing departments hesitated to run tests because the process was costly. Today the tooling has matured, and teams can set up tests and analyze the results without waiting on engineers. A/B testing has thus become a powerful tool in marketing departments.
The best way to run A/B tests is to follow a scientific process:
Collect website data
Use a website analytics tool, such as Google Analytics, to find the problem in your conversion funnel. For example, you may identify the page with the highest bounce rate. Analytics show you which pages need optimizing; let's say your home page turns out to have a high bounce rate.
Build a hypothesis
Using the insights from your user-behavior analysis, build a hypothesis aimed at increasing conversions. The hypothesis should state why a proposed change will perform better than the current condition. Once you have a list of ideas, rank them by expected impact and difficulty of implementation.
Create variations
Using A/B testing software, make the change to that element of your website. You might change the color of a button, rewrite product descriptions, reorder sections of the page, adjust navigation, or customize a tool. Many A/B testing tools now include a visual editor that makes it easy to preview changes against the current version. Don't forget to QA your test to make sure it behaves just as you expect.
Run the experiment
Now kick off the experiment and wait for users to engage with your website. Each visitor is randomly assigned to see either the control or a variation of the page, and their interactions are measured, counted, and compared to assess the success of each change.
For example, the Obama campaign tested four buttons and six different pieces of media, which meant there were 24 combinations to test. Each visitor was shown one combination, and their behavior was tracked.
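Those 24 combinations are simply the Cartesian product of the two sets of elements. A quick illustration (the button labels and media names below are invented for the example, not the campaign's actual assets):

```python
from itertools import product

# Hypothetical variants standing in for the campaign's four buttons
# and six pieces of media.
buttons = ["Sign Up", "Learn More", "Join Us Now", "Sign Up Now"]
media = ["image-1", "image-2", "image-3", "video-1", "video-2", "video-3"]

# Every (button, media) pairing is a distinct page variant to test.
combinations = list(product(buttons, media))
print(len(combinations))  # 4 buttons x 6 media = 24 variants
```

This is why adding test elements gets expensive fast: each new element multiplies, rather than adds to, the number of variants that need traffic.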
Analyze the Results
Analyze the A/B test results and see which variation delivered the highest conversion rate. If there is a clear winner, go ahead and implement it. If the results are inconclusive and no meaningful difference emerges, go back and formulate a new hypothesis.
In the Obama campaign, the results made it clear that the sign-up button variation was converting best, so it was implemented.
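Whether a variation is a "clear winner" is ultimately a statistics question, not an eyeballing one. A common check is a two-proportion z-test on the conversion counts; the sketch below uses only the standard library, and the visitor and conversion numbers are made up:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the z score and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: control converts 200/10,000 (2.0%), variation 260/10,000 (2.6%).
z, p = z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")  # p < 0.05 suggests a real difference
```

If the p-value is above your threshold (0.05 is conventional), treat the test as inconclusive rather than declaring whichever bar is taller the winner.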
SEO and A/B testing:
Google permits and even encourages A/B testing, and runs A/B tests itself. Google has stated that A/B tests pose no threat to a website's SEO. The following practices, however, can create problems.
Cloaking: Cloaking is the prohibited practice of showing users one version of a page and search engines another. Google can demote a website that shows signs of cloaking, so do not jeopardize your rankings by abusing A/B testing tools.
Use 302 redirects instead of 301: If your test redirects the original URL to a variation URL, use a 302 redirect. A 302 is temporary, while a 301 is permanent, so a 302 tells search engines that the redirect will be removed after testing and that they should keep indexing the original URL rather than the temporary one.
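In code, the only difference between the two is the status code your server sends. A minimal sketch of the decision (the function name and paths are hypothetical, not from any framework):

```python
def redirect_for_test(original_path: str, variant_path: str, testing: bool):
    """Return (status_code, path_to_serve) for an A/B redirect.

    While the test runs, answer with 302 (temporary), so search
    engines keep indexing the original URL. A 301 (permanent) would
    tell them the page has moved for good -- wrong for a test.
    """
    if testing:
        return 302, variant_path   # temporary: index stays on original_path
    return 200, original_path      # test over: serve the page, no redirect

status, location = redirect_for_test("/home", "/home-variant-b", testing=True)
print(status, location)
```

Once the test ends, the cleanest move is to stop redirecting entirely and serve the winning design at the original URL.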
Run experiments only as long as necessary: Showing a variation of a web page to a large audience for longer than needed can be seen as an attempt to fool the search engines. Google suggests updating the site as soon as the test is done, and then removing the variation URLs from the index.
Use rel=”canonical”: If you run a test with multiple URLs, add the rel=”canonical” attribute on the variations pointing back to the original URL, so Google knows which version is the one to index.
A/B testing mistakes:
Even the experts make mistakes. Here are some common A/B testing mistakes that cost conversions:
Not testing everything: One of the biggest mistakes is failing to test each and every change made to your site. Sometimes the boss wants to change the website and doesn't care about conversions at all, convinced the feature will revolutionize the business; in the end, it may or may not. If the whole organization understands the importance of testing every change, the process stops being a hurdle and yields conversions and feedback at every step.
Testing mini conversions: Determine the end goal and measure conversions against it. If you don't put thought into this, you may waste energy, time, and money measuring micro-conversions that don't affect the end goal; for example, counting clicks on a sign-up link on a community help website while ignoring how many people actually complete the form.
Having too many variables: Testing many variables at once makes it hard to tell which change is responsible for the result. For example, if you change the headline, layout, form, and introductory video all at once, you won't know which of them is driving conversions.
Expecting to win big with one small test: A fairly common mistake is expecting a minor copy change to produce a major lift in conversions. That rarely happens; the change will move you a bit closer to success, but it won't win you a huge jump in conversion rate on its own.
Ending the test too soon: Most professionals suggest running a test for at least 7 days; that is a basic rule of the game. Better still, run the test until it reaches statistical significance. The problem is that people find it hard to wait: you might conclude the change is doing nothing and quit too soon, losing all the effort you put in earlier. Peep Laja of ConversionXL wrote about a test where the challenger variation was decreasing conversions by 89.5% after two days of testing, with a 0% chance of winning. At that point the client was ready to call it quits, but Peep recommended running a bit longer. Ten days later, the challenger was increasing conversions by 25.18% with a 95% chance of winning. (Quicksprout)
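A more principled stopping rule than "wait 7 days" is to decide the required sample size before the test starts, using the standard two-proportion formula. A sketch, where the baseline rate and hoped-for lift are example values:

```python
from math import ceil

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variant to detect a shift from rate p1 to p2.

    z_alpha=1.96 corresponds to a 5% two-sided significance level and
    z_beta=0.84 to 80% statistical power -- the conventional defaults.
    """
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a lift from 2.0% to 2.5% takes roughly 14,000 visitors
# per variant -- often far more than a couple of days of traffic.
print(sample_size_per_variant(0.02, 0.025))
```

The takeaway matches the Laja anecdote above: on most sites, two days of traffic is nowhere near enough data to judge a small lift, so early numbers swing wildly.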
The power of A/B testing:
Today it is possible to grow your business by running some simple tests on your website. You can grow your user base, increase conversions, develop a product that genuinely helps people, and shape an effective strategy. Testing makes a company aware of its customers' needs and helps it adopt better marketing strategies. A/B testing is an efficient, cost-effective way to optimize your web presence, and no marketing department should work without it.
Have you performed an A/B test before? Do you have any tips on using A/B testing effectively? Leave your responses in the comments below.
At SEtalks, we convince search engines to talk about our clients' businesses. Through our upbeat strategies and innovative techniques, we strengthen businesses' online presence. We work by the motto: “We win only when our clients do.”