When people talk about increasing conversion rates in e-commerce, A/B testing is one of the tools that comes most readily to mind.
The method is easy to apply, and it allows us to understand consumers’ preferences and what triggers an increase in conversions.
You can experiment with the text of a call-to-action button, with an image on the store’s home page, or with the layout of a product page; in fact, any change to a point of interaction with the public can affect sales.
But is it really worth testing your e-commerce pages? Has this produced concrete results for other companies?
This article will answer these questions. We will talk about the importance of A/B testing and give 8 real examples of the application of these experiments. Read on!
# The importance of A/B testing
Have you ever heard of a page on a website or online store with a 100% conversion rate? You can search all you want, but you won’t find a single example of perfection.
As long as this is the case, A/B testing will always be necessary, because the conversion rate can always get better.
It’s all a question of understanding what it is that makes the shopping experience more enjoyable for the customer. And it is the customers themselves who supply the answer, because the results of the tests show how visitors to your website behave.
And this is what makes A/B testing so effective. Two samples of similar or identical numbers of real people are tested simultaneously. This isn’t a matter of guesswork; it’s an empirical experiment.
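As an illustration of how such an experiment can be evaluated, here is a minimal sketch (the visitor counts and conversion numbers are hypothetical) of a two-proportion z-test in Python — a common way to check whether the difference between two variants is statistically significant rather than random noise:

```python
from statistics import NormalDist

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?
    Returns (z statistic, two-sided p-value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical 50/50 split: 5,000 visitors per variant
z, p = ab_test_z(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # variant B wins at the 5% level if p < 0.05
```

With real traffic, you would let the test run until the planned sample size is reached before reading the result; stopping early as soon as the p-value dips below the threshold inflates the chance of a false positive.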
Since we’re talking about real data, here are some statistics that demonstrate the effectiveness and popularity of these tests. The figures come from the Conversion Rate Optimization Report 2015, issued jointly by Econsultancy and RedEye. Just look:
- 58% of the companies were already using A/B testing as a way of improving their conversion rates;
- 35% were planning to use this tool in the near future (considering the date of the report, many of them must be doing so already);
- only 1% of the companies did not believe that A/B testing would help them improve their conversion rate;
- 63% of the companies thought that A/B testing would be easy to do.
# 8 real cases of the use of A/B testing
If these figures give a good idea of how important A/B testing is, it’s now time to reinforce the argument with real cases where the method has been applied.
We have selected 8 examples, as follows:
# 1. A change in the layout of the store window led to a 22% increase in conversion
The home page of the official store of the Vancouver Winter Olympics was the subject of an A/B test. In version A, in addition to the principal offers occupying most of the screen, there was a sidebar with highlighted products. Version B did not have this sidebar.
The result was as follows: version A had a conversion rate 22% higher than version B.
This highlighting of products was helpful to shoppers who needed suggestions about which products to purchase.
# 2. More than twice as many calls from an ad linked to a PPC campaign
A test on the website of California Closets compared two models of text for a landing page.
Version A had more specific copy that echoed the text of the ad that had brought visitors to the page. Version B, on the other hand, had more general wording, intended for anyone seeing the page.
Because version A was more segmented and more closely matched the ad, it resulted in 115% more phone calls than version B (the metric analyzed in the test).
# 3. Gain of 34% in sales when customer testimonials were shown
This test was carried out on the WikiJob website, with the aim of checking the impact that other customers’ comments would have on sales.
Version A was shown without testimonials; version B included them.
The result was as expected: a 34% increase in sales with version B. This shows how important the comments of other people are for consumers.
# 4. Increase of 47% in revenue per visitor simply by removing the zeros from the price
In a test done for a jewelry store, a product page showing prices with the cents spelled out was compared with one showing them without.
Version A displayed $19.00 and version B displayed $19.
The agency team that conducted the experiment expected that, with the zeros removed, purchasers would think the jewelry cost less.
And they were right: version B increased revenue per visitor by 47%. On top of this, there was a 9.7% increase in additions to shopping carts.
# 5. A change in the text of a CTA resulted in a 20% increase in the click rate
Remarketing campaigns are very common in e-commerce, as you know. So A/B testing can be applied to these strategies too.
To give you an idea, here is an experiment carried out during a campaign by the telephone operator Vodafone.
The test centered on an email about recovering abandoned shopping carts. In version A, the CTA button text was “Go back to Vodafone.nl”; in version B, “Complete order”.
Version B, which was more objective (encouraging shoppers to go back to where they left off), resulted in a 20% increase in the click rate.
# 6. More than 13% increase in sales from a banner with fewer images
In an experiment on the home page of the Tafford e-commerce store, people’s preference regarding the style of images was tested.
In version A, there was a banner with two pictures of clothing being promoted, while the other two squares of the same banner had no pictures of clothing. In version B, there were pictures of items of clothing in each square of the banner.
Version A showed the best results: +13% in sales. A cleaner layout may have encouraged people to take this step.
# 7. A change in the background of a banner gave a 30% increase in clicks
In another similar case, this time at Frontlineshop, the element altered was the background of a picture highlighting a shoe.
In version A, the background was black; in version B it was white. The first model was the winner, with 30% more clicks than the second one.
In addition to the contrast favoring the product, the shoppers’ profile showed a preference for the color black, which lent the trainers a feeling of sophistication.
# 8. Shortening the route for visitors gave better results for a fashion store
The Express online store carried out an experiment to test two models of home pages for its website.
Version A showed a woman on the home page, a CTA for checking out women’s offers, and a menu with the other categories.
Version B opted for segmentation. With a simpler layout, the screen was divided in two: a woman on one side and a man on the other. The idea was to segment hits on the home page, allowing each visitor to choose a path to follow.
Although the latter layout seems to make most sense, it was version A that got the better results. Version B, which offered segmentation as a first step, resulted in more than 10% fewer orders.
But why was this so? The answer may be in the fact that people want the shortest route to a purchase. Every obstacle placed in their way can discourage them.
The statistics and these real cases show how effective a method A/B testing is for e-commerce. If we assume that conversions can always improve a little, and that each element shown on screen makes a difference to customers’ decisions, investing in this type of experiment is well worth it. After all, if you don’t test, you will never know what influences visitors to your store to convert.
So please let other people know how important A/B testing has been for other e-commerce sites, and share this article on your social networks!