
7 successful A/B test experiments

Too many companies still make decisions based solely on gut feeling instead of actionable insights. Don't be one of them: draw your conclusions from valuable insights and data that speaks for itself.

We covered the basics of A/B testing in our newsletter a few weeks ago. Briefly put, A/B testing compares two variations of a website, mailing, or other asset: you show 50% of your audience the original version and the other 50% the variation. You can then see which version performed best before making the final change.
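How you split the audience matters: each visitor should be assigned to one version at random and keep seeing that same version on every visit. One common way to do this is to hash a stable visitor ID. Here is a minimal sketch in Python (the experiment name and visitor ID are made up for illustration):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically assign a visitor to 'A' (original) or 'B' (variation)."""
    # Hash the visitor ID together with the experiment name, so the same
    # visitor can land in different buckets for different experiments.
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    # The hash is roughly uniform, so taking it modulo 2 splits
    # visitors 50/50 between the two buckets.
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A given visitor always gets the same answer for the same experiment.
print(assign_variant("visitor-42", "product-page-colors"))
```

Because the assignment is derived from the ID rather than stored, it stays consistent across visits without any extra bookkeeping.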

You can put every small detail of your website through an A/B test, from buttons to pictures to your entire shopping cart. The most important rule is to change only one thing at a time: if you change several things at once, you can't tell which change actually had an impact.
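Deciding what "actually had an impact" also means checking that the difference you measured isn't just noise. A standard tool for comparing two conversion rates is the two-proportion z-test; here is a small sketch (the visitor and conversion counts are made-up numbers):

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z-score and two-sided p-value for the difference
    between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation.
    p_value = erfc(abs(z) / sqrt(2))
    return z, p_value

# Made-up example: 10,000 visitors per variant.
z, p = two_proportion_z_test(conv_a=400, n_a=10_000, conv_b=460, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real effect
```

Run the test until you have enough visitors for the result to be significant, rather than stopping the moment one version pulls ahead.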

Let’s take a closer look at some successful A/B test cases.

1. Swiss Gear’s product page test

Swiss Gear is a company that aims to make people's lives easier when traveling by offering products with smart design, great performance, and style.

Swiss Gear experimented with their product page to increase conversions. The original page combined a lot of red and black, so no single element stood out from the rest and nothing specific caught the visitor's attention.

In the variation, they highlighted only the most important elements in red to make them stand out from the rest of the page. Compared with the original, the 'special price' and 'add to cart' sections immediately catch your attention, and important information is easy to find.

These small changes resulted in a 52% rise in conversions, and even a spectacular 132% increase during the holiday season.

2. Sony VAIO’s banner test

Sony experimented with the banner in its VAIO laptop ads to see which version worked best for the campaign and produced the most shopping cart adds.

The results of the variant were even better than expected: it increased the CTR by 6% compared with the control version, and shopping cart adds increased by as much as 21.3%. With a brand as big as Sony, it's not hard to see how much value they generated by simply A/B testing their banner.

Interestingly, research conducted by Sony had led its marketers to believe that emphasizing customization would make customers turn down the offer, because it might be seen as an obstacle during the purchasing process. The A/B test proved that hypothesis completely wrong.


3. Codecademy’s pricing test 

Codecademy is an online interactive platform that offers free coding classes in 12 different programming languages. The free classes are mainly aimed at beginners, while the pro plan suits people who want to go deeper into coding than just the basics.

Codecademy experimented with two variations of the pricing display on their website.

They tested the 'Rule of 100' psychological principle on their pricing page. The rule suggests that for discounts worth more than $100, the absolute dollar amount looks bigger to customers than the equivalent percentage, while below $100 the percentage looks bigger. Because the savings on their annual plan were above $100, they ran an A/B test showing customers the number of dollars they would save rather than the percentage.
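If you wanted to apply the same rule programmatically, the display logic is simple. A sketch with made-up prices (these are not Codecademy's actual numbers):

```python
def discount_label(original: float, discounted: float) -> str:
    """Apply the 'Rule of 100': show dollars saved when the savings exceed
    $100, otherwise show the percentage, whichever feels larger."""
    savings = original - discounted
    if savings > 100:
        return f"Save ${savings:.0f}"  # "$150" feels bigger than "31%"
    pct = 100 * savings / original
    return f"Save {pct:.0f}%"          # "25%" feels bigger than "$20"

print(discount_label(480, 330))  # Save $150
print(discount_label(80, 60))    # Save 25%
```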

This test resulted in an increase of 28% in annual pro plans and a small increase in overall page conversions.


4. InsightSquared's form test

InsightSquared is a B2B company that aims to make sales forecasting easier for other companies. They experimented with their forms to improve their conversion rate and generate more leads and sales.

The original form included several optional fields, which made it look longer than it really was and scared away many visitors, even though the extra fields were clearly marked as optional.

For their test form, the company decided to remove optional fields to see whether a less intimidating form would convert better. First, they determined what percentage of people left a phone number when completing the form; since only 15% of users did, they decided this optional field could be removed.

They tested the new form on their eBook download form and saw a 112% increase in conversions.


5. Matalan’s search bar test

Matalan is a British fashion and homeware retailer with more than 230 stores in the UK and 50 international franchise stores across Europe, Africa, and the Middle East. Matalan wanted to optimize its navigation so users could easily and quickly find the right products in its extensive range.

The first test they ran was to expose the search bar, aiming to improve search engagement and increase mobile conversions. The hypothesis was that a visible search bar would make it easier for visitors to find the products they wanted, leading them to express their intent, run more searches, and convert more often overall. The A/B test produced a 32% increase in searches on mobile and a 51% increase on tablet.

The brand then took the test further and experimented with moving the navigation bar to the bottom of its mobile pages to create an app-like feel. The hypothesis was that this would make the navigation easier to reach with one hand, letting visitors interact with the screen using just a thumb.

However, the test revealed that users actually preferred the traditional position at the top of the screen. 

As you can see once again, it’s important to put your hypotheses to the test before making final decisions. 


6. Århus Teater's call-to-action test

Århus Teater, one of Denmark's biggest and oldest theatres, wanted to experiment with the call-to-action button on its website to increase ticket sales.

The original call to action said “Køb Billet”, which translates to “Buy Ticket”, whereas the variation said “Køb Billetter”, which translates to “Buy Tickets”.

By changing the call to action from “ticket” to “tickets”, the theatre increased its ticket sales by 20%. A possible explanation is that people rarely go to plays alone; they usually go as a couple or in a larger group. “Buy Tickets” may therefore be clearer, since it signals that visitors can buy several tickets at once, or tickets for a different show than the one on the banner at the moment they visit the website.


7. Beckett Simonon’s webshop test

Beckett Simonon is an online-only brand that sells high-quality shoes and accessories at reasonable prices. They aim to reduce waste and produce their leather goods responsibly and ethically.

The brand wanted to showcase its ethical values alongside its products to see whether that would increase its conversion rate. Showcasing their values in between the products produced a 5% increase in conversions compared with the control version, which showed the products alone.

These are just a few examples of successful A/B tests and their results. There is no golden playbook, and which changes work varies from audience to audience. The key takeaway is to test your hypotheses before implementing them for good, so you know they actually deliver the result you're after.

