Nov 23, 2017
Build Better A/B Tests Using Your Customer Data
by Digital Marketing Institute
A/B tests provide the data you need to make strong business decisions that will increase engagement, sales and user loyalty.
In this article, we will outline why A/B testing should be the norm for your business, explain how you can implement it, and show you some examples of companies that found success by changing a single element after testing.
Why should you A/B test?
Simply put, A/B testing is good practice and just makes sense.
As a business, you wouldn’t put a product on the market without testing its configurations or uses. There is no reason your digital channels should be any different.
The other point is that A/B testing (sometimes known as split or bucket testing) is relatively easy.
As Jeff Grundy of MightyCall says:
In a nutshell, A/B testing is little more than experimenting with two types or variations of a landing page, ad text, a headline, call-to-action form or just about any other element on a website. Sometimes called split-testing, A/B testing allows you to create and display two variations of your page and/or its content to see which one attracts more interaction and conversions from your site visitors.
Grundy says that good A/B testing can lead to improved content engagement, easier analysis, higher and more valuable conversions, and reduced cart abandonment rates.
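The split-testing Grundy describes starts with dividing visitors between two variants. A minimal sketch of deterministic bucketing, hashing a visitor ID so the same person always sees the same variant, might look like this (the function and experiment names are illustrative, not any particular tool's API):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-cta") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the experiment name together with the user ID means the
    same visitor always lands in the same bucket for this experiment,
    while different experiments split visitors independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

In practice, commercial tools such as Optimizely handle this assignment for you; the point of the sketch is simply that bucketing must be stable per visitor, or repeat visits would contaminate the results.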
Simple tips to optimise your split tests
Your testing is aimed at forming habits for your customers or users. Likewise, testing should become a habit for you and your staff. Here are some tips to help you.
1. Don’t assume anything
Going into testing, have an open mind. Don’t assume that because something works for another company it will automatically work for yours. Be prepared to be surprised: sometimes text-heavy copy will outperform clear, concise copy, despite what you’ve been taught.
While what you know or have learned should be put into practice, it shouldn’t be assumed that it is bulletproof.
2. Ask qualitative questions
While understanding data is part of what we do as digital marketers, A/B testing is about understanding specific problems.
Survey companies such as Qualaroo specialise in asking simple questions of your users, such as “what problems are you having?” on websites. You can also ask qualitative questions on landing pages. Asking customers what issues face them when trying to clean the house can lead you to know which products should be front and centre on your website.
3. Be statistically significant
A/B testing, like any experiment, is only as good as the methods it uses. For your test to have any validity, it should have a large enough sample size and must reach statistical significance. Without a large enough sample size, you cannot be confident that your results will be replicated.
Your tests must also reach statistical significance; calculators such as the one from KissMetrics can tell you whether they have.
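What a significance calculator like KissMetrics' does under the hood can be sketched with a standard two-proportion z-test. This is a minimal illustration using only the standard library (the function name and numbers are illustrative, not taken from any specific tool):

```python
from math import sqrt, erf

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_* = number of conversions, n_* = number of visitors shown
    that variant. Returns the z-score and two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the rates under the null hypothesis that A and B are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: 200/1000 conversions on A vs 260/1000 on B
z, p = ab_significance(200, 1000, 260, 1000)
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference is unlikely to be chance; the sample-size point above matters because with too few visitors the standard error stays large and even real differences fail to reach significance.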
Case studies
It is one thing to tell you that A/B testing works; it is another to show it. Here are a few real-world examples.
Discovery
Back in 2015, TV giant Discovery wanted to drive clicks and overall views on some of its video content.
So Jeffrey Douglas, Director of Product at Discovery Digital Networks, came up with an idea that echoed one of the company’s big names. The experiment was called the “Ken Burns Effect”.
Burns is a renowned documentary maker whose source material often predates moving pictures. To give his films a cinematic feel, he pans slowly across still images. The Discovery team decided that when a user hovered over a video thumbnail, the same would happen.
The results, Douglas told Optimizely, were impressive. After testing the feature on 20,000 visitors, the company saw a 6% increase in video clickthroughs and an increase in ad viewability.
Douglas said that the company wanted to become a place that builds a backlog of A/B experiments.
“Ultimately, we want to be in a place where we’re making sure the developers are able to focus on work that has the highest potential for impact for our company.”
“Our product team can fail and make mistakes that are inexpensive, versus very expensive mistakes that end up using lots of resources.”
“Everything from QA to deploying code and so on. We can prevent developers from making changes that won’t ultimately matter… that’s really the kind of culture that we’re building here.”
Humana
Health insurance provider Humana found huge success with what appears to be a simple couple of changes.
The company tested two banners on its website. The first featured quite a bit of text, but promised users a large saving on prescriptions. When the copy was cleaned up, the picture changed and the headline pared down, clickthroughs went up 433%. Yes, 433%.
The company’s head of digital test and learn Mike Loveridge wasn’t finished, however.
He told Marketing Experiments that the company kept testing and refining the banner. A further change to the call to action’s microcopy, from “Shop 2014 Medicare Plans” to “Get Started Now”, increased clickthroughs by a further 192%.
Server Density
Hosted server company Server Density used an A/B test to guide a change to its pricing structure.
The original page saw customers presented with costs, whereas the experimental page focused more on the values offered.
The company saw two things: a significant (25%) drop-off in free signups, meaning fewer people trialling the product, but a 114% increase in total revenue.
Barack Obama
An older, but extremely important, example of A/B testing in action surrounds the 44th President of the United States.
During the 2008 Presidential campaign, Obama’s digital team tested a number of signup buttons and media.
As a member of the campaign’s analytics team later recalled: “Before we ran the experiment, the campaign staff heavily favoured ‘Sam’s Video’. Had we not run this experiment, we would have very likely used that video on the splash page. That would have been a huge mistake, since it turns out that all of the videos did worse than all of the images.”
Groove
Support software firm Groove applied a strategy we discussed earlier to overhaul its landing page: it asked questions. Frustrated by a low conversion rate on a product they were happy with, the team spoke to customers on the phone and asked new signups, via an autoresponder, why they had signed up.
They used the results to write the landing page copy using their customers’ own words and used that copy to inform how they redesigned the page. According to CEO Alex Turnbull, conversions nearly doubled after the redesign. Turnbull says that testing is an important part of his business strategy.
“Design and development are processes, not events. You’re only done when you’re ready to stop growing your business. There’s always more to test and tune.”
Conclusion
You can build the prettiest website of all time. You can have the best product ever designed. But you cannot know whether your website or store is performing at its peak without data to measure it against.
That is where A/B testing comes in. Think of every part of your website that can influence a sale or signup. Is it the best it can be? Can you test that to be certain?
As UX expert Rob Toledo writes, the only way to know is to test.
“A/B Testing is a powerful and essential approach for any website owner or marketer. After all, it’s just hard to know whether or not all of your efforts are paying off unless you’ve got the experimental data to back it up. As complicated as it may at first sound, mastering this approach will be worth it in conversions, revenue, and happy customers. So read up, pick your variables, and get going!”