The Difference Between "We Think" and "We Know"
Colors. Shapes. Placement. You don't have to be a marketer to know that presentation and layout make all the difference. But what is the best layout? It depends, so let's look at the data.
What is A/B testing?
A/B testing (also called split testing) takes two variations of the same webpage and tests which one performs better. Visitors are shown one of the two versions at random: one variation is the control and the other is the experimental version. The results are collected in an analytics dashboard, and by looking at the data you can determine whether the change made a difference in how users interact with the page.
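To make that concrete, here is a minimal sketch in Python of the two mechanical pieces: randomly assigning each visitor to a variant, and checking whether the difference in conversion rates is more than chance. The function names and the example numbers are hypothetical, and real platforms handle this for you; a two-proportion z-test stands in for the dashboard's analysis.

```python
import random
from statistics import NormalDist

def assign_variant(visitor_id, seed=0):
    """Randomly (but repeatably) assign a visitor to 'A' (control) or 'B' (experiment)."""
    rng = random.Random(f"{visitor_id}-{seed}")  # seeded so the same visitor always sees the same version
    return "A" if rng.random() < 0.5 else "B"

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """P-value for 'B converts at a different rate than A' (two-sided z-test)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical result: 1,000 visitors per variant, 100 vs. 130 conversions.
p = two_proportion_z_test(100, 1000, 130, 1000)
print(f"p-value: {p:.4f}")  # below the usual 0.05 cutoff, so the lift is unlikely to be luck
```

A low p-value is what lets you say "we know" rather than "we think": with these numbers the 3-percentage-point lift would rarely appear by chance alone.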
Why use it?
You may feel that you know what's best for your target market. Or maybe because you like something, you assume it will work. Sorry, but the data doesn't lie. No matter how smart you are, intuition alone can't match what real user data will tell you. Making a decision without any data behind it is setting your company up for failure. To avoid that, let the data answer your questions. Should we use a red button or a blue button? Should the banner be at the top of the page or the bottom? These are all questions that A/B testing can answer.
I'll prove it to you.
Do you remember the video game The Sims? The one where we did everyday things, but in video game form? I'll admit it: I played for a bit. I had a nice job, a house, and a couple of dogs. I had a lot going for me! EA Games is a successful example of A/B testing, specifically with its 2013 game SimCity 5, which sold 1.1 million copies in the first two weeks after its launch. Fifty percent of those sales were digital downloads, thanks in large part to A/B testing. The original layout looked like this.
Before
After
You might think that having the promotion at the top of the page would drive purchases, but this was not the case for EA. Instead, A/B testing gave them the answer that maximized their revenue. Their winning variation looked like the image above; notice the promotion at the top is gone.
Another success story is Upworthy, a website for viral content whose overall goal is to boost engagement and social sharing. As Upworthy grew, the team realized that although the site had a strong emphasis on social sharing, there was no obvious way for users to take the next step and actually share content. Their original page looked like this.
Before
After
Upworthy tested for just a few days and quickly found that the top-performing recommended-content module increased sharing by 28%. By listening to its users, Upworthy was able to dramatically improve overall site engagement.
These are just a few of the many success stories. If you are interested in A/B testing, here are a few tips for the process.
Best Practices
Come from an unbiased angle: This is a true experiment and should be treated like one. Coming in with preconceptions about what customers want will bias the results.
Test one change at a time: Testing more than one change at a time won't clearly tell you which change is responsible for the results.
Heat mapping: Use heat mapping to see which parts of the page get the most attention. This can be helpful when designing your experimental page.
Focus on your users: You are doing this for your users, so they will purchase your product or service. A/B testing shows you what they actually respond to, and happy customers make purchases.
Happy users turn into happy customers which drives success to make you happy!
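One more practical point behind the tips above: decide in advance how many visitors you need before calling a winner, so you don't stop a test early just because one version is briefly ahead. Here is a rough sketch of the standard sample-size formula, assuming a two-sided 5% significance level and 80% power; the function name and example rates are hypothetical.

```python
from statistics import NormalDist

def sample_size_per_variant(base_rate, lift, alpha=0.05, power=0.80):
    """Visitors needed per variant to reliably detect an absolute lift in conversion rate."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.96 for a two-sided 5% test
    z_beta = NormalDist().inv_cdf(power)           # about 0.84 for 80% power
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2) / lift ** 2
    return int(n) + 1  # round up to whole visitors

# Example: how many visitors per variant to detect a jump from 10% to 12% conversion?
print(sample_size_per_variant(0.10, 0.02))
```

Notice the trade-off: the smaller the lift you want to detect, the more visitors you need, which is why tiny tweaks often take far longer to test than bold ones.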
Use A/B Testing to officially turn your "I think" into "I know"