Beamax, a Belgium-based company, manufactures and distributes projection screens for home cinemas and meeting rooms worldwide. The company wanted homepage visitors to go to a site dedicated to ex-demo and one-off items that are sold directly to consumers. (All other screens are sold indirectly through resellers.) They admit it is a bit odd to drive visitors away from your main page to another site, but they wanted to clear up warehouse space taken up by uncommon items.
A/B testing link colors
So, they decided to run a simple A/B test using Visual Website Optimizer. Just above the product images on the homepage, they put a standard link promoting the other website. It said:
Great deals on brand new and ex-demo screens here
To increase click-throughs on the link, they tested a red link (with the same text) because they felt it would outperform the standard blue they use. Plus, it’s something direct marketers use in “real” mail pieces too. As another variation, they transformed the link into something banner-like that they thought would have even more impact. Their hypothesis was that the banner version would be the sure-fire winner. See the screenshots below:
Which type of link got the most clicks?
So, any guesses on which version got the most clicks: blue, red, or the banner? Well, the red link and the banner both outperformed the blue link, and that wasn’t a surprise. But the eye-opening result was that the red link beat the banner. The improvement of the red link over the original blue link was substantial too: 53.13%. (Note that we have some other case studies online demonstrating how a red link outperforms the default blue one. Here are two examples: the PDFProducer case study and the GSM.nl case study.)
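For readers curious how that improvement figure is computed: the 53.13% is a relative lift, i.e. the variation's click-through rate compared against the control's. Beamax's actual click counts were not published, so the rates below are hypothetical, chosen only so the resulting lift roughly matches the reported number.

```python
def lift(control_rate: float, variation_rate: float) -> float:
    """Relative improvement of a variation over its control, in percent."""
    return (variation_rate - control_rate) / control_rate * 100

# Hypothetical click-through rates (the real Beamax numbers were not shared):
blue_ctr = 0.032  # control: standard blue link
red_ctr = 0.049   # winning variation: red link

print(f"Lift of red over blue: {lift(blue_ctr, red_ctr):.1f}%")
```

With these illustrative rates the lift comes out at roughly 53.1%, in line with the case study's reported improvement.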
Lessons learned: patience pays
Otto Tromm, CEO of Beamax, stresses the importance of waiting for statistically significant results. He says:
In the early stage of the test, the banner was the big winner. But, over time (when the results got more reliable), the red link outperformed the banner. That taught me not to jump to conclusions.
And it was tempting to declare an early winner, because the initial results seemed to confirm my gut feeling. The test proved me wrong, so it teaches you to stay humble too.
So would I implement a red link vs a banner blindly next time? No, I would test it!
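Otto's "patience pays" lesson has a statistical core: with small samples, an early lead can easily be noise. A common way to check this is a two-proportion z-test, sketched below. This is a standard frequentist check, not necessarily what Visual Website Optimizer computes internally, and all the visitor and click counts are made up for illustration.

```python
import math

def two_proportion_z_test(clicks_a: int, visitors_a: int,
                          clicks_b: int, visitors_b: int):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    rate_a = clicks_a / visitors_a
    rate_b = clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value via the standard normal CDF (using erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Early in a test: variation B looks almost twice as good, but the
# sample is small, so the difference is not statistically significant.
_, p_early = two_proportion_z_test(10, 200, 18, 200)
print(f"early p-value: {p_early:.3f}")   # above the usual 0.05 threshold

# Later, with far more visitors, a smaller relative gap is significant.
_, p_late = two_proportion_z_test(320, 10000, 490, 10000)
print(f"late p-value: {p_late:.6f}")     # well below 0.05
```

The early comparison shows a big apparent lift yet fails the significance check, exactly the trap Otto describes: the honest move is to keep the test running until the p-value (or your tool's equivalent confidence metric) settles.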
Visual Website Optimizer: how important was it?
Choosing the right tool is certainly important when you are doing A/B tests. Beamax chose Visual Website Optimizer for the job (just like thousands of other businesses). Here’s what Otto from Beamax has to say about the tool:
I am not a designer or coder, and we use Mod-x and CMS-defined templates for nearly all pages. So I neither want to call on experts for every test I do, nor do I want to mess up their work. Visual Website Optimizer made it easy for a non-tech guy to run the tests and keep our designer and programmer focused on their own projects.
With Google’s solution, it was a lot more work to implement tests, which is why I stopped using it. I just couldn’t get it all done myself, and that matters when you have an idea and want it implemented quickly.
Hope you liked this case study! If you have any comments or suggestions, we are all ears.