All eCommerce website owners know how important the add-to-cart button is. They would do anything to get a visitor to click that button, because that's where the actual sales process starts. Because of its importance, many eCommerce retailers A/B test variations of that button to improve its click-through rate. That's precisely what Trinity Insight, a Visual Website Optimizer customer, did for their client Taylor Gifts.
Trinity Insight is a leading eCommerce consultancy that has helped numerous clients increase their conversion rates. (They are also one of our certified agencies.) We interviewed Nate Ende of Trinity Insight to talk about the A/B test they recently ran to improve add-to-cart click-through.
What was the conversion goal of the test?
The premise of this test was to try to improve on the add-to-cart goal of the product page (of TaylorGifts.com).
On which page did you run the test?
The dynamic product page. Here’s an example. (Editor’s note: they used Visual Website Optimizer’s advanced mode to create a test that runs across thousands of dynamic product pages on TaylorGifts.com)
Which part of page did you select for the test and what variations did you test?
Here’s what the original product page looked like:
We ran an A/B split test; however, we focused mainly on creating a buy box with all of the information relevant to the buying decision located close to the add-to-cart action. Here's the variation we tested:
Why did you think that the variations you created had better chances to beat the original? What were you actually testing in this test?
We felt that presenting this information could help people find what they needed to make a decision faster and in a more convenient location, therefore making them more likely to place the item in their cart.
What results did you get? Were you surprised by the results?
We experienced a 10% lift in the goal conversion on this page and the overall eCommerce conversion rate of the test subjects went from 1.53% on the control to 3.23% on the variation. Needless to say, our client is very happy with the result!
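A lift like the one above (1.53% to 3.23%) is typically sanity-checked with a two-proportion z-test before declaring a winner. Here's a minimal sketch in Python using only the standard library; note that the visitor counts are hypothetical placeholders, since the case study reports only the rates, not the sample sizes behind them.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    conv_a / n_a: conversions and visitors on the control.
    conv_b / n_b: conversions and visitors on the variation.
    Returns (z-statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical sample sizes: 5,000 visitors per arm, chosen so the
# rates roughly match the reported 1.53% vs 3.23%.
z, p = two_proportion_z(77, 5000, 162, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With samples of this size the difference would be highly significant (p well below 0.05); with much smaller samples the same rates could easily be noise, which is why tools report significance alongside the lift.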
Any lessons which can be derived from your test?
- Placing the standard information we all use to make buying decisions in one easy-to-scan location makes a lot of sense from a sales standpoint. In traditional retail, ideally, a salesperson would be right next to the item to answer questions about how much it costs and, if it's on sale, how much you're saving. They'd also let you know what other customers thought, since we so often rely on our peers to help us make decisions.
- Providing other valuable eCommerce information near this box may also be a good idea, such as how long shipping will take and what to do about returns. More detailed information elsewhere on the page is great for people making in-depth, research-based purchases, but the segment of customers who prefer to move quickly through the process will find a lot of value in an efficient buy box.
How valuable was Visual Website Optimizer for this test?