A/B testing is often not limited to a single conversion goal. In fact, your test variations usually affect multiple goals on your site, such as free trial signups, paid signups, and newsletter subscriptions. Measuring all of these goals matters because a variation may work brilliantly for one goal (e.g., it increases free signups) but perform worse on another (e.g., it decreases paid signups). If you don’t measure multiple goals for your A/B test, you are essentially flying blind and may end up making the wrong decision based on a single conversion goal.
A brilliant example of this is a recent A/B test by one of our customers, Guido Jansen (a Magento specialist). He runs Dutchento.org, the official Dutch community for the Magento CMS. The goal of his recent A/B test was to increase subscriptions to Dutchento’s newsletter and RSS feed. The call-to-action for subscribing sits in a box on every page of the site. See the control version of the subscription box below, and note that it gives visitors no explicit incentive to subscribe:
Guido tested a variation of this subscription box that included a title and a bullet list of benefits. Here is his hypothesis for increasing subscriptions:
People don’t just subscribe to a newsletter or newsfeed for nothing, you should convince them it has added value above just visiting the website. So what I wanted to test is if adding convincing reasons to subscribe would increase the newsletter and newsfeed subscription rate. I measured the impact of the convincing reasons on clicks on both the newsletter and newsfeed links.
Here is how his variation of the subscription box looks:
As expected, Guido saw a significant improvement in the newsletter click rate (+190.31%). However, the newsfeed click rate decreased (-44.46%), which surprised him (and us!). He expected the convincing reasons to affect both goals positively, but they apparently had a negative effect on newsfeed clicks.
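Guido ran his test in Visual Website Optimizer, which reports significance for each goal automatically. If you want a back-of-the-envelope check of whether a change in click rate on a goal is statistically significant, a two-proportion z-test is a standard approach. This is a minimal sketch; the visitor and click counts below are hypothetical, not Guido's actual data.

```python
import math

def two_proportion_z(clicks_a, visitors_a, clicks_b, visitors_b):
    """Z-statistic comparing the click rates of two variations."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled rate under the null hypothesis that both rates are equal
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Hypothetical example: 40 newsletter clicks out of 2000 visitors (control)
# vs. 116 out of 2000 (variation).
z = two_proportion_z(40, 2000, 116, 2000)
significant = abs(z) > 1.96  # 95% confidence threshold
```

Run the same test separately for each goal (newsletter clicks, newsfeed clicks) so a win on one goal doesn't mask a loss on another.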
It is not clear why clicks on the newsfeed link decreased, but we believe the benefits listed in the variation were so compelling that visitors chose to get blog updates via email (where they are sure to read them) rather than via an RSS reader (where they may miss them). A great way to get more insight would be to randomize the position of the newsletter and newsfeed links, eliminating any positional effect.
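The randomization idea above can be sketched in a few lines. The key detail is seeding the randomizer with a per-visitor identifier, so each visitor sees the same link order on every page view while the order still varies across visitors. This is an illustrative sketch, not how Guido's test was actually implemented; the `visitor_id` parameter and link names are assumptions.

```python
import random

def assign_link_order(visitor_id: int) -> list:
    """Return a per-visitor randomized order for the two subscribe links.

    Seeding with the visitor's ID keeps the order stable across that
    visitor's page views, removing any positional bias in aggregate.
    """
    links = ["newsletter", "newsfeed"]
    rng = random.Random(visitor_id)  # deterministic per visitor
    rng.shuffle(links)
    return links
```

Comparing click rates within each position group then tells you how much of the newsletter/newsfeed gap is due to the links themselves rather than their placement.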
Guido used Visual Website Optimizer for A/B testing and here is what he had to say:
Visual Website Optimizer was very valuable [for testing]. It’s the easiest A/B and Multivariate testing tool I know. It’s great not to be dependent on the development department to create and run your tests.
Whatever the actual reason for the decreased clicks on the RSS feed, one key lesson jumps out of this case study: always measure multiple conversion goals in your test. Relying on a single conversion goal hides a lot of valuable information. So, make sure you add multiple goals to your next A/B test.
Do you have more examples of using multiple goals in an A/B test? What are your thoughts on this case study? Please let us know by leaving a comment below – we’ll be happy to discuss!