
Seven A/B testing mistakes you need to stop making in 2013

Posted in A/B Split Testing on January 4th, 2013

We’ve survived the annihilation predicted by the Mayans and made it into 2013. Ain’t that absolutely awesome? What isn’t so great is all the testing mistakes you almost certainly made in 2012, mistakes (or bad practices) that held back your A/B testing and Conversion Rate Optimization efforts.

For your benefit, here’s a quick recap of the sub-optimal practices you need to let go of, to truly achieve the gains promised by A/B testing.

1) Not calculating your sample size before starting the test

Many marketing folks still don’t calculate the number of visitors needed for a test before starting it. As Evan Miller points out in his post “How Not to Run an A/B Test“, you need to decide on the required sample size before the test begins. This ensures you don’t get bitten by the euphoria (or depression) bug the moment you see your first statistically significant result, and saves you from some bad decision-making.

To illustrate my point, most successful A/B test reports look like the one below.

A/B test report (click image to see larger version)

Notice the fluctuation in the beginning, when there wasn’t enough data. It’s quite likely that on Nov 28 statistical significance was reported, but acting on it would have led to an incorrect decision.
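If you’re skeptical about how bad this kind of “peeking” really is, you can simulate it. The sketch below is a hypothetical illustration (my own function name and parameters, not VWO’s methodology): it runs repeated A/A tests, where both variations are identical, and checks a two-proportion z-test at several interim looks. Even though there is no real difference, a worrying fraction of tests reports “significance” at some point:

```python
import random
from statistics import NormalDist

def aa_false_positive_rate(p=0.05, n=5000, looks=10, trials=100, seed=1):
    """Fraction of A/A tests (identical variations, no real difference)
    that show a 'significant' z-test result at ANY interim look."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(0.975)  # 1.96 for a two-sided 5% test
    step = n // looks
    false_hits = 0
    for _ in range(trials):
        conv_a = conv_b = 0
        for look in range(1, looks + 1):
            conv_a += sum(rng.random() < p for _ in range(step))
            conv_b += sum(rng.random() < p for _ in range(step))
            m = look * step                       # visitors per arm so far
            pooled = (conv_a + conv_b) / (2 * m)
            se = (2 * pooled * (1 - pooled) / m) ** 0.5
            if se and abs(conv_a - conv_b) / m / se > z_crit:
                false_hits += 1                   # declared a winner that isn't
                break
    return false_hits / trials

rate = aa_false_positive_rate()
```

In this simulation the false-positive rate lands well above the nominal 5%, which is exactly why you fix the sample size up front and only read the result once it’s reached.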

To avoid this situation, head on over to our Split Testing Duration Calculator, plug in your values and run your test for the duration suggested by the tool.
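If you’d rather compute it yourself, the standard two-proportion power calculation does the job (the function name and default values below are my own illustrative choices, not the calculator’s internals):

```python
from statistics import NormalDist

def sample_size_per_variation(baseline, lift, alpha=0.05, power=0.8):
    """Visitors needed in EACH variation to detect an absolute `lift`
    over a `baseline` conversion rate, with a two-sided z-test."""
    p1, p2 = baseline, baseline + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for 95% significance
    z_power = NormalDist().inv_cdf(power)          # 0.84 for 80% power
    pooled = (p1 + p2) / 2
    n = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
         + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2 / lift ** 2
    return int(n) + 1

# Baseline of 5%, hoping to detect an absolute lift to 6%:
visitors = sample_size_per_variation(0.05, 0.01)   # roughly 8,000 per variation
```

Divide that number by your daily traffic per variation and you have the minimum duration of the test in days.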

2) You listen to your boss or the HiPPO more than the data

Data has truly killed the HiPPO (Highest Paid Person’s Opinion) Star. While age and experience in a particular function have their own merit, A/B testing is about letting the customer do the talking, without actually talking. So as a business, you need to develop a culture of testing that values real analytics over personal opinion born of years of experience.

Data will kill the HiPPO

Experience should guide “the correct way to do it” and not “the exact way to do it”.

3) Disregarding test results and going by what “looks good”

To most of our readers, this might be baffling. After all, why test if you don’t intend to push it live? Unfortunately, it’s a situation we observe far too often. A test is run, a winning variation is found but someone who holds some sway within the organization says it “doesn’t look good”. Lo and behold, the original, losing version of the page stays.

Please leave the money on the table, cos that variation just don’t look good.
Image credit: emdot @ Flickr

4) Always expecting large results from small changes

One of the most commonly heard pieces of advice about A/B testing is to go for the small wins, or make small changes that get you big wins. This image we created aptly explains the problem with overusing that bit of advice.

Small wins making you miss the larger picture?

Overly relying on “simple tweaks” instead of undertaking major design changes means you’ll just reach what is called the local maximum. That is, you can tweak all you want but the current design has reached its maximum conversion potential. However, you could achieve much larger increases in conversion rate if you tried a completely new design.

5) Sticking to plain vanilla A/B testing

You are missing out on a lot if you’re not running targeted, personalized tests. When you engage with a visitor, you’re essentially starting a conversation, and a different conversation happens with each customer segment. For example, the Smashing Magazine article “The Ultimate Guide to A/B Testing”, written by Paras, sends a lot of traffic our way. To continue the conversation, we ran a targeted test on the homepage that changed the headline and sub-headline to more closely match where those visitors came from.

Visual Website Optimizer Smashing Magazine test

The result? For the goal “Visitor signs up for Trial Account”, the targeted messaging is currently outperforming the Control by 98.42%, with a 96% chance to beat the original.
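In case you’re wondering what a “chance to beat original” number means mechanically: one common way to compute such a figure is a Bayesian comparison, modeling each variation’s conversion rate as a Beta posterior and sampling from both. The sketch below uses that common approach with made-up numbers; it is not VWO’s actual implementation or data:

```python
import random

def chance_to_beat(conv_a, visitors_a, conv_b, visitors_b,
                   draws=100_000, seed=42):
    """Monte Carlo estimate of P(variation B's true conversion rate
    exceeds variation A's), using a Beta(1 + conversions, 1 + failures)
    posterior for each arm."""
    rng = random.Random(seed)
    wins = sum(
        rng.betavariate(1 + conv_b, 1 + visitors_b - conv_b)
        > rng.betavariate(1 + conv_a, 1 + visitors_a - conv_a)
        for _ in range(draws)
    )
    return wins / draws

# Hypothetical numbers: control converts 20/1000, variation 35/1000
prob = chance_to_beat(20, 1000, 35, 1000)
```

When that probability stays high after your pre-computed sample size is reached, you can call the variation a winner with a straight face.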

6) Not having a well defined Conversion Rate Optimization strategy

We see a lot of A/B tests run on hunches and without any planning. Someone uses part of a website, realizes that a particular UI element could do with some tweaking, comes up with a hypothesis and a goal, and either thinks up a variation or asks others in the office for input. Based on this, the in-house designer quickly creates one or two variations and the test goes live.

While this approach may provide small increases in conversion, you’re not truly harnessing the power of CRO. A great Optimization Strategy (PDF) starts with gathering extensive feedback from all stakeholders, feeding those insights into the appropriate steps of your sales funnel and testing each step methodically.

7) Not celebrating the way a true win should be celebrated

At the time of a major A/B testing win, have you felt your body, your soul, in fact your entire existence urge you to get up and do the Victory Dance? Well, I don’t know about you, but I feel like doing this dance every time one of Visual Website Optimizer’s tests produces a winner.

Here’s wishing you a great 2013 and loads of Victory Dances!

Follow the very interesting HackerNews discussion here.

Siddharth Deswal

I do marketing at Visual Website Optimizer.




6 Comments
Matt
January 4, 2013

Hey that’s my friend Nate in that video!

Nathan Peters
January 5, 2013

You guys have some great content on your site.

I would like to ask your permission to publish your RSS feed on http://www.nichespot.net in the Internet Marketing niche. Nichespot provides dofollow backlinks and rel=canonical protection from the duplicate content penalty. It also has a voting system where the most popular posts remain visible at all times.

Let me know what you think.

Sahil
January 5, 2013

Hey Sid! Thanks a lot for such a nice post on A/B testing mistakes to avoid. I have even seen people who just tweak the designs of their websites a little and then say that none of the versions of the page are getting them the desired results. I hope their doubts are clarified now. They may have to redesign the page completely if it looks awkward, rather than just testing a slightly tweaked design and seeing which one works better for them.

Siddharth Deswal @ Wingify
January 5, 2013

Sahil, thanks for the kind words!

Yup, that’s a fairly common problem – tweaking and tweaking until you hit “frustration” and then concluding “this stuff just doesn’t work”.

Siddharth Deswal @ Wingify
January 5, 2013

@Matt
Seriously? He’s quite the Internet celebrity, and it’s difficult to outdo the dancers he outdoes in this video http://www.youtube.com/watch?v=IixQox4rhEw

Sahil
January 5, 2013

Thanks a lot Sid. Your blog is the one stop site that we refer to know more about A/B Testing. You guys come up with awesome content.


