
Top 7 split testing blunders you must avoid

Posted in A/B Split Testing, How To on July 5th, 2011

This is a guest post from Jeremy Reeves, a freelance copywriter obsessed with split testing everything from emails to landing pages, sales letters and upsell processes. You can find more information about Jeremy’s split-testing services.

Split testing sucks!

Let’s face it.

Testing kind of sucks. For most people anyway.

It’s hard to learn… takes a lot of time… and typically requires you to hire a split-testing service provider like me to get the job done right (unless you’re a testing-obsessed fanatic like I am and love doing it!).

But – you also know the financial rewards of it or you wouldn’t be reading this blog :)

And you know that testing can be the single most profitable investment you can make in your business.

But in order to make that happen… you or the person doing split-testing for your business must be aware of the BIGGEST mistakes I see people making when split-testing.

So in this post I’m going to reveal the top 7 split-testing blunders you MUST avoid if you want maximum results in your business.

This is a long and information-packed post… so grab a cup of coffee… sit back… and let’s get started.

Testing Blunder #1 – Testing Whispers Instead Of Screams

What are “whispers” and what are “screams”?

Whispers – These are tests such as changing the font, changing the background color, changing the color of a button, testing something in the footer or very low on the page… and basically anything that has no (or very little) chance of getting you a BIG boost in conversions.

(And yes, I realize that “sometimes” some of these things can give decent boosts – but it’s very rare)

Screams – These are tests that are likely to either completely BOMB… or hit a grand-slam home run. They’re the ones that make you anxious to test, but can produce mid double-digit… sometimes triple-digit improvements. Screams are things like getting your copy completely rewritten, testing a radically different design, testing radically different price points (e.g. $47 vs. $97 instead of $49 vs. $49.95), or coming up with an unheard-of, radical guarantee.

Do screams take longer? Yes. Do they require more energy and resources from you? Yes. But is it worth putting money into hiring someone to test radically new copy or a design? Absolutely!

Testing Blunder #2 – Assuming Something Will Work

This is one that 99% of marketers (especially online) are guilty of. They “assume” something will work, so they don’t test it.

Here’s a hint.

Don’t assume anything!

I personally conduct split-tests every single day for my clients, and I’m STILL amazed at just how “weird” some of the tests come out to be.

Think your headline sounds “perfect” and nothing can beat it? I guarantee you – some other headline floating around in the ethers of your mind (or your copywriter’s mind) CAN beat it.

Think your pricing matches your market perfectly? I bet not. You don’t know until you test.

In love with your product name? I bet if you split-tested it using AdWords (which is a neat little trick, and easy to do) – you could find one that converts better. After all, when it comes to testing it’s about money, not ego. Go with whatever gets you the highest conversions.

Split-Testing Blunder #3 – Not Performing A “True” A/B Test

I’m assuming you already have Visual Website Optimizer (if you don’t, you’re nuts)… so I won’t pound this one in too much as you’re probably more familiar with testing than most marketers.

Here’s the mistake most people make.

They think “testing” goes like this.

Step 1 – You watch the conversions of “X” page for 1 week… or for 100 conversions (or whatever other metric).

Step 2 – You then switch that page and replace it with the new test you’re doing… and watch those conversions for 1 week, 100 conversions, etc.

Step 3 – You compare the data, see which had a higher conversion, and you now have a winner!

{Insert obnoxiously loud “eerrrrr” sound}

That’s NOT how you should be testing.

For a test to be accurate, you MUST have all variations running over the exact same time frame, to the same audience.

It’s that simple.
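In practice, “same time frame, same audience” means every visitor is randomly assigned to a variation the moment they arrive – a tool like Visual Website Optimizer handles this for you, but here’s a minimal sketch of the idea (the function name, variation names, and visitor IDs are purely illustrative, not any tool’s API):

```python
import hashlib

def assign_variation(visitor_id, variations=("control", "variation_b")):
    """Stable 50/50 split: hash the visitor ID into a bucket.

    The same visitor always lands in the same bucket, and both
    variations receive traffic over the same time frame, from
    the same audience.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variations[int(digest, 16) % len(variations)]

# Every visitor is bucketed on arrival -- no "week A, then week B".
print(assign_variation("visitor-1234"))
```

Hashing (rather than a pure coin flip per page view) also means a returning visitor keeps seeing the same variation, so their experience stays consistent.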

Split-Testing Blunder #4 – Not Running Your Test Long Enough

Here’s another one I see TONS of people make. Mostly, it’s an “ego thing” because they want to brag about the huge increase they got.

Here’s what I mean.

I’ve run plenty of tests where one variation was beating the control at 95% confidence after just a handful of conversions (let’s say 10, for example). Let’s also assume the losing variation only had 4 conversions.

Does that mean the winner actually beat the control by 250%?

Absolutely not!

I’ve seen tests like this where the losing version came back to beat the one that was winning in the beginning.

Why?

Because you need to run your tests long enough to get accurate results! You need to get past what I call “random coin syndrome”.

What I mean is this.

A fair coin should land 50/50 over time. But I want you to test this yourself. Go get a coin and flip it 20 times. In most cases, you’ll get noticeably uneven results – something like 15 heads and only 5 tails, or 12 tails and 8 heads.
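If you don’t have a coin handy, a couple of lines of Python show the same effect – small samples swing wildly, big samples settle down:

```python
import random

# Flip a fair coin 20 times -- small samples swing wildly.
flips = [random.choice("HT") for _ in range(20)]
heads = flips.count("H")
print(f"{heads} heads, {20 - heads} tails")

# With 10,000 flips the ratio settles near 50/50.
heads_big = sum(random.choice("HT") == "H" for _ in range(10_000))
print(f"{heads_big} heads out of 10,000")
```

Run it a few times: the 20-flip split jumps around, while the 10,000-flip count stays close to 5,000 every time.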

And the SAME thing happens with testing. In the beginning, with under 50 conversions or so (this is a rough number and depends on the conversion number), a lot of the results happen by chance.

My advice is this.

If you want your tests to be valid – try to get at least 50 conversions or so per variation before making any decisions – EVEN if you have a winner after 20 conversions.
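To put numbers on why an early “winner” can be noise, here’s a standard two-proportion z-test – a generic statistical check, not anything specific to Visual Website Optimizer. The visitor counts below are hypothetical, just to make the 10-vs-4 example from above concrete:

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test; returns the z statistic.

    Under the normal approximation, |z| >= 1.96 corresponds to
    roughly 95% confidence (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 10 vs. 4 conversions, assuming 500 visitors per variation:
print(z_test(10, 500, 4, 500))   # about 1.6 -- below 1.96, not significant
```

In other words, even a “10 beats 4” result that looks like a 150% lift is statistically weak evidence at this sample size – exactly why you keep the test running.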

Split-Testing Blunder #5 – Running Your Test Too Long

Oh yes, there’s a flip side to what I was just talking about!

Running your test for too LONG is a killer of momentum (and sales).

Here’s an example.

Let’s say you’re testing a “whisper” as I mentioned earlier. In this example we’ll say you’re testing your background color.

5 weeks after you launch the test… the variations have only a 2% difference in conversion.

What do you do?

Here’s my advice.

Depending on how much income that product is bringing in, a 2% difference may or may not matter to your bottom line. If it doesn’t, I highly suggest going with whatever is winning at the moment (even if it’s not at 95%+ confidence) and setting up a new test that is likely to give you a bigger bump.
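Whether a 2% difference matters is simple arithmetic. A back-of-the-envelope sketch (all figures here are hypothetical):

```python
# Is a small lift worth another month of testing?
monthly_revenue = 2_000          # hypothetical: what the product brings in
lift = 0.02                      # the 2% difference between variations

extra_per_year = monthly_revenue * lift * 12
print(f"${extra_per_year:,.2f} extra per year")  # $480.00
```

If $480 a year isn’t meaningful for your business, that testing slot is better spent on a “scream”.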

Bottom line: Don’t get addicted to testing results. If a test goes on for longer than normal without a nice increase in conversions, scrap that test and move on.

(The same is also true for lousy products, by the way. Get your emotions out of the equation!)

Split-Testing Blunder #6 – Not Learning From Your Test Results

Learning your market from your tests is CRITICAL to long-term testing success. It will help you dive into the mind of your prospect and give you a “knack” for knowing what they want.

Let me explain.

When I write copy or do split-testing for my clients… the first step I do is RESEARCH. In this phase I basically submerge myself into the middle of the market and try to understand their wants, needs and desires.

I then use what I discovered in my research to write about in the copy, or to create split-tests with.

And while that’s great and works like gangbusters… it works even BETTER when you can do your research based off what people are actually doing with their wallets.

Here are just a few examples of what you can learn about your customers.

Video: If you have several products, consider testing video against text. Test several variations of video/text together (AKA a “hybrid”). If you notice that video wins… you can then roll that test out and test it on all your other products. In “most” cases, you’ll have a winner each time.

Guarantees: Test several guarantees for one of your products. Let’s say you have 5 products and all of them have 30-day guarantees. Change the guarantee to 90 days for one of them, and see if you get an increase in conversions (you will). Let’s just say you get a 10% bump. Well… if you then roll that out into the other 5 products, you can count on a 10% bump on each of those as well… giving you a total of 50% in total increased conversions spread out throughout your site.

Copy Styles: You have several options when it comes to copy. Hypey versus soft. Long form versus short form. Professional versus personal. And while you may “think” you know what type of copy works best, it’s impossible to know until you test it. But here’s a tip to get started: in MOST cases, a mix of hype and soft… long form… and personal… works best. However, this all depends on your business, product, market and offer… so test it. Once you figure out the perfect formula – use it for the rest of your site!

Note: If you don’t think long form copy works anymore, read this post I wrote about the anatomy of a salesletter after you’re done with this article.

Split-Testing Blunder #7 – Letting Your Ego Get In The Way

Want to know if your ego is getting in the way of your testing results?

Take this litmus test.

Has anyone ever suggested running a test and you said “no, that won’t work”… or “no, my customers won’t like that”… or “no, that will make us look bad”…?

If so, your ego is getting in the way.

Sometimes, that’s fine.

If you aren’t trying to maximize profits and would rather secure a specific mental image in the mind of your prospect (such as a certain design style, etc.), that’s your decision.

A perfect example of this is using “doodles” in your copy. These range anywhere from handwritten notes in the margins… to using a crayon on a piece of direct mail.

MOST people would think they wouldn’t work – that “their customer is too ‘sophisticated’ for that”. Yet a company I know tested several variations of a letter they were sending out… and then tested one that was in the voice of the owner’s young son… written in crayon.

Keep in mind, this letter was going to a clientele that was “sophisticated”.

So how did it do?

Well, let’s just say… it was the single most profitable letter they EVER sent out in the lifetime of their business :)

Sometimes it’s hard letting go of your ego, I get that. But it’s necessary.

Hopefully this article has helped you ramp up your testing efforts. Or, if you’re hiring someone and having them do it… make sure they understand all of these blunders as well. It can literally make the difference between grand-slams and bankruptcy.

Paras Chopra

CEO and Founder of Wingify by the day, startups, marketing and analytics enthusiast by the afternoon, and a nihilist philosopher/writer by the evening!


6 Comments
Travis
July 5, 2011

“Let’s just say you get a 10% bump. Well… if you then roll that out into the other 5 products, you can count on a 10% bump on each of those as well… giving you a total of 50% in total increased conversions spread out throughout your site.”

That’s incorrect. A 10% increase in conversions across all products on your site gives you a total increase of 10%, not 50%. Perhaps #8 on this list should be “incorrectly interpreting results” ;).

Iris Isac
July 6, 2011

Great Post :) and I think I am guilty of Split testing blunder #4 :D


Morten
July 15, 2011

Testing sucks, yes, but it is so important if you want to have success.

Thanks, very good blog post

Lucy Spence
August 1, 2011

I have to disagree with the last part of point 4. In the majority of cases 50 conversions is still far too small to call a test. By the time you’ve factored in sample size and tolerance you’d have to have an enormous lift to justify calling any test that soon. As a rule of thumb I tend not to look at results until I have several hundred conversions on each variant. Most sites also get different behaviours on weekdays vs weekends so I’d suggest a minimum of a full week before declaring a result (unless something is going horribly wrong and you need to kill it).

Mogens Møller
August 6, 2011

I’m guilty of a lot of 1 and 2, a little of 4 and way too much of 7 :-D

Thank you Jeremy and Paras for this fantastic post!

