
Lean Newsletters: How Help Scout increased click-through rate by 17%

Posted in A/B Split Testing on January 30th, 2013

When it comes to getting your message heard, optimizing your email marketing efforts to improve conversions is of paramount importance.

Below, I’m going to discuss a simple tactic that I’ve recently used across 3 separate industries to improve click-through rates by as much as 17%.

(Yes, you read that correctly)

It’s founded on a simple psychological principle put forward by a professor at Columbia Business School: when we have too many choices, our minds struggle to make any decision at all, leading to action paralysis.

Let’s dig in!

When Choice is Demotivating

Before I get into my specific results, I wanted to discuss what’s happening behind the scenes with these improvements.

Sheena Iyengar is a professor and psychological researcher at Columbia Business School and the author of The Art of Choosing. In her best-known study, “When Choice is Demotivating,” she conducted the now-famous “jam test” at an upscale supermarket.

In this study, Sheena compared customer interaction and actual purchases between two different types of jam displays:

  1. The first display featured 24 flavors of jam
  2. The second display featured only 6 flavors of jam

The results?

While the 24-jam display had a significantly higher “interaction” rate (more customers stopping to take samples), only around 3% of those customers actually made a purchase.

This is in stark contrast to the 6-jam display, where over 30% of the customers who stopped made a purchase!

The conclusion she drew from this research is that too many options can cause people to choose nothing instead, and that a plethora of choices can actually be demotivating for customers rather than empowering.
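To see why that matters in absolute terms, here’s a rough back-of-the-envelope illustration. The foot-traffic split is hypothetical, and I’m assuming the purchase rates apply to the shoppers who stopped to sample:

```python
# Back-of-the-envelope: more samplers does not mean more buyers.
# The stop rates below are hypothetical; only the ~3% vs ~30% purchase
# rates come from the jam study as described above.
shoppers = 100

samplers_24_jams = shoppers * 0.60   # assume the bigger display draws more samplers
samplers_6_jams = shoppers * 0.40    # assume the smaller display draws fewer

buyers_24_jams = samplers_24_jams * 0.03   # ~3% of samplers go on to buy
buyers_6_jams = samplers_6_jams * 0.30     # ~30% of samplers go on to buy

print(buyers_24_jams, buyers_6_jams)       # 1.8 vs 12.0 buyers per 100 shoppers
```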

What This Has to do With Email Marketing

I mention this study and its conclusions because they apply directly to how I’ve improved click-through rates for 3 separate newsletters that I work with.

My first test began with a newsletter for a music site called Sophistefunk, where I was getting decent click-through rates but nothing to get excited about:

(Screenshot from AWeber, total list size is low 5-figures)

At the time, I was using a “blog broadcast” that sent out new posts in clumps of 5-6.

It was minimally styled and updated via my RSS feed; you can check out a screenshot of one such broadcast below:

I thought this would be ideal for my subscribers, but I made the same fundamental error revealed in Sheena Iyengar’s study: I assumed more content would create more engagement.

Another newsletter of mine called Sparring Mind didn’t have these problems, and I surmised that this was because I only sent out emails with one action to take.

To overcome this, I tested sending out emails with only one call-to-action for my Sophistefunk newsletter, and I saw click-through rates increase dramatically, along with overall engagement!

Here are the results of the two latest broadcasts following this format:

(AWeber screenshot is below)

As you can see, click-through rates increased dramatically, and with that came better open rates as my subscribers got used to the new mailing style. It wasn’t an overnight change, but I’ve seen far better engagement with my list now that I only send one thing at a time.
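If you want to sanity-check whether a lift like this is real rather than noise, a simple two-proportion z-test on the raw send and click counts is enough. The sketch below uses made-up numbers, not my actual AWeber stats:

```python
# Minimal two-proportion z-test for comparing the CTR of two broadcasts.
# CTR here means clicks / sends; all counts below are hypothetical.
from statistics import NormalDist

def ctr_lift_significance(sends_a, clicks_a, sends_b, clicks_b):
    """Return (ctr_a, ctr_b, z, two-sided p-value) for a CTR comparison."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = (p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Example: old multi-post broadcast vs. new single-CTA broadcast (placeholder numbers)
p_a, p_b, z, p_value = ctr_lift_significance(10000, 300, 10000, 450)
print(f"CTR A: {p_a:.1%}, CTR B: {p_b:.1%}, z = {z:.2f}, p = {p_value:.4f}")
```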

Subscribers aren’t so willing to pass over an email if they know there will be only one action to take. They trust you not to send them too much, whereas huge newsletter feeds get ignored the moment subscribers are even slightly busy (and who isn’t these days?).

It’s been discussed (many times before) how distractions can hurt conversions when it comes to web design, but from my testing, this appears to apply to many email newsletters as well: while fully featured emails (like Amazon’s) may work for some businesses, many content-centric newsletters can benefit from eliminating distractions.

Stylized vs. Plain Template

As I mentioned above, I had also changed my stylized HTML emails into plain-text emails, so you might be wondering whether that confounded my results.

It turns out that HTML styling likely has far less overall impact on click-through rates than you might think: we recently adjusted our email newsletter at Help Scout to include only a single blog post at a time, rather than the three we used to send.

Here is what our old newsletter looked like:

As mentioned, it included 3 different pieces of content, and although it was well designed and had some enticing article headlines, it wasn’t performing as well as it could from a content engagement standpoint (it still maintained a strong open rate).

In our most recent change, we simplified our content newsletter and cut out the other two pieces of content, including only the newest blog post:

Even though both versions were still stylized HTML emails, we saw nearly a double-digit percentage increase in click-through rates, and you can see below what an impact it had on the traffic from our email marketing efforts:

(Screenshot is from the first week of January; it looks like it will be our best month yet when all is said and done.)

This effect has managed to stay consistent over time. We haven’t had a significant drop in click-throughs for any recent newsletter, and we’re still maintaining the high open rates that we gained once we switched to this “single-serve” variety.

All in all, it was a pretty big change but resulted in huge improvements for all the newsletters I manage.

I will keep on testing in the future, but I definitely advise any marketer who manages their startup’s newsletter to try this streamlined format and see if they get similar results.

Your Turn

Do you have any other interesting newsletter tests that you’ve run?

Share them with us in the comments!

Gregory Ciotti

Gregory Ciotti is the marketing guy at Help Scout, the invisible email support software for startups and small-business owners. Get more from Greg on the Help Scout blog.


3 Comments
Chris Hexton
February 1, 2013

Nicely done Gregory, another example of ‘single call to action’ at its finest!

Elizabeth Yin
February 2, 2013

Love the idea…lots of questions about the analyses.

1) Are these split tests? Or just comparisons from one campaign to the next? I would hope you did true split tests on the SAME CONTENT given this is posted on the VWO site, but that doesn’t seem clear… Content differences are enough to add crazy variables to your results making it difficult to know if your changes were truly the cause of the new results.

2) While your CTR doubled, your open rate doubled too. What is your definition of CTR? Is it based on opens, or on the aggregate # of sends? Depending on your definition, this implies that if people open it, they aren’t really more likely to click based on the “single serving” layout. The real question is why the “single serving” layout increased your open rate. The possibilities include the following:

A. It didn’t. There was another change (subject line change, list collection, whatever) that increased the open rate. Then the click rate increased proportionally. (Both numbers doubled)

B. The “single serving” layout got the email newsletter fished out of spam. So, over time, opens increased (but clickiness didn’t actually increase). Open rate analysis by domains would help here.

C. People don’t display images so they couldn’t get counted in opens if they didn’t click. If this was true, then yes, clickiness really did increase. However, I don’t think this is possible because then you should see equal percentage increases in the open and click rates but that is not true. (+12 absolute percentage point click rate was not paired with +12 absolute percentage point open rate) (Aweber uses click rate defined as clickers over total sent, like MailChimp does.)
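To make the definition question concrete, the same (entirely hypothetical) campaign counts give very different “CTR” figures depending on the denominator:

```python
# Hypothetical counts for one broadcast -- the point is the denominator, not the values.
sends, opens, clicks = 10000, 2000, 300

ctr_of_sends = clicks / sends   # AWeber/MailChimp-style: clickers over total sent  -> 3.0%
ctr_of_opens = clicks / opens   # click-to-open rate: clickers over unique opens    -> 15.0%

print(f"{ctr_of_sends:.1%} of sends, {ctr_of_opens:.1%} of opens")
```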

Gregory Ciotti
February 2, 2013

@Chris — Thanks man! Glad you enjoyed it.

@Elizabeth — Good questions, let me address them.

1.) Split tests were conducted for *my* emails (the ones for Sophistefunk), but not for Help Scout.

2.) This is a great question, so let me break down how this went with Sophistefunk…

Since it’s a daily-updated blog, the newsletter I send out has the EXACT same subject line each and every time, “New music on Sophistefunk!”

That said, the subject line couldn’t be the culprit for the open-rate increase.

The argument for images or possibly the spam filter might be true, but considering the gradual improvement I saw over time (and the really high rates I’ve seen for Sparring Mind), my hypothesis was that these “single serving” broadcasts became more welcome in people’s inboxes.

Running an analysis of open rates by domain is definitely something I have lined up next; for now I’m just going to enjoy the performance improvement. ;)
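For what it’s worth, here’s the rough shape of the domain analysis I have in mind. It’s just a sketch, assuming I can export per-recipient open data to a CSV with “email” and “opened” columns (those column names are placeholders, not an actual AWeber export format):

```python
# Hypothetical sketch of an "open rate by domain" breakdown from a CSV export
# of per-recipient data with columns: email, opened (1 = opened, 0 = not).
import pandas as pd

df = pd.read_csv("broadcast_recipients.csv")
df["domain"] = df["email"].str.split("@").str[-1].str.lower()

open_rate_by_domain = (
    df.groupby("domain")["opened"]
      .agg(recipients="count", open_rate="mean")
      .sort_values("recipients", ascending=False)
)

# A sharp drop at one big provider (e.g. a major webmail domain) relative to the
# others could hint at spam filtering rather than a genuine engagement change.
print(open_rate_by_domain.head(10))
```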
