
How AMD used A/B testing to achieve 3600% increase in social sharing

Posted in A/B Split Testing, Case Studies on July 2nd, 2012

This case study was released by Advanced Micro Devices (AMD), one of our customers. They sent the following to us and we’re republishing it on our blog.

A/B Tests Identify Improvements Resulting in up to a 36x Increase in Social Sharing

Imagine browsing the AMD website when you stumble upon the latest and greatest in AMD news. You find this so interesting that you want to “ShareThis” with your friends. You’re not alone: thousands of others want to share content from the site as well. The site makes it easy to share content to other sites via a ShareThis icon/link embedded into its pages. However, after looking at site metrics and how information was being shared to different sites, the AMD Online Marketing team began to question whether the appearance of the ShareThis icon/link and its existing placement (icons located in the footer of pages) were optimal for users.

To better understand how users interact with these icons and the best possible combination of position and appearance of the icons, AMD decided to use A/B testing. In all, the AMD Online Marketing team created and tested six variations, with different icons and placement, including the control version.

These combinations tested various positions (left, right, bottom) and appearances (icon/link, large chicklets, small chicklets) on one of the more prominently visited and shared sections of the website, Support & Drivers.

The testing was implemented and controlled by Visual Website Optimizer (VWO). All six variations were deployed on the entire subdomain and ran for five days. The test was served to a small percentage of visitors for the first two days, increased to 35% for the next two days, and then to 100% on the last day. The last day also coincided with the launch of a new software driver, which typically generates more traffic.

Once a visitor saw a particular variation, they continued to see the same variation to ensure a consistent user experience. The findings proved to be statistically significant, demonstrating up to a 36x increase in social sharing on the tested site as compared to the original site configuration.

Based on this testing, the AMD Online Marketing team has recommended using the left-position chicklet version with dynamic adjustment based on browser window size.

The final result of the testing was in line with expectations, but the wide range of results across variations was unexpected, providing strong support not only for modifying AMD Online Marketing’s ShareThis implementation, but also for continued A/B testing in the future.

Visual Website Optimizer has proven to be a great asset to the AMD Online Marketing team by providing a scalable and user-friendly solution that helps to clearly demonstrate user interaction and behavior. The ability to run optimization tests on a regular basis with minimal effort is critical to AMD’s global online presence and in providing an optimal user experience for visitors.


Challenges:

  • The ShareThis icon/link was located in the footer of all pages
  • The ShareThis icon/link needed to be more visible and widely used to increase social media sharing


What was tested:

Multiple variations of the ShareThis icon/link, covering position (left, right, bottom) and appearance (icon/link, large chicklets, small chicklets)


Results:

  • The bottom chicklet and link variations proved less productive
  • The variations did NOT impact overall engagement on the pages (average engagement conversion was over 23%)
  • The tested site showed up to a 36x increase in social sharing compared to the original site configuration

Siddharth Deswal

I do marketing at Visual Website Optimizer.


Bianca Chesimard
July 2, 2012

Can we see some of the other placements on their site. Also, how did they measure the interactions separately?

Tony Mariotti
July 3, 2012

Great post! Definitely speaks to the need for testing EVERYTHING…even the small stuff like share buttons. Which apparently have a HUGE impact. :-)

Ahmad Rahman
July 3, 2012

Bianca, we used VWO which allowed us to measure each interaction separately (used custom goals).

The placements were the buttons on the left and right, the same buttons with a promo icon at the bottom, again on the left and right, small icons in the footer, and a text link in the footer.


Tom Andrews
July 6, 2012

Very interesting, thanks for sharing.

Personally, I find split testing more useful for larger sites like the one mentioned above, since the results will carry more statistical relevance.


July 10, 2012

Perhaps a nice feature you could add to your blog posts? Scrolling up after reading an article is overrated!

July 10, 2012

@Matthew: do you mean we should add a feature like ‘Scroll to Top’ on the blog? Can you please elaborate? Thanks.

July 10, 2012

Writes post about A/B test, only posts the winning screenshot. Fail. How about the other placements?!?

July 11, 2012

This case study is not very helpful, and I think the only reason it was posted is to show that a big name like AMD is using VWO.

I would have done the same if I were you guys and posted this case study, so I’m not blaming you. But this study is almost as useless as me running a study saying “we found an infinite improvement after introducing sharing features to a page that did not previously have sharing”.

My point is that moving something from the footer to a prominent location will of course bring huge results. I don’t need VWO for that. I like the other posted case studies because they highlight non-obvious changes that might improve conversions (and can be tested via VWO). But what is there to learn from this case study? Nothing. It’s pure marketing that AMD uses VWO, with no other benefit to the reader.

July 11, 2012

@George: we understand and appreciate your opinion. Different organizations are ready to share different amounts of information. Many large organizations don’t share their testing results at all, so we’re very grateful to AMD for sharing what they did. For the record, we reproduced the case study exactly as we received it from AMD and made absolutely no changes to it. So if AMD likes VWO, it is because they have found it helpful.

Your point that “something from the footer to a prominent location will of course bring huge results” is not entirely valid. It could just as easily have turned out that people found the widget intrusive and that other site metrics (like engagement or pageviews) went down. Moreover, even if we could have guessed that sharing would increase, we could never have known the exact magnitude of the increase.

July 11, 2012

Great post, and definitely something to try on my sites. @wingify – I could be wrong, but I think Matthew is suggesting that you add sharing icons to the bottom of VWO’s blog posts. For example, if I want to share this great blog post, I have to think of the idea on my own and then scroll to the top of this page to find the buttons. I think Matthew is onto something. Most people don’t know an article is worth sharing until after they have read it, so having the share buttons at the bottom could have better results. Sounds like another good A/B test!

July 11, 2012

Hey VWO, don’t listen to George, any case study is valuable, and especially one that shows results as large as those.

First off, I agree with George that moving it to a more prominent location will certainly increase conversions. But the power of the case study is being able to see HOW MUCH the sharing increased.

While every website is different, if I were to change the layout of my own site’s sharing, I could now estimate how much sharing might change (up to 36x) and how much it would affect my pageviews (not at all). Having educated estimates of how that could change things for my website lets me determine whether it is worthwhile to perform a split test like this.

How often do I make content? Once a week? How frequently does my current content get shared? 15 times per new article? How many visitors does that lead to? 2 per share? Of those who come to my site through Facebook or Twitter shares, how many sign up for my RSS feed or email list or request a quote: 1%, 10%, 20%? How much is each quote or each email list subscriber worth to me: $1, $5, $20?

If every email subscriber was worth a dollar to me, and I was getting only 1% of my social traffic to convert, and I was only getting 15 social visitors per article, and I only write 1 article a week, then yeah, this test might not be a priority.

HOWEVER, if I knew that each signup was worth $5, and that my social traffic typically converts at 20%, and I get on average 15 shares an article, then this information is powerful.

In that case, were I to implement a similar test and receive similar results (x36), my sharing would go from 15 shares per article to 540 shares. All other things remaining constant, that would be 1080 social visitors, worth a dollar a pop (20% * $5). If that were the case, mixing this test with a more frequent content push (moving from 1 to 3 articles a week) could be HUGE for a business.
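The back-of-the-envelope arithmetic above can be written out as a quick script. Note that all the input figures are the hypothetical ones from this comment (shares per article, visitors per share, conversion rate, signup value), not measured data:

```python
# Estimate the dollar value of a 36x sharing lift, using the
# hypothetical figures from the comment above.

shares_per_article = 15     # current shares per new article
lift = 36                   # multiplier observed in the AMD test
visitors_per_share = 2      # social visitors each share brings in
conversion_rate = 0.20      # fraction of social visitors who sign up
value_per_signup = 5.00     # dollars each signup is worth

new_shares = shares_per_article * lift              # 15 -> 540 shares
social_visitors = new_shares * visitors_per_share   # 540 -> 1080 visitors
value = social_visitors * conversion_rate * value_per_signup

print(new_shares)       # 540
print(social_visitors)  # 1080
print(value)            # 1080.0 dollars per article
```

Plugging in your own site’s numbers is what turns a case-study headline like “36x” into a go/no-go decision on running the test.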

In conclusion, George, this case study gives us a hypothetical number to use for our estimates. Is that number perfect? No, but it does allow us to do a little math, combine it with our own metrics, and make strategic decisions based on what we learn.

Thanks VWO.

July 11, 2012

I was not saying that VWO was not useful to AMD or that VWO is not a tool that can measure changes in conversion.

The point was that there are no actionable insights that can be taken from this case study. I guess it’s a matter of target audience. In my opinion, I bet most people reading this blog are aware that there are A/B testing tools and know that the tools can measure improvements in changes. What they (and I) come to this blog to read are little tricks and tips and insights for things that we could change on our sites that we hadn’t really thought of or fully considered that could improve conversions.

Moving something from an extremely non-prominent location to a much more prominent one is an obvious part of any test, so I learned nothing new. That is my complaint.

I guess this case study still serves the purpose of teaching absolute newbies that you can use a tool like VWO to measure changes in pages. My hope is that we continue to see more case studies that target the audience group that I fall in.

Thanks for reading.

David Urmann
July 11, 2012

I agree this study is not as helpful as most; usually the VWO studies are quite helpful. As some other posters suggested, it would be great to see the other variations. Given that the original was in the footer, the 36x improvement is not as impressive as it first sounds.

For the record, I am still a big fan of VWO – you can get some great tips here, and even if just one of them pays off it can mean a lot for your business.



September 14, 2012

Thanks for sharing those insightful numbers from your test results!
It clearly shows what the right optimisation can do.

Chris Marsh
September 16, 2012

Not a bad case study and insight. Quite odd though to report on this and then leave your own share icons in the footer!?

Siddharth Deswal
September 17, 2012

Hi Chris,

Initially, we had our share buttons only on the top of the posts and a few people complained that when they want to share after reading, they have to scroll up.

We asked around a little and found out that most people like to share only after they’ve gone through the entire post, which is why we added another set of buttons at the end.

June 12, 2013

Good results… I just can’t understand how this would not correlate with bounce rate or average engagement… It must have had some effect…

Brian McKenzie
August 10, 2013

“We asked around a little and found out that most people like to share only after they’ve gone through the entire post, which is why we added another set of buttons at the end.” I find it a bit counterintuitive that A/B testers would make a decision based on a survey rather than a test/experiment. :)
January 31, 2014

I too wish that the other variations were shown more clearly.

But, even the smallest insights are great food for thought.

Correct me if I’m wrong… but isn’t this test showing that they didn’t display share counts on the page – just small share buttons?

I’m wondering if VWO should move to this as well, as opposed to showing the number of shares?

Thanks for posting this as always.

