Seven Out Of Ten of All A/B Tests Fail


You did not hear? Seven out of ten of all A/B tests fail. If that is news to you, then you are not working for Dell or PayPal, since their best optimizers were in the audience when I spoke at the Conversion Conference in Ft. Lauderdale last October. We ran a query on our data, and 70% of the A/B tests performed by individuals without agencies don't lead to any increase in conversion. Is that a failed A/B test? Not if you learned something from it. But imagine all the money you lose when you have no idea what you are doing.

We created the Convert Academy around this idea and now have almost three dozen training videos from amazing experts like Brian Massey, Tim Ash and Chris Goward.

The dozens of webinars will help you become an expert in conversion optimization. There is a big difference between the experts and agencies on one hand and people that set up A/B tests less frequently on the other. The difference is actually 235%.

235 reasons to hire a marketing agency for your testing

Analysing 700 experiments from 2012, we found that 33% of our clients that used a marketing agency to set up A/B tests saw an increase in conversion rates (roughly 7 out of 10 tests brought no conversion improvement). For clients that did not use a marketing agency, this was only 14% (roughly 6 out of 7 tests brought no improvement). In short, using a marketing agency for A/B testing gives you about 2.35 times the chance of a conversion increase versus trying it yourself.
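The arithmetic behind that 2.35x figure can be checked with a few lines (a minimal sketch using only the 33% and 14% success rates reported above):

```python
# Success rates from the 2012 analysis of ~700 experiments
agency_success = 0.33   # clients using a marketing agency
diy_success = 0.14      # clients setting up tests themselves

# Relative chance of seeing a conversion increase with an agency
ratio = agency_success / diy_success
print(f"{ratio:.2f}x the chance of a lift")      # ≈ 2.36x

# Share of tests that bring no improvement in each group
print(f"agency: {1 - agency_success:.0%} of tests show no lift")  # 67%
print(f"DIY:    {1 - diy_success:.0%} of tests show no lift")     # 86%
```

That 67% of agency-run tests showing no lift is where the "seven out of ten" headline figure comes from.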

The wins can be significant. We all like to see the triple-digit improvements from case studies, but in practice 14% of all experiments in Convert get an increase of 10% or more, while 35% get a 2%-10% lift in conversions.

A/B Testing Is Not For Kids

Why not winning is not failing

I admire the work of Nazli Yuzak of Dell, and she is always very open about the experiments they ran that did not bring an improvement in conversion. These "failed" experiments produced learnings inside Dell: mistakes that can now be prevented in other parts of the organisation. When Nazli gave a talk at the Conversion Conference last year with Eric Brandt, she also said the following:

“There within lies the reasons why many tests fail; an incorrect initial hypothesis. From numerous tests, we’ve found that the hypothesis creation has a major impact on the way a test is run, what is tested, how long a test runs and just as important, who’s being tested?”

Garbage in Garbage Out!

“When you don’t apply a discipline of looking at your testing efforts strategically, identify specific goals that they will serve towards (improving user experience, increase form completion, reduce exit rates, etc.) and think about the expected learning and actions outcomes prior to the test, you would be increasing your chances for failure. Unfortunately, GIGO really applies here – Garbage in Garbage Out!”

Take A/B testing seriously: just because we made it easier to test does not mean you should stop thinking about what to test, why to test and who to test. Convert Experiments gives you some of the best segmentation and targeting options among A/B testing tools in the industry, so try it out for free.

The next blog will be about another serious problem Convert Experiments solved, "A/B test blinking". We have a patent we would like to share with you, and we want to go deeper into the DNS-based vs. JavaScript-tagged A/B testing tools out there. If you would like to read a bit of history on blinking in A/B software, read this old blog post.


Originally published August 30, 2013 - Updated April 15, 2019
Dennis van der Heijden
Co-founder and CEO of Convert, Dennis is a passionate community builder and out-of-the-box thinker. He spends his time innovating to make Convert Experiences better. Learn about his journey as an entrepreneur and leader on the SaaS Club podcast.


