Why Your A/B Tests Fail

August 28, 2014

Today I wrote a post on ConversionXL about what we learned from A/B tests run in our A/B testing suite, Convert Experiments. Here’s a short quote from that article:

“It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that the agencies get into the actual A/B testing process.”

Now, you might not believe it, but there’s a fairly common issue with A/B testing tools: a “blink” occurs while the tool decides which variation to show the visitor, and it can really skew test results.

This is mostly attributed to slow site speed or a poor test setup, but it’s an important thing to consider. We have seen these blinks lower conversions by up to 18%.
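
To make the mechanics concrete, here is a minimal sketch of the kind of anti-flicker snippet client-side tools rely on: the page is hidden until the testing script has applied the variation, then revealed, with a timeout as a safety net. This is an illustration in TypeScript, not Convert’s actual snippet; the class name and timeout are placeholder values.

    // Hypothetical anti-flicker sketch (not Convert's actual snippet).
    // Hide the page until the testing script has swapped in the variation,
    // then reveal it; a timeout guarantees the page never stays blank.
    const HIDE_CLASS = 'ab-hide'; // assumes CSS: .ab-hide { opacity: 0; }
    const TIMEOUT_MS = 1000;      // assumed upper bound on how long to hide

    document.documentElement.classList.add(HIDE_CLASS);

    function reveal(): void {
      document.documentElement.classList.remove(HIDE_CLASS);
    }

    // The testing tool would call reveal() once the variation is applied;
    // the timer below is the fallback if the script is slow or blocked.
    window.setTimeout(reveal, TIMEOUT_MS);

The trade-off is visible here: the snippet replaces the blink with a short delay, which is why slow site speed makes either symptom worse.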

Much of this has to do with how the tool serves the variation (is it served client side or server side?). Certain tools are also vulnerable, allowing competitors to peek at your experiments using this script. Some A/B testing tools that I’m sure don’t have this blink problem are Sitespect, Google Analytics Content Experiments, and Convert.com (ours).
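
For contrast, a server-side setup decides the variation before any HTML is sent, so there is nothing to swap in the browser and no blink at all. The sketch below shows one common approach, deterministic bucketing by hashing a visitor ID; the function and experiment names are illustrative, not any vendor’s API.

    import { createHash } from 'crypto';

    // Illustrative server-side bucketing (not any vendor's actual API).
    // The variation is chosen before the page is rendered, so the first
    // response already contains the right HTML and nothing changes on screen.
    function assignVariation(
      visitorId: string,
      experimentId: string,
      variations: string[],
    ): string {
      const digest = createHash('md5')
        .update(`${experimentId}:${visitorId}`)
        .digest();
      // Map the first four hash bytes to a stable bucket index.
      const bucket = digest.readUInt32BE(0) % variations.length;
      return variations[bucket];
    }

    // The same visitor always gets the same variation on every request.
    console.log(assignVariation('visitor-123', 'headline-test', ['control', 'variant-a']));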

Enjoy the full article on ConversionXL here.



Written by Dennis van der Heijden

Co-founder and CEO of Convert.com, passionate about building communities that care, and trying to make that happen both inside and outside Convert. I love working with my team to make our A/B testing software better for agencies and e-commerce clients.
