Why Your A/B Tests Fail - August 28, 2014
It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that agencies get into the actual A/B testing process.
Now, you might not believe it, but there’s a fairly common issue with A/B testing tools: a “blink” (a brief flash of the original page) can occur while the tool decides which variation to show the visitor, and it can really skew test results.
This is mostly attributable to slow site speed or poor test setup, but it’s an important thing to consider: we have seen these blinks lower conversions by up to 18%.
Much of this has to do with how the tool serves the variation (is it applied client side or server side?). Certain tools are also vulnerable to letting competitors peek at your experiments using this script. Some A/B testing tools that I’m confident don’t have this blink problem are SiteSpect, Google Analytics Content Experiments, and Convert.com (ours).
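To make the mechanism concrete, here is a minimal sketch of the kind of “anti-flicker” logic client-side tools use to avoid the blink: hide the page before the tool runs, then reveal it once the variation is applied, with a timeout so the page is never stuck hidden if the tool fails to load. This is an illustrative assumption, not any vendor’s official snippet; the function name `createAntiFlicker` and the timeout value are hypothetical.

```javascript
// Hypothetical anti-flicker sketch (not any specific tool's snippet).
// Hide the page while the client-side testing tool decides which
// variation to serve, then reveal it when the tool is done.
function createAntiFlicker(doc, timeoutMs) {
  // Inject a style rule that hides the whole page.
  var style = doc.createElement('style');
  style.id = 'anti-flicker';
  style.textContent = 'body { opacity: 0 !important; }';
  doc.head.appendChild(style);

  var revealed = false;
  function reveal() {
    if (revealed) return; // idempotent: safe to call more than once
    revealed = true;
    var s = doc.getElementById('anti-flicker');
    if (s) s.parentNode.removeChild(s);
  }

  // Fail open: never keep the page hidden longer than the timeout,
  // even if the testing tool never loads or never calls back.
  setTimeout(reveal, timeoutMs);

  // The tool's "variation applied" callback should invoke this.
  return reveal;
}
```

The design trade-off is visible here: a longer timeout means less risk of a blink but a longer blank page on slow connections, which is why slow site speed makes the problem worse either way.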