It doesn’t get much better than A/B testing. It’s easy to do, it’s cost effective, and it’s a fun way to experiment with ideas. It almost seems too good to be true. In reality, it might be too good to be true, especially if you’re oversimplifying the process by omitting certain steps and neglecting considerations that make A/B testing so effective in the first place. Even the best testers fail sometimes, and when you examine the situation from an outside perspective, it’s easy to see why.
Here are the five most common mistakes businesses make when using A/B testing to learn about the preferences and needs of their target audiences.
1. Cutting your A/B Testing Short
Tests need to run long enough to collect a worthwhile sample size, and statistical confidence takes time to emerge. Don't pull the plug on a test before the winner is clear. Confidence of 75% or even 85% may look convincing, but you'd be surprised how quickly the numbers can swing; the widely used standard is 95%. While A/B testing is in many ways like a science experiment, the variables are harder to control. Aim for a sample size of at least 1,000 visitors per variation, and make sure the conversion lift is large enough to be meaningful before declaring a winner.
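As a back-of-the-envelope illustration of why you shouldn't stop early, here is a two-proportion z-test sketched in plain Python. The visitor and conversion numbers are made up for the example; the point is that an apparent lift can still fail to clear the 95% bar:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate reliably
    different from A's? Returns the z-score and an approximate
    two-sided p-value (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 1,000 visitors per variant
z, p = ab_significance(conv_a=50, n_a=1000, conv_b=68, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
# The lift counts as significant at 95% confidence only if p < 0.05
```

With these particular numbers, B converts at 6.8% against A's 5.0% — a lift that looks decisive on a dashboard — yet the p-value stays above 0.05, which is exactly the trap of calling a test too soon.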
2. The Test Is Too Vague
You’ll probably still hit something when you’re shooting in the dark, but that’s a terrible approach when the gun is loaded with company money. Everybody wants to know what boosts conversions. If you throw caution to the wind and try things just to try them, you not only risk wasting the entire test, you also learn nothing from it. Every A/B test should be designed around a specific hypothesis with something you can actually measure. There’s no room for abstract experimentation: even if a random change works, you won’t understand why.
3. Dismissing Failed Tests
You might try a test, find that it fails, and move straight on to the next thing. That’s the worst mistake you can make. A first test that fails is not as unusual as you may believe. After a failed test, it’s time to examine the data: strengthen your hypothesis and adjust your concept to play to its strengths. You may need to run three tests, or five, or even ten. Uncontrolled variables creep in, and people’s attitudes shift from day to day. You’ll never find the consistent middle ground without the full picture that only multiple test runs can provide.
4. Running the Whole Alphabet
There’s a reason it’s called A/B testing: there’s an option A, and an option B. A grand total of two options is what makes these tests so effective. If you run extra variants at the same time, you complicate the test and raise the risk of false positives. The margins get slimmer, and your test group is exposed to too much variation at once. Every simultaneous comparison you add chips away at your overall confidence and jeopardizes your results. If a simple A/B test won’t get you what you need, use a properly designed multivariate test instead of piling variants onto an A/B test.
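The false-positive risk compounds faster than intuition suggests. A quick calculation makes the point: if each comparison runs at 95% confidence, the chance that at least one of them fires by pure luck grows with every variant you add.

```python
# Each comparison at 95% confidence still carries a 5% false-positive rate.
# Running several comparisons at once multiplies the chances of a fluke.
alpha = 0.05  # per-comparison false-positive rate

for comparisons in (1, 2, 5, 10):
    # Probability that at least one comparison is a false positive
    family_error = 1 - (1 - alpha) ** comparisons
    print(f"{comparisons} comparison(s): {family_error:.0%} chance of a false positive")
```

One comparison keeps you at 5%, but ten simultaneous comparisons push the odds of at least one fluke "winner" to roughly 40% — which is why adding variants to an A/B test quietly destroys its reliability.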
5. Calling It Quits
You can and should be testing at all times. It’s great if you’ve found your conversions have improved and your test was successful. What can you do to make your tests even more successful? Analyze what worked with your successful test. Design new hypotheses around your successes. Test with small variations, and keep following your gains down the rabbit hole. Don’t just celebrate your successes – build on them.
In the end, it doesn’t really matter how great your testing tool is: a tool can’t think for you. Being analytical and innovative is your job, not the job of even the best tool in the universe.
To use A/B testing to your advantage, make sure to avoid these common mistakes. Otherwise you risk losing time and money on tests that don’t bring any valuable insights.
Set your standards high, and strive to get to the bottom of the matter with neatly-structured and well-executed A/B tests. Needless to say, designing a good A/B testing environment is definitely worth your time.
- 9 Aug, 2016
- Posted by Tess Pajaron