Why testing with Google Adwords yields flawed results

June 15, 2014

Almost since the Adwords program went online, savvy marketers have realized it had a hidden power: direct marketing tests could easily be conducted and superior results achieved. In fact, Google has embraced this practice, encouraging advertisers to test variations of their text ad creatives. With millions of searches per hour, it doesn’t take long to accumulate enough traffic to reach high-confidence conclusions.

Using Adwords as a “poor man’s” testing tool for landing pages has also been advocated by people who think they know more than they really do. Peruse the marketing forums and you’ll often see this discussed, along with haphazard guesses at how much activity constitutes a statistically valid result.

When you use Google to split traffic between two or more landing pages, you are inadvertently adding a HUGE random noise factor. This WILL screw up your analysis and could cost you MASSIVE amounts of lost sales and advertising dollars. DON’T DO IT.

Here is how it goes. Google counts a view every time someone clicks on your ad, even if the same prospect clicks through four times over the next ten days before buying (and returning through the same paid link is common). When that happens, guess what it does to your apparent conversion rate: it goes down. That buyer registers only a 25% conversion rate (CR), one sale against four counted views. The next prospect buys on the first visit, but through one of your other landing pages, and scores a 100% CR for that customer. Let this happen a few times and you’ll start believing that other creative is better. Bad idea. Who KNOWS if it really is?
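The distortion described above is easy to see in a toy calculation. This is a minimal sketch with a hypothetical visit log (the visitor names and data are invented for illustration), comparing a click-based conversion rate against a per-visitor one:

```python
# Each tuple: (visitor_id, converted_on_this_visit)
# "alice" clicks the ad 4 times before buying; "bob" buys on his first click.
visits = [
    ("alice", False),
    ("alice", False),
    ("alice", False),
    ("alice", True),
    ("bob", True),
]

conversions = sum(1 for _, bought in visits if bought)

# Naive metric: every ad click counts as a view
naive_cr = conversions / len(visits)      # 2 sales / 5 views = 40%

# Per-visitor metric: count each prospect once
visitors = {v for v, _ in visits}
true_cr = conversions / len(visitors)     # 2 sales / 2 prospects = 100%

print(f"naive CR: {naive_cr:.0%}, per-visitor CR: {true_cr:.0%}")
```

Both prospects bought, yet the click-based metric reports 40%. Repeat clicks act as noise that dilutes whichever variant happens to attract more return visits.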

There is no way to be sure, and no amount of volume will provide enough confidence to make this test “trustworthy.” The only way to tell for sure is to count the view once: the first time a prospect is exposed to your offer. The vast majority of A/B testing and conversion optimization tracking tools out there are flawed, counting all views against all conversions. So be careful. Testing incorrectly is worse than not testing at all, because it takes just as much work and yields data that will never verify in the bank account.
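The “count the view once, on first exposure” rule above can be sketched as first-exposure attribution: each visitor is assigned to the variant they saw first and counted exactly once, no matter how many times they return. This is a hypothetical sketch (the log and visitor names are invented), not the logic of any particular tracking tool:

```python
from collections import defaultdict

# Hypothetical log: (visitor_id, variant_seen, converted_on_this_visit)
log = [
    ("alice", "A", False),
    ("alice", "A", False),
    ("alice", "B", True),   # later visit landed on B, but A keeps the credit
    ("bob",   "B", True),
]

first_variant = {}
converted = set()
for visitor, variant, bought in log:
    first_variant.setdefault(visitor, variant)  # lock in first exposure only
    if bought:
        converted.add(visitor)

# variant -> [unique visitors, conversions]
stats = defaultdict(lambda: [0, 0])
for visitor, variant in first_variant.items():
    stats[variant][0] += 1
    if visitor in converted:
        stats[variant][1] += 1

for variant, (n, c) in sorted(stats.items()):
    print(f"variant {variant}: {c}/{n} converted")
```

Because each prospect contributes exactly one view and at most one conversion, repeat clicks through the same paid link can no longer dilute one variant’s numbers relative to another’s.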


Written by Dennis van der Heijden

Co-founder and CEO of Convert.com, passionate about building communities that care, and trying to make that happen both inside and outside Convert. I love working with my team to make our A/B testing software better for agencies and e-commerce clients.

