Why testing with Google AdWords yields flawed results


Almost since the AdWords program went online, savvy marketers realized it had a hidden power: direct marketing tests could easily be conducted and superior results achieved. In fact, it seems as though Google has embraced this practice, encouraging advertisers to test variations of their text ad creatives. With millions of searches per hour, it doesn’t take long to accumulate enough traffic volume to reach high-confidence conclusions.

Using AdWords as a “poor man’s” testing tool for landing pages has also been advocated by people who think they know more than they really do. In fact, if you peruse the marketing forums, you’ll often see this discussed, along with haphazard guesses at how much traffic denotes a statistically valid result.
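For the record, “statistically valid” isn’t a guess; it’s a calculation. Here is a minimal Python sketch of a standard two-proportion z-test, the kind of check those forum posts hand-wave past. The visitor and conversion counts below are made up purely for illustration:

```python
# Minimal sketch of a proper significance check for two conversion rates,
# instead of guessing at what traffic volume is "enough".
# The visitor/conversion counts below are illustrative, not real data.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2500)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```

Of course, even a textbook-correct test like this is worthless if the views and conversions feeding it are counted wrong, which is exactly the problem described next.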

When you use Google to split traffic between two or more landing pages, you are inadvertently adding a HUGE random noise factor. This WILL screw up your analysis and could cost you MASSIVE amounts in lost sales and wasted advertising dollars. DON’T DO IT.

Basically, it goes like this: Google counts a view every time someone clicks on your ad, even if the same person comes back four times over the next ten days before buying (and coming back through the same paid link is common). Guess what those repeat views do to your apparent conversion rate: they drag it down. That buyer who clicked four times and converted once shows up as a 25% conversion rate (CR). The next buyer purchases on the first visit, but through one of your other landing pages, so that page just scored a 100% CR for that customer. Let this happen a few times and you’ll start thinking the other creative is better. Bad idea. Who KNOWS if it really is?
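To see the distortion in miniature, here is a toy Python sketch with a hypothetical click log that replays exactly the scenario above: one buyer who clicks the same paid link four times, and one who buys on the first click through a different page:

```python
# A toy illustration of the distortion described above (hypothetical click log).
# Each tuple is (visitor_id, landing_page, converted_on_this_visit).
clicks = [
    ("alice", "A", False), ("alice", "A", False),
    ("alice", "A", False), ("alice", "A", True),   # buys on her 4th paid click
    ("bob",   "B", True),                          # buys on his 1st paid click
]

def per_click_cr(clicks, page):
    """Conversion rate the naive way: conversions divided by ALL clicks."""
    views = [c for c in clicks if c[1] == page]
    return sum(c[2] for c in views) / len(views)

print(per_click_cr(clicks, "A"))  # 0.25 -- page A looks like a 25% CR
print(per_click_cr(clicks, "B"))  # 1.00 -- page B looks like a 100% CR
# Both pages actually converted 1 out of 1 buyers; the repeat clicks
# are pure noise inflating page A's denominator.
```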

There is no way to be sure, and no amount of volume is going to provide enough confidence to make this test “trustworthy.” The only way to tell for sure is to count the view once: the first time a prospect is exposed to your offer. The vast majority of A/B testing software and conversion optimization tracking tools (https://www.convert.com) out there are flawed, counting all views against all conversions. So be careful. Testing incorrectly is worse than not testing at all, because it takes just as much work and yields data that will never verify in the bank account.
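For illustration, here is a minimal sketch of what “count the view once” tracking looks like, using the same hypothetical click log as above: each visitor is attributed to the landing page from their first exposure, and repeat paid clicks are ignored:

```python
# Sketch of "count the view once" tracking: attribute each visitor to the
# landing page from their FIRST exposure and ignore repeat paid clicks.
# Same hypothetical click log as above: (visitor_id, landing_page, converted).
clicks = [
    ("alice", "A", False), ("alice", "A", False),
    ("alice", "A", False), ("alice", "A", True),
    ("bob",   "B", True),
]

def first_exposure_cr(clicks):
    """Per-page conversion rate over unique visitors, not raw clicks."""
    first_page = {}    # visitor_id -> page seen on the first click
    converted = set()  # visitors who ever converted
    for visitor, page, bought in clicks:  # assumes clicks are in time order
        first_page.setdefault(visitor, page)
        if bought:
            converted.add(visitor)
    rates = {}
    for page in set(first_page.values()):
        cohort = [v for v, p in first_page.items() if p == page]
        rates[page] = sum(v in converted for v in cohort) / len(cohort)
    return rates

print(first_exposure_cr(clicks))  # {'A': 1.0, 'B': 1.0}: both pages went 1-for-1
```

Deduplicated this way, the two pages look identical, which is the honest answer the raw click counts were hiding.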

Originally published June 15, 2014 - Updated April 15, 2019
Dennis van der Heijden
Co-founder and CEO of Convert.com, Dennis is a passionate community builder and out-of-the-box thinker. He spends his time innovating to make Convert Experiences better. Learn about his journey as an entrepreneur and leader on the SaaS Club podcast.