Dirty Details on Google Website Optimizer’s New Big Brother
Just over three weeks ago, Google announced the retirement of their free A/B and multivariate testing tool, Google Website Optimizer (GWO). While sitting in my hotel room in Chicago, right after the speakers' drinks and the evening before the Conversion Conference starts, I cannot help but wonder what will happen to our conversion industry thanks to Google.
On August 1, 2012, Google will stop all support for Google Website Optimizer. You will not be able to log in, all historical tests will be lost, and all currently active tests will be stopped on that day. But not all is lost; Google announced that a limited set of features from GWO will be available in Google Analytics under the section Content – Experiments. I will dive into the differences between the old and new products and give you a plan of action for saving your test data.
Why did Google merge the A/B testing tool into Analytics?
You might have noticed that several Google projects have been killed and others have been merged into existing key Google products, such as One Pass and Patent Search. Recently, Google announced that the Google Browser Lab product was merged into Google Analytics. This latest move is in line with simplifying the portfolio.
The incredible value of testing is known; therefore, during summer house-cleaning, Google Website Optimizer was not killed but was cleaned up, got a new UX, and was moved to Google Analytics. I heard from Justin Cutroni, Google's Analytics Advocate, that the product was developed from the ground up with 100% new code and was made to work well with Google Analytics. They were aware that the old GWO was difficult to set up, and this change will enable more people to test. I agree 100% with Justin on this. Because the core of the new testing tool is now Google Analytics, a large number of benefits come with the integration, but some features have been eliminated.
The Bad: What Google Website Optimizer features will be killed?
A more subtle change that you hardly hear about is the decision to calculate the conversion rate and significance in Analytics on visits, not visitors, as GWO did. This change was based on a technical limitation of Google Analytics (GA) and will mean that test results may be inaccurate if one specific visitor buys multiple times (multiple transactions). Yehoshua Coren (The Analytics Ninja) reminded me of the words that Google uses to explain the difference:
Google Analytics and Website Optimizer have different aims, and as a result, the tools count visitors — and therefore conversions or goals — differently.
A visitor can reach a Google Analytics goal each time the visitor goes to your website. In other words, a single unique visitor who returns to your website over multiple visits and reaches a goal page will record multiple goal conversions in Google Analytics.
Similarly, Google Analytics allows multiple transactions per visit. For example, if a visitor on your website reaches your checkout page multiple times in a given visit, each transaction will be counted.
By contrast, one visitor to your website can only reach a Website Optimizer conversion once per experiment (and only if they get there from your test page). For example, a single unique visitor who returns to the test page on your website over multiple visits and then navigates to your experiment’s conversion page can only record one goal conversion in your experiment. This prevents visitors already familiar with your website from introducing bias in your experiment results.
(Please note: Visitors are unique for each of your experiments. In other words, visitors included in previous experiment results will be counted again in your new experiments.)
Source: Google GWO Support
Yehoshua also reminded me why the old GWO method of measuring conversion by visitors might be better than the way Google Analytics measures conversion, where visit sessions time out after 30 minutes of browser inactivity or a closed browser. He commented on this difference: “My take on it is that measuring the impact on conversion is really a pan-session question. Avinash discusses this as well in Web Analytics 2.0. Basically, if a visitor comes to a site, a test can definitely have an impact on the person but the person might not convert that same visit. It is quite common for users to visit a site multiple times before converting. By measuring conversion rate by visitor, the stats more closely model true user behavior. One common goal of CRO is to determine: “will it make Dennis more likely to convert?” Dennis is a person and therefore unique visitors is the metric that most closely models most CRO scenarios.” Therefore, the Google Analytics model of measuring conversion and the success of your test will not incorporate the effect of your test on a future visit (remember, a visit in Google Analytics only lasts 30 minutes). Thus, this small change, which does not get a lot of attention, should not be overlooked.
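To see the practical difference, here is a minimal Python sketch that computes the same traffic both ways. The visitor IDs and numbers are made up for illustration; this is not data from any real test.

```python
# Illustrative sketch: how visit-based (GA) and visitor-based (old GWO)
# conversion rates diverge on the same traffic. Data is invented.
# Each tuple is (visitor_id, converted_in_this_visit).
visits = [
    ("A", False), ("A", False), ("A", True),   # visitor A converts on the 3rd visit
    ("B", True),                               # visitor B converts on the 1st visit
    ("C", False), ("C", False),                # visitor C never converts
]

# GA-style: conversions divided by visits (sessions)
visit_rate = sum(converted for _, converted in visits) / len(visits)

# GWO-style: converting visitors divided by unique visitors
# (a returning visitor counts only once)
converted_visitors = {vid for vid, converted in visits if converted}
unique_visitors = {vid for vid, _ in visits}
visitor_rate = len(converted_visitors) / len(unique_visitors)

print(f"per visit:   {visit_rate:.0%}")    # 2 conversions over 6 visits -> 33%
print(f"per visitor: {visitor_rate:.0%}")  # 2 of 3 visitors converted   -> 67%
```

The same behavior yields 33% per visit but 67% per visitor, which is why a tool that counts visits can understate the impact of a test on people who convert on a later visit.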
The option to automatically prune low-performing variations (stopping variations that did not contribute positively to the conversion rate) was also lost during the move to the new GA infrastructure. This means that you need to check your tests manually and turn off low-performing variations. Agencies face another change: clients have to grant GA access, and no MCC account access is available. This small change will not have much of an effect on daily testing because most agencies already have access to GA at the start of a conversion optimization and testing project.
Tests cannot run longer than three months; after three months, they are automatically stopped, and no test data are shown before the minimum duration of two weeks. Therefore, high-traffic websites that learned fast from tests that ran for a week have to accept this reduced speed of learning or search for an alternative testing solution. The maximum of 12 tests per account and five variations per test is also pushing fanatic testers to search for an alternative to the new version of Google Website Optimizer.
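As a rough illustration of why the two-week minimum hurts high-traffic sites, here is a back-of-the-envelope sample-size sketch using the standard normal-approximation formula for comparing two proportions. The baseline rate, lift, and traffic numbers are made up, and this is not how Google computes significance.

```python
# Back-of-the-envelope sketch (standard two-proportion sample-size formula):
# how quickly a high-traffic site can collect enough visitors per variation.
# All numbers below are invented examples.
from math import ceil

def sample_size_per_variation(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variation for a two-sided test
    at 95% confidence (z_alpha) and 80% power (z_beta)."""
    p_var = p_base * (1 + lift)
    p_bar = (p_base + p_var) / 2
    numerator = (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)
    return ceil(numerator / (p_var - p_base) ** 2)

# Example: 5% baseline conversion rate, hoping to detect a +20% relative lift
n = sample_size_per_variation(p_base=0.05, lift=0.20)
print(n, "visitors needed per variation")
```

A site sending 10,000 visitors a day into a two-variation test would clear this threshold in a couple of days, which is exactly why a forced two-week minimum feels slow to heavy testers.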
A few features that are currently not supported include copying a test and pausing a test. Users only have the option to stop a test, but because this is a big problem for all users, I expect a quick fix after August 1. These exclusions are unlikely to be technical limitations; most likely, they are features that did not make the August 1 launch.
In short, the following features are not available in the August 1 Google Analytics version:
- Multivariate testing
- Conversion rate calculation based on visitors (changed to visits)
- Pruning low performing variations
- Access to Google’s My Client Center MCC (agency)
- Testing longer than three months
- Testing more than five variations
- Having more than 12 active tests
- Having multiple goals
- Copying a test
- Pausing a test
Even though high-traffic sites and marketing experts will miss a lot of features, plenty of testing goodies remain. Let's look at everything that stayed and has been improved.
The Good: What features will stay and what has improved?
Split URL testing, or A/B testing based on pages made by the user, is still there and looks pretty slick. The automated screenshots are cool, and even if the code check is a bit buggy, the most important feature remains. This simple way of making alternative pages on your website can work really well, and REGEX and URL variables are both still available. The throttle option to select the percentage of traffic that you want to include in the test is still there and, with these two core features, the new A/B testing in Google Analytics has a solid base.
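As a hypothetical illustration of what REGEX targeting buys you, the sketch below matches every variant of a product URL regardless of query-string variables, so one pattern can cover a whole family of pages. The URLs and pattern are invented and not tied to any real site or to Google's matching engine.

```python
# Hypothetical sketch of regex-based URL targeting for a split URL test:
# one pattern covers a product page plus any query-string variables.
import re

# Match /product/<number>, with or without a trailing query string
pattern = re.compile(r"^/product/\d+(\?.*)?$")

urls = [
    "/product/42",
    "/product/42?utm_source=newsletter",
    "/checkout",
]
matches = [u for u in urls if pattern.match(u)]
print(matches)  # the two product URLs match; /checkout does not
```

This is the kind of flexibility that makes REGEX support worth keeping: without it, every query-string variant would need to be listed as a separate test page.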
URL rewriting for content reports is a new feature that popped up; it allows all variations of URLs to be rewritten for the content reports. This option has pros and cons, but that is why it's a checkbox. Another checkbox that I found was the notification email, which offers important notification options to GA users on, for example, completed tests. The one-tag solution is excellent. Before, users had a control script, a tracking script, and a conversion script; now, only one script needs to be installed on the original page being tested. Tracking and goals will be tracked using the Google Analytics script that you most likely have already set up. The integrated goals from Google Analytics are just great. Use any goal you set up in Google Analytics as a test goal and, although engagement goals and time-based goals are not yet supported, this is great progress from the old Google Website Optimizer tagging nightmare.
At first look, the reporting seems very similar to the old version. But don't miss that the segmentation option familiar to most Google Analytics users also made it to the testing tool. While you can only select one segment at a time, the reporting function is so powerful that many new insights will come from this new feature. The following is an overview of the new and old features that are now available in Google Analytics.
How do you get to all of the new goodies? First, review all of the features in Google Analytics' new split testing feature:
- Improved: Slick split URL testing for A/B testing
- Improved: Email notifications of tests
- New: One tag solutions for tests
- New: Google Analytics goal integration (one goal)
- New: Advanced Segment Report on test results
- Existing: URL variables and REGEX support
Can you migrate Google Website Optimizer to Google Analytics Split Testing?
Sorry, there is no migration option, but you can salvage your old tests by exporting them before August 1. After this date, all data will be deleted, and you will not be able to access your Google Website Optimizer account. The new features in Google Analytics are not activated for all users yet; the first group of users to benefit from them are existing Google Website Optimizer users who use the same email for GWO as for Google Analytics. If you do not yet see the Content Experiments option, it will be rolled out over the next week. On August 1, everyone should have this new GA feature.
Impact on the conversion optimization industry
This move will be a hot topic here at the Conversion Conference in Chicago this week, as I discussed with the Optimizely team. With 60,000 active users of Google Website Optimizer and 12 million websites that use Google Analytics, making simple split testing accessible to an audience that is 200 times larger is a great step. Because this move will have an amazing effect on the conversion rate optimization industry as a whole, perhaps I can follow Bryan Eisenberg's yearly announcement with my prediction: “2013 will be the year of conversion optimization.” I noticed that since Google's announcement, around three times more people have asked for a demo of our Convert Experiments, and the increase in trial users has also been amazing.
If you refuse to give in to the fact that multivariate testing is not supported, testing large groups of pages is next to impossible, and revenue tracking is nowhere to be found, check the following three posts on ConversionWorks, where Doug Hall presents the first hacks and creative workarounds:
- Google Content Experiments – testing templates and site-wide content
- Google Content Experiments WITH VIRTUAL PAGEVIEWS!
- Google Content Experiments: A look under the hood
Alternative Testing Solutions
If you are interested in using Google Analytics for your business, or are already using it today and want to keep testing, and you need one of the features below, then try Convert Experiments, which you will find here on Convert.com:
- Building tests without touching the e-commerce system or CMS (building and serving them at a distance)
- Google Analytics Revenue Tracking for E-commerce integrated in test reports
- Live test stats on all tests
- Unlimited tests and variations
- Multivariate testing
- Engagement or time-based goals
- Multiple goals per test
- Site-wide or product and category page testing without hacks
Loved your Multivariate tagging solution?
We heard from some agencies that they loved the multivariate testing solution of Google Website Optimizer and the way it used start and stop tags to define sections. We are positive that you will love the simple multivariate setup of Convert Experiments, but if you really want that tagging solution back, we can build it for you. Seriously, if you miss the tagging solution and want a tool that works just like it, we will bring it back if enough of our paying clients write me an email or leave a comment under this blog post. We will make it happen. We will code it up just as you were used to, and we will add it as a feature to our product. That way, you can migrate all of your running tests before August 1. We want your agency to be the hero, and we will support you in your migration efforts from Google Website Optimizer to Convert Experiments.
Note: Thanks to TechWyse for the first image