Google Content Experiments: 12 Must-Knows Before Using



Hopefully you’ve considered doing A/B testing on your website to get more sales or leads. You may have heard of Google Content Experiments as a tool to help you do this.

But just because it comes under the Google umbrella, does that mean it’s any good? And how do you know whether it’s worth using or not? To help you answer these questions, I have put together a list of 12 essential things to know, a short video overview, and my own mid-term report card.

This tool used to be called Google Website Optimizer, which was pretty good considering it was free. Competition then grew, and back in June 2012 Google decided to shut it down and offer a cut-down version of it in Google Analytics instead. This is what the tool interface currently looks like:

content experiments screenshot

So let’s get started with the 12 must-knows before you decide to use it to improve your website (or spend much more time with it):

The 12 Must-Knows About Google Content Experiments

  1. GOOD. The tool is very simple and easy to use and setup.
    One of the simplest things first – being part of Google Analytics, it benefits from Google Analytics’ great, easy-to-use interface. It’s also really simple to create a test (almost too simple, as you will see in my video below).
  2. BAD. Google has not really improved it much since launch.
    It was launched to much excitement and anticipation in June 2012, but since then Google hasn’t done much to improve its limited functionality (see below for more details on improvements). Red-headed stepchild springs to mind…
  3. GOOD. You can use Google Analytics goals to measure success.
    This is one of the best parts of the integration with Google Analytics – it makes it easy to measure your tests against the website goals you already track in Google Analytics (whether that’s sales or something simple like a newsletter sign-up). See the video below to see this in action.
  4. BAD. There is still no visual editor to help you create tests.
    This is a major shortcoming of the tool, particularly as its main competitors offer a visual editor to make it much easier to create tests. Therefore you are still reliant on tech know-how to create a test page. Surely it can’t be that hard to add this feature? This is an example of the great visual editor in Optimizely:
    optimizely editor
  5. GOOD. You can use Google Analytics segments to analyze results.
    This is one of the main benefits of this tool being part of Google Analytics – it means it’s simple to do further detailed analysis on your test results (like how major segments of visitors perform, for example paid search or first-time visitors). This gives you great insights to help you create better follow-up tests.
  6. BAD. It only offers split page testing – much too simple.
    This is a big issue with the tool. Unlike its previous incarnation or rival testing tools, it offers no ability to test specific page elements (like a button, image or form) without having to create a whole new page to test against (unless you are an expert who can use their new API functionality). Very frustrating.
  7. GOOD. Running tests with it doesn’t negatively impact your SEO.
    One question I often hear is: does this testing tool (and others like it) affect your SEO efforts? The good news is the answer is no – you can use canonical tags on your test variation pages to tell Google to index only the original page.
  8. BAD. Hard to implement in a flow of pages or on dynamic pages.
    Because you have to create alternative page versions to test, it’s hard to set up tests on pages that show dynamic content or can’t be redirected easily (like on a shopping cart). This limits what you can test and its potential impact. Their new API makes it slightly easier, but you need tech help to do so.
  9. GOOD: It’s perfect for using on websites built with WordPress.
    It’s great if you want to test improving a blog or simple website based on a WordPress platform – there is even a plug-in to make it really easy to add the necessary tracking code to your pages. But bear in mind that if you have a small website, you need enough traffic to get results – at least 1,000 uniques per week.
  10. BAD. It still uses ‘multi-armed bandit’ testing by default.
    Their decision to use this newer testing methodology has been quite controversial – they launched it to help you get results quicker, but it doesn’t always give you the best result. So much so that they recently added an option to change it so your test variations get equal amounts of traffic – a shame they hid this under the advanced options, though.
  11. GOOD: Yes it’s still free to use.
    Sure, you can’t beat free. But that matters less when you consider there are much better testing tools costing around 50 cents per day (Optimizely has plans that start from just $17 per month). Which smart small business owner or online marketer can’t afford that to help boost their sales?
  12. BAD: You can’t do multivariate testing in it.
    Because it’s a split-page-testing tool, you can’t run multivariate tests. This is where you test multiple page elements at the same time to find the best-converting combination. While beginners won’t really use this, it’s essential for any advanced online marketer.
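
To make point 7 concrete: the usual way to keep test variation pages out of search results is a canonical tag on each variation pointing back at the original page. A minimal sketch (the URLs below are placeholders):

```html
<!-- On the variation page (e.g. a hypothetical /landing-b.html), point the
     canonical back at the original so search engines index only the original -->
<link rel="canonical" href="http://www.example.com/landing.html" />
```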
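
And to illustrate point 10: a ‘multi-armed bandit’ shifts traffic toward the better-performing variation as results come in, instead of splitting it 50/50. A minimal epsilon-greedy sketch of the general idea (the conversion rates are made up for illustration – Google’s actual implementation uses a more sophisticated Bayesian method):

```python
import random

def epsilon_greedy_assign(successes, trials, epsilon=0.1):
    """Mostly serve the best-performing variation; explore occasionally."""
    if random.random() < epsilon or 0 in trials:
        return random.randrange(len(trials))  # explore: pick at random
    rates = [s / t for s, t in zip(successes, trials)]
    return rates.index(max(rates))            # exploit: pick the current best

random.seed(1)
true_rates = [0.02, 0.20]  # hypothetical conversion rates for variations A and B
successes, trials = [0, 0], [0, 0]
for _ in range(5000):
    arm = epsilon_greedy_assign(successes, trials)
    trials[arm] += 1
    successes[arm] += int(random.random() < true_rates[arm])

print(trials)  # the better variation ends up receiving most of the traffic
```

This is why bandit tests can get results quicker but feel less trustworthy: an early lucky streak on one variation can starve the others of traffic, which is exactly why the equal-traffic option under the advanced settings matters.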
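
As for point 12: a multivariate test tries every combination of the elements under test, which a split-page tool can only do if you build each combination as a separate page by hand. The combinations multiply quickly – a quick sketch (the element names are made up for illustration):

```python
from itertools import product

# Hypothetical page elements you might want to test together
headlines = ["Save 20% today", "Free shipping on all orders"]
button_colors = ["green", "orange"]
hero_images = ["lifestyle_photo", "product_shot"]

combinations = list(product(headlines, button_colors, hero_images))
print(len(combinations))  # 2 x 2 x 2 = 8 page variations to build by hand
```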

A quick overview of the tool and whether it’s worth using

To help you understand some of these issues (and benefits) of using Google Content Experiments, check out this 8-minute overview video I created:

What’s been changed since it launched in 2012?

After the early buzz and excitement around the launch of Google Content Experiments, basically not a lot has changed, unfortunately. There have only been a few announcements – initially they made some early essential ‘fixes’ to satisfy users frustrated with the limited usage options.

Since then, sadly, not much else has been improved – just a few other underwhelming announcements. First, one regarding the Google Analytics Content Experiments API (only useful if you have technical help available), and then one in Sept 2013 about a new testing methodology choice and AdSense revenue as a test objective (which seems a bit self-serving – only useful if you run AdSense on your site).

Other than that and a few case studies, the sound of crickets springs to mind. A real shame indeed, for a tool that shows great potential.

Google Content Experiments: The Mid-Term Report Card

Okay, so it’s been a year and a half since Google announced they were killing off Google Website Optimizer and moving the remains of it into Google Analytics. They have had plenty of time to improve the functionality, which initially seemed to show some great early promise. Unfortunately, they haven’t really done much, and as you can see, this tool is still only really suited to real beginners at A/B testing.

Not even the free price point is much of a selling point anymore, because low-cost rivals like Optimizely make it a bit redundant.

Overall, since it launched, I give Google Content Experiments the following grades:

  • A for the initial idea and moving it into Google Analytics
  • C for the effort placed into creating it
  • D for functionality and improvements since launch
  • Overall a great initial idea, but must do much better in the future!

I suggest you also learn about the other better tools available to you:
Learn how it compares to its main rivals, Visual Website Optimizer and Optimizely.

Your thoughts on Google Content Experiments?

Have you considered using it or had any success with it? What are your biggest frustrations with it, or things you love most about it? Please comment below, and please share this article if you have found it useful. Thanks!
