The AdWords Campaign Experiments tool (also known as ACE) was originally released as a beta in early June 2010. Now that the tool has been tested and integrated into the Google AdWords interface for some time, we felt it was appropriate to share some best practices on its use and provide an overview of how it works.

For those of you not already familiar with the tool, it offers an alternative to the traditional PPC marketing approach of making best-practice changes to the account first and only comparing the before-and-after results later on.

How it Works

ACE allows you to make Ad Group level changes to elements of your account (including keywords, bids, Ad Groups, ads and placements) and then split test these alongside the originals, i.e. the versions before your new changes were made.

You can specify how much traffic your control group and your new group should each receive (we recommend sticking to the default 50:50 option to keep post-experiment calculations straightforward), as well as how long the experiment should run. To achieve statistically significant results you’ll want the experiment to run for at least one month; smaller accounts may need a longer period to accumulate enough data.
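AdWords handles all of the statistics for you, but it can help to see roughly how much data a 50:50 split needs before a change in CTR becomes detectable. The sketch below uses the standard two-proportion sample-size formula; the function name and the example CTR figures are purely our own illustration and are not part of the AdWords interface or API.

# Rough sample-size estimate for a 50:50 CTR split test (illustrative only).
from math import sqrt, ceil

def impressions_needed_per_arm(baseline_ctr, expected_ctr,
                               z_alpha=1.96,   # ~95% confidence (two-tailed)
                               z_power=0.84):  # ~80% power
    """Approximate impressions required in EACH half of the split to detect
    a move from baseline_ctr to expected_ctr, using the standard
    two-proportion sample-size formula."""
    p1, p2 = baseline_ctr, expected_ctr
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hoping to lift CTR from 2.0% to 2.4%?
print(impressions_needed_per_arm(0.020, 0.024))
# -> roughly 21,000 impressions in each half of the split, which is why a
#    low-traffic account may need well over a month to reach significance.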

At the end of the specified experimental period, AdWords displays all of the associated metrics for the two groups, together with an indication of how statistically significant any differences are.

The screenshot below shows where to set up a campaign experiment: under the Settings tab for a given campaign.

Why use ACE?

In short – to minimise your risk.

Most AdWords accounts eventually reach the point where the best-practice playbook has been exhausted, and squeezing out additional profitability (or even maintaining it in the face of tough competition) requires a degree of aptly labelled “experimentation” in sensible doses.

The Results

Depending on your goals for the experiment, some of the metrics provided may excite you more than others. For example, a decrease in impressions while maintaining the same level of clicks is a good result if your aim is to improve Click Through Rate (CTR): if impressions fall from 10,000 to 8,000 while clicks hold steady at 200, CTR rises from 2% to 2.5%.

Within the AdWords interface, ACE uses coloured icons to summarise statistical significance. A grey up-down icon indicates that a metric has not changed by a statistically significant amount during the course of the experiment, while upward or downward pointing arrows show that a metric has increased or decreased by a statistically significant amount.

The degree of statistical confidence behind those arrows is broken down as follows (a rough worked sketch follows the list):

  • One arrow pointing upwards or downwards represents a 95% statistical likelihood that the change seen in the metric has not occurred by chance
  • Two arrows pointing upwards or downwards represent a 99% statistical likelihood
  • Three arrows pointing upwards or downwards represent a 99.9% statistical likelihood
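To make those bands concrete, here is a rough sketch of the kind of test that sits behind them: a two-proportion z-test on CTR for the control and experiment halves, mapped onto the same 95% / 99% / 99.9% thresholds. The function and the sample figures are our own illustration; AdWords performs its own calculations internally, and this is not its exact method.

# Illustrative significance check for CTR, mapped to ACE's arrow legend.
from math import sqrt, erf

def ctr_arrows(ctrl_clicks, ctrl_impr, exp_clicks, exp_impr):
    """Two-proportion z-test on CTR; returns (arrow_count, direction)."""
    p1 = ctrl_clicks / ctrl_impr
    p2 = exp_clicks / exp_impr
    p_pool = (ctrl_clicks + exp_clicks) / (ctrl_impr + exp_impr)
    se = sqrt(p_pool * (1 - p_pool) * (1 / ctrl_impr + 1 / exp_impr))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    confidence = 1 - p_value
    if confidence >= 0.999:
        arrows = 3          # three arrows
    elif confidence >= 0.99:
        arrows = 2          # two arrows
    elif confidence >= 0.95:
        arrows = 1          # one arrow
    else:
        arrows = 0          # grey up-down icon: not significant
    return arrows, ("up" if p2 > p1 else "down")

# Control: 400 clicks from 20,000 impressions; experiment: 480 from 19,500.
print(ctr_arrows(400, 20000, 480, 19500))
# -> (2, 'up'): CTR is up at >=99% (but below 99.9%) confidence here.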

In the example screenshot provided below, the experiment row shows a statistically significant decrease in impressions (represented by the three downward pointing arrows in the Impressions column). Conversely, the Clicks, CTR, Avg. CPC and Cost columns show grey up-down icons, indicating that these metrics have not changed by a statistically significant amount.

AdWords Campaign Experiment screenshot

We’d like to hear about your experiences using the AdWords Campaign Experiments feature. Please post your anecdotes in the comments section below.