Taking Advantage of AdWords Experiments

AdWords Campaign Experiments (ACE) is a great but overlooked feature in AdWords. ACE allows you to test structural changes, bid changes, new keywords, and all sorts of other modifications inside your campaigns. All you have to do is define the split-test ratio, what you want to test, and the length of the experiment, and AdWords takes care of the rest!

Introduction

Everyone is always going on about testing and how important it is to test different aspects of your accounts. If you aren’t testing, you’ll feel it eventually, whether from the lack of results or the judgment of your peers. The big one, of course, is ad testing. By running structured ad tests you are always evaluating new ad copy. It is also one of the easiest tests to run, since you only have to make sure you have an even split between the ads and let the users decide.

That is all well and good, but what about those situations where you think, “What if I changed the structure of this ad group?” Structure lives behind the scenes, so you can’t rely on users to pick a winner; the question is about how the AdWords platform itself responds to the change, not about the creative.

Depending on its magnitude, a change to your account structure can throw a bit of chaos into the account. By switching to a new structure, you risk tanking performance for a while as you continue to adjust and optimize.

This is a big commitment, and you might not want to fully switch from one structure to the other. That leaves you running two versions of an ad group or campaign side by side to compare them. Things get murky fast: you have to ensure traffic is split evenly between the two, and you end up spending more time trying to separate the signal from the noise.
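To put some numbers on that murkiness, here is a quick simulation – plain Python with made-up figures, nothing AdWords-specific. Two duplicate ad groups with the exact same underlying CTR can still show a sizable apparent “lift” from random noise alone:

    import random

    random.seed(42)  # reproducible noise

    def observed_ctr(impressions, true_ctr):
        """Simulate clicks for an ad group whose real CTR is true_ctr."""
        clicks = sum(1 for _ in range(impressions) if random.random() < true_ctr)
        return clicks / impressions

    # Two "duplicate" ad groups with the same underlying 2% CTR,
    # each receiving half of 4,000 impressions.
    a = observed_ctr(2_000, 0.02)
    b = observed_ctr(2_000, 0.02)
    print(f"A: {a:.2%}  B: {b:.2%}  apparent lift: {(b - a) / a:+.1%}")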

How Do AdWords Campaign Experiments Help?

Campaign experiments allow you to define your control and your variation, and the system splits traffic between the two. Depending on how cautious you are, you don’t even have to start at 50/50; you can slowly ramp up from 90/10 to 50/50 if that eases the process. Have you been considering segmenting your keywords and ad groups in a different manner? All you have to do is create the new ad group or set of keywords you would like to test and start an experiment.
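Under the hood, the split is essentially a per-auction coin flip weighted by the percentage you set. Google doesn’t document the exact mechanics, so treat the following as a toy model (the query_percentage name is my own, echoing the API field); it simply shows what ramping from 10% experiment traffic toward 50/50 means in practice:

    import random

    def assign_segment(query_percentage):
        """Per-auction coin flip: query_percentage is the share of
        auctions (0-100) that fall into the experiment split."""
        return "experiment" if random.random() * 100 < query_percentage else "control"

    # A cautious ramp: 10% experiment traffic, then 25%, then 50/50.
    for pct in (10, 25, 50):
        hits = sum(assign_segment(pct) == "experiment" for _ in range(10_000))
        print(f"{pct}% setting -> {hits / 10_000:.1%} of auctions in experiment")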

With ACE, you can now test all these little changes without the hassle of recreating everything and monitoring it yourself. When the test concludes (by your choice or by design), you only have to look at the numbers and select which version you want to keep.

How to Start an Experiment

To start an experiment, simply go to the Settings tab of your campaign. Scroll down towards the bottom to “Advanced Settings” and look for “Experiment”. It sits between “keyword matching options” and “IP Exclusions”, so don’t worry if you don’t see it right away; it is a little bit hidden.

Go ahead and select a name, the experiment split, a start date and an end date.
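If you would rather script this than click through the interface, the AdWords API exposes the same knobs through its ExperimentService. The sketch below uses the googleads Python client; the service version, field names, and IDs are assumptions based on the v201309 documentation as I remember it, so verify everything against the current client library docs before running it:

    # Hypothetical sketch only: service version, field names, and IDs
    # are assumptions -- check the AdWords API docs for your version.
    from googleads import adwords

    client = adwords.AdWordsClient.LoadFromStorage()  # reads googleads.yaml
    service = client.GetService('ExperimentService', version='v201309')

    operations = [{
        'operator': 'ADD',
        'operand': {
            'campaignId': 123456789,  # placeholder campaign ID
            'name': 'Ad group restructure test',
            'queryPercentage': 50,    # the experiment's share of traffic
            'startDateTime': '20130901 000000 America/New_York',
            'endDateTime': '20131001 000000 America/New_York',
        },
    }]
    result = service.mutate(operations)
    print(result['value'][0]['id'])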

[Screenshot: the Experiment settings form – name, split, start and end dates]

Once you have created the experiment, you will see some new icons in your campaign: three different colored flasks, located in the same menu where you can pause or delete elements of your account.

[Screenshot: the colored flask icons in the campaign status menu]

The green flask means control and experiment: no matter which segment is running, the element is unaffected. Select this for elements that you are not testing or that are unrelated to your hypothesis.

The gray flask is control only. Select it for elements that are part of the test but already exist; this is what you are testing against.

The blue flask is experiment only: the new additions or reorganizations that you are testing.

Now that you know what everything does, select the new elements – keywords, ads, ad groups, bid changes, etc. – and mark them with the blue flask. Mark the originals you are testing against with the gray flask. By default, everything else is set to the green flask for control and experiment.
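If the three flask states are hard to keep straight, this toy model – not AdWords code, just the rules above written out in Python – shows which elements serve in which segment:

    # Which segments does each flask color serve in?
    FLASKS = {
        "green": {"control", "experiment"},  # untested elements run in both
        "gray":  {"control"},                # originals: control only
        "blue":  {"experiment"},             # new additions: experiment only
    }

    def is_eligible(flask_color, active_segment):
        """Should this element serve for an auction in the given segment?"""
        return active_segment in FLASKS[flask_color]

    assert is_eligible("green", "control") and is_eligible("green", "experiment")
    assert is_eligible("gray", "control") and not is_eligible("gray", "experiment")
    assert is_eligible("blue", "experiment") and not is_eligible("blue", "control")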

Go back to your settings and start the test, if you opted to start it manually. The test will begin and split the traffic based on the ratio you decided. Sit back and let the results roll in.

Getting the Results

To see your results, go to the part of the campaign you are testing. Segment by “Experiment” and you will get three data rows: “outside experiment”, “control”, and “experiment”. Each row shows the data for that segment. If you look closely, you will see sets of arrows next to each metric: while the test is running, AdWords calculates statistical significance for you.

The arrows and their coloring show whether the metric is up or down, and at what confidence level the change is significant – 95%, 99%, or 99.9%. It is another little time saver from ACE, since it gets repetitive to keep going back and measuring on your own.
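Google doesn’t publish the exact test behind those arrows, but a standard two-proportion z-test gives a comparable read if you ever want to sanity-check a metric like CTR yourself. A minimal sketch with made-up numbers:

    from math import erf, sqrt

    def ctr_significance(clicks_a, imps_a, clicks_b, imps_b):
        """Two-proportion z-test on CTR: control (a) vs. experiment (b)."""
        p1, p2 = clicks_a / imps_a, clicks_b / imps_b
        pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
        se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
        z = (p2 - p1) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
        for level in (0.999, 0.99, 0.95):
            if p_value < 1 - level:
                return f"significant at {level:.1%} (p = {p_value:.4f})"
        return f"not significant at 95% (p = {p_value:.4f})"

    # Control: 400 clicks / 20,000 imps; experiment: 470 clicks / 20,000 imps.
    print(ctr_significance(400, 20_000, 470, 20_000))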

What to Test

Unfortunately, these tests do not work at the campaign level, so you can’t test campaign-wide settings. That said, anything within the campaign can be tested.

One of my favorites is structure. You can break down big ad groups into smaller ad groups. You could also try reorganizing your keywords into ad groups with new themes.

For keywords, you could test match types and bids without fully committing to one and hoping for the best. This is a good way to test the scary waters of broad match or overly general terms that may or may not convert.

On the subject of bids, you can find out whether it is worth bidding up or down on keywords and see how those changes affect performance.

Alternatively, if you have questions about the value of existing keywords, you can run tests excluding those keywords for a portion of the traffic and see how it affects the ad group as a whole. Do other keywords pick up the slack, or does performance tank? Only a test will tell.

Conclusion

I hope you take the chance to check out ACE. I know that for myself, it is something I don’t use enough. Given its slightly hidden placement in the interface, it is easy to overlook this tool and forget to use it.

If you have used ACE, what has your experience been like? Have you found anything interesting to test or were you surprised that a small test had a large impact on your account?