Advanced A/B Experiments

GWO provides for an A/B style of testing “out of the box”. However, sometimes you may find it does not quite suit your needs, or, you may need more control over the URL to which visitors get redirected. This article describes how to perform an A/B test where you have much more control over how the redirection to the alternative pages takes place.

The following is a technique for performing an A/B test such that you have an opportunity to dynamically participate in the construction of the alternative URLs.

First, instead of creating an A/B experiment, create a Multi-Variate experiment. Place the control script at the top of your test (A) page. Place the test page tracking script at the bottom of your test page and alternate pages (B, C, etc). Place the goal tracking script at the bottom of your goal page.

Now, instead of introducing the multi-variate section scripts, place the following somewhere in your test page (I recommend placing it near the control script):

<!-- utmx section name="page-url" -->

This HTML comment is very much like a section script in that it declares a section named “page-url”, but does not modify the page or the user’s experience at all.

Now, you can validate the test and goal pages and move on to specifying the alternate URLs which you want to test. Enter these into the GWO UI as the content of variations for the page-url section. Here I’ve specified a simple relative URL, but you can specify a complete URL if you want:

Before launching or previewing your experiment, place the following script immediately after the control script on your test (A) page:

function filter(v) {
  var b = utmx('variation_content', 'page-url');
  var u = v[0].contents;
  if (b && u.substr(0, 7) == 'http://' && b.substr(0, 7) != 'http://') {
    u = u.substr(7);
  }
  return u;
}
utmx('url', 'page-url', 0, filter);

At this point, you can preview and launch the experiment. It will behave just like any other A/B experiment.

You can see an example of this in action here:

Test (A) Page (inactive)

Note that this URL takes you to the A page without enrolling you in the experiment. To experience a possible redirect, remove the content after the # in the link's URL:

Test (A) Page (active)


The Filter Function

The script above performs the redirection to the alternative pages, should GWO decide that a given visitor is not to see the A page. Let's look at it a bit more closely.

function filter(v) {
  ...
}
utmx('url', 'page-url', 0, filter);

The utmx function is defined by the control script. It is the main entry point for a variety of GWO functionality available in test pages. In this case, the first argument, 'url', tells the function that it should treat this experiment as an A/B experiment and perform a redirect if necessary. The second argument, 'page-url', is the name of the section which defines the alternative URLs. The third argument is a positional indicator and should be set to zero in this case; I won't describe it further here.

The fourth argument is a filter function which you define; it is called just before redirection takes place. It takes, as an argument, an object containing the redirection URL computed by the utmx function and returns the actual URL to which the user will be redirected. It is your opportunity to influence the form of the URL to which the visitor is redirected.

Before calling the filter function, the utmx function does a number of things to the target URL. First, it merges all query parameters of the current URL (document.location) with the query parameters of the target URL. This allows an alternate page to receive the same information the A page has. For example, you might encode product IDs as a query param:

You might enter:

as the alternative B-page URL for your experiment. Because you are testing all your product pages, you can only specify the B-page URL, sans the product. GWO will redirect to:

which allows your B-page to know which product is being queried and present that product in the context of the B-page.
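To make the merge concrete, here is a rough sketch of that behavior as a standalone function. This is only an illustration of the merge described above, not GWO's actual implementation, and the function name is mine:

```javascript
// Illustrative sketch of the query-parameter merge described above:
// parameters on the current URL are carried over to the target URL,
// with the target's own parameters taking precedence. Not GWO's code.
function mergeQueryParams(currentUrl, targetUrl) {
  var cq = currentUrl.split('?')[1] || '';
  var parts = targetUrl.split('?');
  var merged = parts[1] || '';
  var cp = cq ? cq.split('&') : [];
  for (var i = 0; i < cp.length; i++) {
    var name = cp[i].split('=')[0];
    // simplified containment check; fine for an illustration
    if (merged.indexOf(name + '=') == -1) {
      merged += (merged.length ? '&' : '') + cp[i];
    }
  }
  return parts[0] + (merged.length ? '?' + merged : '');
}
```

So a visitor on a product page whose URL carries product=tofu who is redirected to a bare B-page URL ends up with product=tofu appended, and the B page can read the product ID as usual.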

The utmx('url', …) function also looks at the URL and prepends http:// if it is not already there. Often this is fine, as when you don't specify the protocol. For example:

But it can sometimes get in the way. For example, you might want to specify (as I do in my example above) a simple, relative URL for the alternate pages:


The code in the custom script above strips this away as needed:

var b = utmx('variation_content', 'page-url');
var u = v[0].contents;
if (b && u.substr(0, 7) == 'http://' && b.substr(0, 7) != 'http://') {
  u = u.substr(7);
}
return u;

If the "raw" version of the alternative URL in the variable b does not begin with http:// but utmx's version does, then the prefix is stripped away. Finally, the URL in the variable u is returned, and GWO performs the redirection to it.
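The same stripping logic can be pulled out into a standalone function and exercised outside the browser. The names here are mine: 'raw' stands for the variation content as entered in GWO, and 'computed' for the URL utmx built from it.

```javascript
// Standalone version of the protocol-stripping logic in the filter.
// 'raw' is the variation content as entered in GWO; 'computed' is the
// URL that utmx derived from it (possibly with 'http://' prepended).
function stripAddedProtocol(raw, computed) {
  var u = computed;
  if (raw && u.substr(0, 7) == 'http://' && raw.substr(0, 7) != 'http://') {
    u = u.substr(7); // remove the prefix utmx added
  }
  return u;
}
```

A relative entry like b.html comes back out as b.html, while a fully qualified entry is left untouched.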

URL Customization

The filter function allows you to inspect and modify the redirection URL at will. To demonstrate this further, consider my example above where the product ID is a query parameter:

But, what if my site encodes the product in the path of the URL? Like so:

And, I want to test an alternative page, like so?

You might enter the alternative URL in GWO as:

But, because your test page will be called for more than one product, like:

You can’t enter that URL, otherwise all users will see only the tofu product, regardless of which product they may have clicked on. Or, you might enter:

But, no product is specified here, and your web server might produce an error page.

What you need to do in cases like this is write some custom JavaScript which builds the correct URL. So, building off the last example, consider the following. Let's say the following URLs are two among many product pages:

And you specify an alternative URL like so:

The idea is, while computing the URL to which the redirection will take place, to inspect the current URL (document.location.href) for the name of the product, and replace the token PRODUCT with the name of the current product from the A page. Like so:

var b = utmx('variation_content', 'page-url');
function filter(v) {
  var u = v[0].contents;
  if (b && u.substr(0, 7) == 'http://' && b.substr(0, 7) != 'http://') {
    u = u.substr(7);
  }
  var l = document.location.href;
  var prefix = ''; // set to the path segment preceding the product name, e.g. '/products/'
  var i = l.indexOf(prefix);
  var j = l.indexOf('/', i + prefix.length);
  u = u.replace('PRODUCT', l.substring(i + prefix.length, j));
  return u;
}
utmx('url', 'page-url', 0, filter);

This is very much like the first example above, but instead of simply returning the URL, we extract the current product name and use it to replace the placeholder token PRODUCT, which is present in all alternative URLs. This allows us to redirect to the proper alternative URL while preserving the product the visitor is interested in.

You can see this in action here:

Happy redirecting!




This is awesome stuff Eric, thanks for this, it will help a lot with some of those advanced problems.

This is great thanks Eric. One question, do you have to alter the conversion script in any way? Our test is tracking visitors no problem, however conversions seem to be another issue.


Hi Began,

No, the conversion scripts need not be changed when using this technique. All this does is change the way the redirection takes place.

– Eric

Hi Eric,

Great article! I had a question regarding the way GWO handles the URL defined in an AB test.

I have a test running with 1 variation. The scripts are all in place and I’m seeing conversions but the visitor count is not counting on the variation.

I have the URL for the variation defined correctly in the tool but once the user hits that page, it passes them to different pages depending on their login status, and then to a third page (I have the tracking script defined on that page).

It is important that the user hits the main page that handles the redirect. But it doesn’t seem to register as visits.

Is that because the tool has to match the URL to the page triggering the script? And if so, why am I seeing conversions? Is it registering the trackPageview but not counting it as a visit?
Is there any way around this?

Any help would be appreciated


Really wonderful stuff here Eric. In one experiment I’m now testing thousands of templated pages versus a new version of the template. It’s working perfectly

Thanks for your great trick, Eric. It's working flawlessly in FF and Chrome, though I seem to have problems with IE: it simply won't redirect to alternative URLs. I have cleaned my cookies and Ctrl+F5-ed about 30 times, still nothing. This seems to be happening on all machines using IE8 on Vista. I've tried it with your examples as well, and they don't redirect either.
The redirects work fine for my colleagues who use IE8 on XP and IE7 on Vista.
Any ideas? Could it potentially mess up the GWO results?
Thanks in advance.

Hi Cristina,

I have the same experience you do with IE8 under Vista. However, if you clear cookies *and* close IE, and then go back to IE and visit your experiment, I find that it can redirect to an alternate page. IE seems to remember stuff in a browser session.

– Eric

Eric, thanx a lot! I want to test product pages so I guess this will help me a lot!

I have one question. What should I do if even the goal page has a URL like

Thank you very much in advance!
Best Regards,


Just put the goal tracking script on all your goal pages. Only those who entered the experiment and get to a goal will trigger a conversion.

– Eric

Where does the Control Script belong?

I frequently see test pages in which the control script is not placed in a good location. In this article, I want to talk about the things to consider when placing the control script into your test pages.



The presence of the control script in your page will introduce latency into the total load time of the page. When the control script executes, it generates a request for a Google resource called siteopt.js. The latency is attributed to the time it takes for siteopt.js to load. To demonstrate this, you can load the page in Firefox with the Firebug add-on, which can measure the time it takes the page to fetch various resources. For me, inside the Google corporate network, it takes about 20 milliseconds on average to load siteopt.js:

When I do the same thing from my home, it takes a little more time, about 36 milliseconds (I use a microwave-based ISP, which adds a little bit of latency to everything):

In order to minimize this latency, Google distributes the servers that respond to siteopt.js requests all over the globe. This way, visitors to your test page from Mongolia don't have to load siteopt.js from a faraway server in the United States; they will probably get siteopt.js from a server in Asia or Northern Europe.



If you are running an A/B experiment, the control script may cause a redirection to happen if Google decides that this particular visitor should see a page other than the A page. This means that all the processing that the browser is doing when the redirect takes place will be aborted when the new page is loaded.

Other Resources

Given these aspects of the control script, it is very important that the control script appear before any references to external resources. These include CSS, scripts, images, objects, and the like. The reason for this is that if the control script decides to perform a redirect, all the time and work involved in loading these resources will be wasted and, most likely, performed again in the target of the redirect. This increases the total latency the visitor experiences.

Displayable Content

Because the control script loads alternative content used in the display of the page, it needs to appear before the points in the page that potentially use this alternative content. Additionally, it is very important that the control script appear before any content in the page that is displayed to the user.

The reason for this is, again, latency. If the control script were to appear after, say, the first paragraph of the page, the user would see that paragraph, experience a very brief latency, and then the rest of the page would display. However, if the control script were to appear before this paragraph, then the window remains blank during the small latency, and then the page would render as a whole. This is a better experience for the user.

Also, a browser may spend less time laying out the page because there is no interruption of the display of the page.

Document Type Declaration

Many pages have a document type declaration. It may look something like this:


Browsers will change the way they parse an HTML file based on this declaration. In order to determine the type of a page, browsers will “sniff” for this declaration at the very beginning of the page. If they find a well formed declaration, then the parser for that document type will be instantiated.

It is very important that the control script appear after any document type declaration. The reason for this is that browsers will only look so far into an HTML document when sniffing for these declarations. The presence of the control script before the declaration may cause the browser to not find the declaration and to choose the wrong parser. This can have devastating effects on a page, potentially rendering it unusable.


So, in a well formed HTML document, the control script should be:

  • After any document type declaration
  • Before any other resources (CSS, scripts, etc)
  • Before any displayable content (text, tables, etc)

In a well formed document, these restrictions are usually accommodated by placing the control script as the very first element of the head element, just after the beginning <HEAD> tag.
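Putting those three rules together, a well-formed test page is laid out roughly like this (the control script body is abbreviated, and the stylesheet name is just a placeholder):

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "">
<html>
<head>
  <!-- 1. Control script: the very first element inside <head>,
          after the document type declaration -->
  <script type="text/javascript">
    /* GWO control script goes here */
  </script>
  <!-- 2. Only then: CSS, other scripts, and remaining head content -->
  <link rel="stylesheet" type="text/css" href="styles.css" />
  <title>Test Page</title>
</head>
<body>
  <!-- 3. Displayable content comes after the control script -->
  <p>Page content...</p>
</body>
</html>
```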


Nice, totally agree Eric. I always recommend placing the tag just inside the head tag. However, as you mention, IE can trigger quirks mode if anything appears above the doctype, so you need to place it after the doctype and ideally before anything in the head section.

Thanks for explaining why the control script should be as close to the top of the page as possible. Putting the control script in the head of our HTML would be somewhat difficult for me, because the web framework I'm using uses "layouts", which make it difficult to customize the head of a page.

The other thing is that we try to push the loading of external scripts to the bottom of the page, which prevents blocking while external scripts are loaded. One way to achieve this might be to have your page sections consist of JavaScript at the bottom of your page that replaces the original content in the middle of the page with alternate content. This adds a bit of indirection to your page, but I think it would work.

Poor Man’s GWO/Analytics Integration

A while back, ROI Revolution (a Google Website Optimizer Authorized Consultant) came up with a technique for importing GWO information into GA (Google Analytics). This way, you can see how different variations of a web page/site will affect the various numbers that GA measures.

The original technique relied on munging through the __utmx cookie that the control script sets. I improved upon this by describing a technique which uses some new functionality I incorporated into siteopt.js. Their most recent description of this technique is here.

I’ve been giving some thought to this technique and wanted to 1) Improve the technique a bit, and 2) Suggest an alternative which has some significant advantages.

First, a look at my script for the ROI technique. The script I suggested is as follows (note: all these scripts assume that ga.js and the control script have already been included on the page):

<script type="text/javascript">
if (utmx('combination') != undefined) {
  var l = document.location, s =;
  s = s + (s.length ? '&' : '?') + 'combination=' + utmx('combination');
  var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
  pageTracker._trackPageview(l.pathname + s);
}
</script>

This suffers from two problems. First, global variables are introduced into the namespace of the page; second, there is no indication of which experiment is annotating the hits. So, consider the following modified script:

<script type="text/javascript">
(function(){try {
  if (utmx('combination') != undefined) {
    var l = document.location, s =;
    s = s + (s.length ? '&' : '?') + 'combination=' + utmx('combination');
    s += '&experiment=MyExperiment';
    var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
    pageTracker._trackPageview(l.pathname + s);
  }
} catch(err) {}})();
</script>

Here I've enclosed the script in an anonymous function and a try/catch block (should ga.js fail to load), and also added a query param naming the experiment being tested, to disambiguate one experiment from another.

With this technique, you can go into your content report in GA and see how many visitors assigned to the various experiment variations actually end up visiting pages on your site.
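The annotation the script produces is just string concatenation, so it can be sketched and checked as a pure function. The function name and the sample combination/experiment values here are illustrative:

```javascript
// Sketch of the annotated virtual path the modified script reports
// to GA, as a pure function over the page's pathname and query string.
function annotatedPath(pathname, search, combination, experiment) {
  var s = search;
  s = s + (s.length ? '&' : '?') + 'combination=' + combination;
  s += '&experiment=' + experiment;
  return pathname + s;
}
```

A visitor in combination 2 on /page.html is reported as /page.html?combination=2&experiment=MyExperiment, which is what you will see in the GA content report.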

This technique requires you to modify the GA tracking script on each page of your site which you want to track against an experiment. If you've already instrumented your whole site with GA tags, it might be cumbersome to edit them all, especially since you'll need to modify them again when you end an experiment or start a new one.

Experimenting on the entire Site

The answer I’ve come up with to deal with these issues is to integrate GWO experiment information a different way. Instead of modifying each page URL with the experiment information, you can set the GA user variable with experiment information (assuming you are not already using the variable for other purposes). Consider the following script:

<script type="text/javascript">
try {
  var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
  pageTracker._setVar("MyExperiment-" + utmx('combination'));
} catch(err) {}
</script>
First, note that this script does not call _trackPageview. Its job is not to track anything, but to set up the user variable to contain information about an experiment. This variable is stored in the __utmv cookie, and every subsequent call to _trackPageview in a tracking script will communicate the value of this cookie to Google servers and annotate the page view with it.

This script needs to be placed in a particular place in your pages. First, it must only be located in GWO test pages which have already executed the control script; the script uses the function utmx, which is defined by siteopt.js, which is obtained through the control script. Next, this script needs to execute before any calls to _trackPageview in any other GA tracking scripts. It also needs to come after the inclusion of ga.js.

I recommend that you place this script after the ga.js inclusion script and before the GA tracking script in the following manner:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
  var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
  pageTracker._setVar("MyExperiment-" + utmx('combination'));
} catch(err) {}
</script>
<script type="text/javascript">
try {
  var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
  pageTracker._trackPageview();
} catch(err) {}
</script>

This way, the variable gets set before the tracking script that uses it to annotate this page visit. Furthermore, any visits this visitor makes in the future (for up to two years of absence) will also be annotated.

You can see this in action here:

Test Page


Special Consideration for A/B Experiments

The technique described above works for multi-variate experiments, that is, experiments which do not use redirection to present different pages to visitors. If you were to use this technique on a GWO A/B experiment, the code which sets the user-defined variable would sometimes not get executed: the original page would redirect to an alternate page before the setting took place, and the alternate page, which does not have the control script, can't use the utmx function.

Now, there is a way to have the control script on A/B alternate pages; I'll save that technique for another article. However, there is also a way to get the above technique to work with A/B (redirection) experiments. The idea is to set the user-defined variable before the redirection on the original page takes place. To do this, consider the following A/B control script:

function utmx_section(){}function utmx(){}
(function(){var k='3923492669',d=document,l=d.location,c=d.cookie;function f(n){
if(c){var i=c.indexOf(n+'=');if(i>-1){var j=c.indexOf(';',i);return c.substring(i+n.
length+1,j<0?c.length:j)}}}var x=f('__utmx'),xx=f('__utmxx'),h=l.hash;
d.write('<sc'+'ript src="'+
'http'+(l.protocol=='https:'?'s://ssl':'://www')+''
+'/siteopt.js?v=1&utmxkey='+k+'&utmx='+(x?x:'')+'&utmxx='+(xx?xx:'')+'&utmxtime='
+new Date().valueOf()+(h?'&utmxhash='+escape(h.substr(1)):'')+
'" type="text/javascript" charset="utf-8"></sc'+'ript>')})();

utmx("url",'A/B');

Notice that this is really two scripts. The first script is that which, among other things, defines the “utmx” function. The second, using the “utmx” function, is the one responsible for performing the redirect to an alternative page, should it be required.

It is between these two scripts where you will need to include ga.js and set the user defined variable in order for this technique to work for A/B experiments. So in the A/B case, move that code from just before the tracking script to be between these scripts.

Experiment without GWO reports

When you normally set up a GWO experiment, you are asked to place test and goal page tracking scripts on your test page and goal page. This allows GWO to track when visitors hit your test and goal pages, and render a GWO report.

If you only plan to look at the GA reports for this experiment, you can forgo placing the GWO tracking scripts. Instead, you need only place the control script and user defined variable script in any pages which have experiment variations. In fact, you can place these scripts on pages which do not have any experiment variations. The effect of doing this is that visitors to these non-variation pages will have their experiment variations chosen for them before they encounter any test pages. This means that you will be able to track their activity before they encounter experiment variations.

This is useful in that, if you do this at every entry point to your site, all tracking events will be tagged with experiment information.

Report Segmentation

So, once you have all this set up, how do you get experiment-segmented reports? On each report, there is a Dimension dropdown:

This will show a report broken down by the values of the user defined variable. You can use this to get an idea of how individual variations of your site influence this GA stat.

Note that this is not like a GWO report. These are just raw numbers and are not analyzed for confidence with respect to which variation is statistically significant.



Hi Erik,

Thank you for the excellent post. I've been using ROI Revolution's getCombo script, but your edits and use of the User Defined field result in an overall cleaner output (not duplicating pageviews, for example).

I’ve tried to implement your method using external files but with no luck as of yet (please forgive my ignorance).

I’ve 1) Activated the var _utcp 2) Run the MVT Head Script 3) Started and Ended the section swap 4) Run the standard GA 5) GWO Track Variation Pageview.

However, the custom variable does not seem to pass:

Hey Eric,

What do you think about using Event Tracking once it is available for everyone?

Also, nice updates to the ROI Revolution code, but I'm wondering if it should be altered by removing the query params which represent the combination and experiment (replacing them with some made-up virtual path name), because if the "Exclude URL Query Parameters" GA feature is used, there will be multiple pageviews recorded for a single pageview of the test, I believe.

Very interesting post!

I'm curious: how would this code need to be modified to accommodate the Website Optimizer tracking script being executed via an onclick event, such as from clicking a link? In that case, the ga.js tracking would already have occurred, right?

Am I correct in presuming this method would not work with onclick?

If not, what method would allow you to track onclick to Website Optimizer and both onclick and page views to Analytics?

This method is not intended to work with the GWO tracking scripts. It is meant to tag GA tracking events with the variation chosen for a visitor to an experiment. If you want to track links in GA, you can do that with event tracking or by calling _trackPageview with a synthetic URI. The GWO visitor information will travel along with that GA event, and you will be able to track the performance of those events per variation in GA.

Using this experiment with the old Urchin GWO allows you to track both the GWO Conversions and Google Analytics User Generated Fields.

Also, I found that it was possible to track conversions for GWO and GA by using GA onclick for example, and placing the GWO conversion code in a div wrapper around the GA Code.

As I understand this trick will not track multiple experiments that might be running in parallel, since only one combination id is recorded.

I've been thinking of using the __utmv cookie value instead, set with _setVar. This value records all experiment IDs and combinations. A small catch, though, is that it doesn't include the combination ID, but the permutation string, e.g. 1-3-2 for a test with 3 sections, where each number represents the chosen section variant.

What do you think about this suggestion?

This is a fine idea. You can get the combination ID by calling utmx(“combination”).

Hello Eric,

have you ever tried to get equivalent reports from IndexTools (Yahoo Web Analytics) instead of GA?

kind regards,


I have not tried other analytics products, but as long as you can inject segmentation information, you should be able to do something similar.

Hi Eric,

I would suggest moving the pageTracker function outside of the utmx function check. That way if they have GA on the page but GWO failed, they still get a GA tracking hit.

Here is the code:

(function(){try {
  var l = document.location, s =;
  if (utmx('combination') != undefined) {
    s = s + (s.length ? '&' : '?') + 'combo=' + utmx('combination_string');
    s += '&test=TestName';
  }
  var pageTracker = _gat._getTracker("UA-XXXXXX-Y");
  pageTracker._trackPageview(l.pathname + s);
} catch(err) {}})();


Nice input Erik

I have a couple of comments:

using _setVar
As you know, this is a persistent cookie, so when the visitor is labelled on their first visit as e.g. "MyExperiment-0-1-3", on their next visit they will already have this label, even if they receive a new combination. So returning visits must be discarded with this method.

Experiment without GWO reports
“This means that you will be able to track their activity before they encounter experiment variations.”

With this method, have you given any thought to catering for visitors who do NOT go on to see a test page?

As a response to those above who are talking about the duplicate pageviews that result from adding the combination as a query parameter (via the original ROI Revolution code that I wrote):

First, our code did not inflate pageviews (the very first version two years ago may have), as it simply adds a query parameter to the end of the GA code that’s already on the page. Second, we have always recommended excluding this information from your main profiles and only looking at the data in a separate profile used only for GWO – preferably using the Site Search report.

We’ve certainly used the User Defined variable as well, although we tend to use that variable for other things.

Thanks for the code updates Eric! It's nice to see this go from a hack to a cleaner and more official implementation.

In response to Brian’s post on Oct. 4: GWO does not choose a different variation for returning visitors. So setting the var again does not hurt. And, if a visitor does not go on to a test page after executing the control script, the visitor does not see a variation, so I see no need to track them.

In response to Ophir on Sept. 23:

That seems like a fine idea!

Is this happening only to me? The last few days integration does not work as it used to work.

Multiple Goals

Today in GWO you can track, for example, when a visitor purchases something and/or signs up for a newsletter. This is done by simply executing the goal tracking script for both of these events. However, some experimenters want to track these kinds of things independently: how do site variations influence purchases independent of sign ups and vice-versa, for the same visitors.

For the purposes of the description of this technique, I will assume a multi-variate experiment is being used. Later on, I will describe how to adapt the technique to an A/B experiment.

The experiments I created for my sample test.

Begin by creating two nearly identical GWO experiments (I recommend giving them the same name, followed by '1' and '2', respectively). The first experiment should be set up normally. Assuming an MVT experiment, put the control script at the top of the test page, the tracking script at the bottom, and the section scripts throughout. Also place the goal tracking script at the bottom of your first goal page (or wherever you want the first goal to fire).

You can then validate the first experiment and continue on to adding the variations for the experiment's various sections. Once done with that, you can progress to the launch page, but do not launch the experiment at this time.

Then create the second experiment for the same test page and the second goal page. Do not place the control script for this second experiment in any pages. Do place the test page tracking script from the second experiment just after the test page tracking script for the first experiment. You will not want to modify the section scripts in any way. Do place the second experiment's goal tracking script at the bottom of your second goal page (or wherever you want the second goal to fire).

When you try to validate the second experiment, because the control script is not present on the test page, it will fail. Simply click continue in the validation dialog box, then click the continue button on the install page (which should now be enabled). You will be presented with a confirmation dialog because validation did not complete. You can also upload temporary test and goal pages for the second experiment to get past validation (this will be required for A/B experiments because the A/B experiment wizard does not allow for skipping validation at this time).

For the second experiment you will need to create the same number of variations for each section as in the first experiment. However, you will not need to fill in any alternative content; just leave the variation content alone for variations in the second experiment, as that content will not be used. You should name the variations in the second experiment the same as in the first. You should then launch the second experiment. It will not yet have any effect.

Then, place the following custom script immediately after the control script for the first experiment:

(function() {
function set_cookie(name, value, timeout) {
  if (_udn && _udn != "") value += ";domain=" + _udn;
  value += ";path=" + _utcp;
  if (timeout == 0 && _utimeout && _utimeout != "") timeout = _utimeout;
  if (timeout > 0) value += ";expires=" + (new Date((new Date).getTime() + timeout * 1000)).toGMTString();
  document.cookie = name + "=" + value;
}
function CopyExperiment(src_key, dst_key, dst_id, dst_g, cookie) {
  var cs = document.cookie.split(';');
  for (var i = 0; i < cs.length; i++) {
    var c = cs[i].split('=');
    var s = c[0];
    while (s.length > 1 && s[0] == ' ') s = s.substr(1);
    while (s.length > 1 && s[s.length - 1] == ' ') s = s.substr(0, s.length - 1);
    if (c.length == 2 && s == cookie) {
      var es = c[1].split('.');
      var d = 0;
      var dv = "";
      for (var j = 1; j < es.length; j++) {
        var ek = es[j].substr(10, 10);
        if (ek == dst_key) {
          d = j;
        } else if (ek == src_key) {
          if (dst_g.length > 0) {
            dv = dst_id + dst_key + ':' + dst_g + es[j].substr(22);
          } else {
            dv = dst_id + dst_key + es[j].substr(20);
          }
        }
      }
      if (dv.length > 0) {
        if (d == 0) {
          es.push(dv); // no entry for the destination experiment yet: append one
        } else {
          es[d] = dv;  // otherwise overwrite the existing destination entry
        }
        set_cookie(cookie, es.join('.'), 63072000);
      }
    }
  }
}
// Custom variables, adapt them to your own experiments
var src_key = '3923492669';
var dst_key = '4234772301';
var dst_id = '0000370338';
var dst_goal = '2';
CopyExperiment(src_key, dst_key, dst_id, dst_goal, '__utmx');
CopyExperiment(src_key, dst_key, dst_id, '', '__utmxx');
})();

You can obtain a copy of this script from the source of my example test page.

First, note the four customizable variables near the end of the script. You will need to modify these to adapt the custom script to your own experiments. The effect of this script is to copy experiment information from the GWO cookies for the first experiment to those for the second. The first three values must be quoted, zero-padded, 10-digit numbers; the last is a single quoted digit.
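To make the cookie surgery concrete, here is a minimal, standalone sketch of the transformation CopyExperiment applies to a single __utmx cookie entry. The entry layout is the one implied by the script above (id, key, goal, combination), and every digit here is a made-up example:

```javascript
// Sketch of the __utmx entry rewrite CopyExperiment performs: the source
// experiment's entry is duplicated under the destination experiment's ID,
// key, and goal number, preserving the chosen combination so both
// experiments serve the same variation.
function copyEntry(entry, srcKey, dstId, dstKey, dstGoal) {
  // entry layout: id(10) + key(10) + ':' + goal + ':' + combination...
  if (entry.substr(10, 10) !== srcKey) return null; // not the source experiment
  return dstId + dstKey + ':' + dstGoal + entry.substr(22);
}

// Hypothetical example: visitor was assigned combination 3 in the first experiment.
var src = '00001234563923492669:1:3';
var dst = copyEntry(src, '3923492669', '0000370338', '4234772301', '2');
// dst is now '00003703384234772301:2:3' — same combination, second experiment's goal.
```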

For the value of src_key, substitute the experiment key for your first experiment. You can find this key near the beginning of the first experiment’s control script:

function utmx_section(){}function utmx(){}
(function(){var k='3923492669',d=document,l=d.location,c=d.cookie;function f(n){
if(c){var i=c.indexOf(n+'=');if(i>-1){var j=c.indexOf(';',i);return c.substring(i+n.
length+1,j<0?c.length:j)}}}var x=f('__utmx'),xx=f('__utmxx'),h=l.hash;
d.write('<sc'+'ript src="'+
'http'+(l.protocol=='https:'?'s://ssl':'://www')+'.google-analytics.com'
+'/siteopt.js?v=1&utmxkey='+k+'&utmx='+(x?x:'')+'&utmxx='+(xx?xx:'')+'&utmxtime='
+new Date().valueOf()+(h?'&utmxhash='+escape(h.substr(1)):'')+
'" type="text/javascript" charset="utf-8"></sc'+'ript>')})();

Remember that, in the custom script above, this key must be enclosed in quotes and be a zero-padded, 10-digit number.
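If you ever need to produce the zero-padded form yourself, a tiny helper along these lines (a sketch in era-appropriate JavaScript, no library calls) does the trick:

```javascript
// Zero-pad a number to the 10 digits the custom script expects.
function pad10(n) {
  return ('0000000000' + n).slice(-10);
}

var dst_id = pad10(370338); // '0000370338'
```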

The dst_key is similar to the src_key; it can be found at the same place in the second experiment’s control script.

The dst_id and dst_goal are a bit more difficult to find. To obtain them, load the following URL into Firefox, or use the curl command (other browsers, like IE, will not show you the source of the returned script). Be sure to substitute the 10-digit utmxkey value with the experiment key of your second experiment (the value of dst_key).

This URL will return a chunk of JavaScript which, probably around the third line (which will be rather long), will contain something that looks like:


The single digit located between colons is the value you will need for dst_goal; in this example it is 2. Find this value and substitute it for the one in the custom script. Make sure you read it from the set_cookie call for __utmx, not the one for __utmxx.

Then, locate the 20-digit, zero-padded number just before the dst_goal digit. The first 10 digits of this number are what you’ll use for dst_id. Substitute this number for the one in the custom script.
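Putting the two rules together, a small sketch can extract both values from a cookie entry of that shape; the digits below are purely illustrative:

```javascript
// Parse an entry of the shape described above: a 20-digit zero-padded
// number (dst_id followed by dst_key), a colon, and the single goal digit.
function parseEntry(entry) {
  var m = entry.match(/(\d{10})(\d{10}):(\d):/);
  if (!m) return null;
  return { dst_id: m[1], dst_key: m[2], dst_goal: m[3] };
}

var parsed = parseEntry('00003703384234772301:2:0'); // illustrative digits
// parsed.dst_id === '0000370338', parsed.dst_goal === '2'
```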

You can now preview the first experiment to make sure it looks good, and then launch it. You should now have two new experiments, each of which is running. If you need to pause the multiple-goal experiment, you need only pause the first experiment.

Example of GWO’s __utmx cookie created when visiting sample test page.

When a visitor loads a test page, after the control script has run and has updated the GWO cookies with information about the first experiment (most importantly, which combination the visitor is to see), the custom script executes and updates the GWO cookies to mimic, for the second experiment, the combination chosen for the first. This way the two experiments stay in sync on a per-visitor basis. However, because the two experiments have different goals, visitors will be tracked differently with respect to those goals.
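If you want to verify this synchronization for yourself, a sketch like the following reads the combination stored for each experiment key, assuming the cookie layout used by the custom script (a domain hash followed by dot-separated entries of id, key, goal, and combination):

```javascript
// Given a __utmx cookie value, report the combination stored for each
// experiment key, so you can confirm the two experiments are in sync.
function combinations(cookieValue) {
  var out = {};
  var es = cookieValue.split('.');
  for (var j = 1; j < es.length; j++) {
    var parts = es[j].split(':');
    out[es[j].substr(10, 10)] = parts[2]; // experiment key -> combination
  }
  return out;
}

// Hypothetical cookie value with entries for both experiments:
var c = combinations('12345.00001234563923492669:1:3.00003703384234772301:2:3');
// c['3923492669'] === '3' and c['4234772301'] === '3' — the experiments agree.
```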

The report for the first experiment will show you how the first goal converts for all the users of the experiment, while the report for the second experiment will show you how the second goal converts for the exact same set of visitors.

Note that you will need to perform the preceding customization for every page which is tested. That is, anywhere you would have placed the control script, you need to also include the custom script above, and anywhere you would have placed a test page tracking script, you will need the test page tracking scripts for both experiments.

With respect to goal page tracking scripts, these are treated exactly as in any normal experiment. For example, you can trigger the goal tracking script in response to a click event. There is no need for customization of the goal tracking scripts.
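A click-triggered goal might look something like the following sketch; the tracker variable name (gwoTracker) follows the standard GWO conversion snippet, and the 10-digit key is a hypothetical placeholder, so adapt both to whatever your goal tracking script actually defines:

```html
<!-- Sketch: fire the goal conversion on a click instead of a page load.
     gwoTracker and the key are assumptions; use your own. -->
<a href="/download"
   onclick="try { gwoTracker._trackPageview('/1234567890/goal'); } catch(err) {}">
  Download
</a>
```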

Example Files

I have constructed an example using this technique. You should be able to visit the test and goal pages and see your cookies reflect the presence of the two experiments (which are actually running live):



Customizing A/B Experiments

If you want to use this technique for an A/B experiment, you will perform a very similar setup. The only modification you will need to make is to edit the control script of the first experiment in your test (A) page. At the end of the control script for an A/B experiment, you will find the following script:
utmx("url", 'A/B');


You will need to remove this call from the control script and place it after the custom script. This call performs the redirection to the alternative pages, and it must run after the editing of the GWO cookies for multiple goals.
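The resulting order of the scripts on the test (A) page would then look like this sketch (contents abbreviated; the redirect call shown is the one GWO’s A/B control script normally emits):

```html
<!-- 1. Control script for the first experiment (fetches siteopt.js). -->
<script> /* control script */ </script>
<!-- 2. Custom script that copies the GWO cookies to the second experiment. -->
<script> /* CopyExperiment custom script */ </script>
<!-- 3. Redirect call moved out of the control script, so it runs last. -->
<script>utmx("url", 'A/B');</script>
```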


I have noticed that the reports for the two experiments can sometimes show a different number of visitors for the same combinations. But I have found that over time the two reports synchronize.

Because the two experiments have different goals, they will probably not agree on the same winners, or report the same amount of confidence. You will have to choose which combination, on whichever experiment, satisfies you.

Also, because you are measuring two goals simultaneously, you should wait for a combination on either goal to become a particularly clear winner before acting on it. The reasoning is that you are watching two events for significance without correcting for it: one of two events has a better chance of looking significant than one alone, so wait a bit longer for a winner to take a particularly significant lead.
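The effect is easy to quantify under a simplifying independence assumption: with two goals each judged at the usual 5% level, the chance that at least one produces a spurious winner nearly doubles.

```javascript
// Familywise false-positive rate for two independent goal comparisons,
// each tested at a 0.05 significance level.
var alpha = 0.05;
var goals = 2;
var familywise = 1 - Math.pow(1 - alpha, goals);
// familywise is 0.0975 (to within floating point), versus 0.05 for one goal.
```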

Future Compatibility

This technique relies on the fact that, currently, the control script creates and edits the GWO cookies (actually, siteopt.js does this, but the control script fetches siteopt.js). I have been considering changing this so that, instead of the control script manipulating the cookies, the test page tracking script will do so. This probably won’t happen for a while, but when it does, you will have to modify this technique slightly.

How will you know when this may be the case? I will probably modify the control script to have a new version number. Here is the line from the control script which would be modified:
+'/siteopt.js?v=1&utmxkey='+k+'&utmx='+(x?x:'')+'&utmxx='+(xx?xx:'')+'&utmxtime='


The new version number will be 2 (or greater), indicating that the control script is different and may no longer be setting cookies. You can verify this by making the above request for siteopt.js for one of your new, running experiments (I recommend the second experiment, which you launch early) and checking whether the returned script still performs a set_cookie call. It is also possible that the control script’s version number will not change and siteopt.js will simply stop setting cookies for new experiments, so you’ll just need to test to see what the situation is.

When the control script no longer edits cookies, there may be three ways to continue to use this technique (assuming GWO has not already built in multiple goals):


  1. Place the custom script after the first experiment’s test page tracking script, which will have created the GWO cookies. Then, when the second experiment’s tracking script runs, the custom script will already have edited them properly.
  2. Revert to the last version of the control script which still sets cookies.
  3. Instead of editing the GWO cookies directly, the control script will probably edit the GA global variable to contain the new values to be assigned to the GWO cookies, which the tracking script will then set. In that case, edit the values in the GA global variable.

In any case, until this change takes place, I’ll not know what the correct workaround will be, so make sure you test this technique before using it. When I do effect the change, I’ll try to remember to update this article. Remind me if I’ve forgotten 🙂

More than Two Goals

This technique should also generalize to more than two goals. Simply create a new experiment for each additional goal and treat it in a similar fashion as the second experiment above.
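Concretely, the tail of the custom script might grow into a loop over the extra experiments, something like this sketch. CopyExperiment is stubbed here so the sketch is self-contained; on a real page it is the function from the custom script above, and all keys and IDs below are hypothetical placeholders.

```javascript
// Stub that records calls; in the real page, CopyExperiment is the
// cookie-editing function from the custom script.
var calls = [];
function CopyExperiment(srcKey, dstKey, dstId, dstGoal, cookie) {
  calls.push(cookie + ':' + dstKey);
}

var src_key = '3923492669';
var extra = [
  { key: '4234772301', id: '0000370338', goal: '2' },  // second experiment
  { key: '5555555555', id: '0000999999', goal: '3' }   // third experiment
];
// One __utmx and one __utmxx update per additional experiment.
for (var i = 0; i < extra.length; i++) {
  CopyExperiment(src_key, extra[i].key, extra[i].id, extra[i].goal, '__utmx');
  CopyExperiment(src_key, extra[i].key, extra[i].id, '', '__utmxx');
}
// calls.length === 4 after the loop.
```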

Stopping, Pruning, Creating Copies and Follow Up Experiments

To stop this kind of experiment, simply stop both experiments at the same time. Also, when making copies or creating follow up experiments, simply perform the same operation on both experiments.

When pruning, you need only prune combinations from the first experiment, though you can prune the same combinations from the second if you want. In general, any operation you perform should be applied to both experiments.

Report Discrepancies

In general, the number of impressions that one sees for variations in both experiments should be exactly the same. However, because the experiments are, essentially, being tracked independently, differences can arise. There are a variety of reasons for this which are, to a great extent, out of your control.

For example, if a visitor to your site closes the window immediately after loading the site, one of the tracking requests might get through, while the other is canceled.

However, the discrepancies should either reconcile after a couple of days, or remain small. For example, my example experiments currently show only a slight discrepancy. I would not be too concerned unless the reports start to differ by more than, say, 5%.

Interpreting Reports

So, what does one do if a variation is a winner in one report and a loser in another? Well, that’s up to you and how much you value each conversion. I would recommend running a multiple-goal follow-up experiment. The more metrics you measure, the greater the chance that one of them may be misleading.




Comments

Man, you rock, Eric! I attempted this a while back but gave up. Thanks a ton!

Thanks for this, Eric! We’ll try this method out on the next multi-goal test.

Wow, so glad I stumbled upon your blog.

I have been trying to monitor two goals (email sign-up within the cart and overall site conversion) in the following manner.

1) Integrate GWO and GA using ROI code

2) Set up exp with conversion script as an onclick event (newsletter signup)

3) Track overall site conversion by checking the index value for each combination.

This should allow for cleaner results.

While I’m here, I’ve got a question you may be able to answer. Are you using Tukey’s multiple comparisons for the reports? If not, you guys may be playing a little fast and loose when comparing different levels of a condition.

If I’m misunderstanding the stats used, feel free to enlighten me.

Keep up the great work!!

We do correct for multiple comparisons using the Bonferroni adjustment. We’ve looked into others, but they don’t offer that much more improvement over this conservative approach.

I tried this technique and it works great.

A few things you might come across while getting a working version of this test:

1) Check if the number of visitors is the same in both experiments; if not, there is something messed up in the JavaScript. This tests whether you have followed the process correctly.

2) While determining the dst_id and dst_goal, if you don’t find the value instantly, clear the cookies and try the said URL again in the browser. It will give out a big chunk of JavaScript; it is easier to do Ctrl+F and search for the value of your dst_key to find the dst_id that precedes it. Also close any open experiment pages; clearing cookies and restarting the browser helps to determine the dst_id value easily. Follow the steps as Erik has mentioned: start experiment 2 before experiment 1, or even before looking for the dst_id.

3) When you copy the experiment, the value of dst_id changes, so after copying, go through the described process again to determine the new dst_id and dst_goal values.

This was my personal experience. feel free to correct me.

Erik thanks again for the post.

Hi Erik,

Somehow things are good in my prototype test, but when I rolled the same procedure out to an ecommerce site, the number of page visitors differs by 40, which ideally should be the same for both tests. Is there any other way to check if the tests are in sync? I am wondering if it is because of lag that I tend to see different numbers for page visitors.

A couple of things I did differently, which I think shouldn’t affect this process: the names of my two experiments were quite different, not like goal1 and goal2, and I kept the conversion script for goal 2 in the header.

If a variation is not doing well on goal 1 but does well on goal 2, what would you recommend: keep it, drop it, or create follow-up tests?

Osris, I’ll answer your questions in the post so others will more easily see them.

Thanks Erik, that will be really very helpful.

Erik, you have one of the most amazing blogs I’ve read recently… Thanks a lot for sharing your knowledge and intelligence 🙂

I will try this technique in my next experiment.

Keep up your great work!

Interesting. Often we have weights for each success event; is there any way to use some measure of total value as the dependent variable, such that we are maximizing total returns?

Related to that, can Google’s website optimizer be set up to be adaptive? I think what is often desired is to select the best combination of attributes in order to max returns/min regret based on some specific reward function. While hypothesis testing is great for assessing experiments, it is not normally the best approach for optimizing action selection since it follows a uniform exploration policy in the action space.
Are there any plans to include some sort of adaptive learning behavior?

Matt Gershoff

The idea of comparing different arms of an experiment by using a function over multiple goals is one we’ve considered for GWO. The first step is to get multiples goals into the product 🙂

With respect to your second question, we’ve also considered the explore versus exploit aspects of experimentation. The idea of giving more traffic to arms which seem to perform better in the hopes that the average conversion rate (or whatever metric which one wants to optimize) during the experiment becomes higher is one we are considering as well.

Cool. Thanks!

Matt Gershoff

Hi, I am new here. Is there a way to keep the test page layout the same for one visitor each time the page refreshes, or when they come back to the page after moving to other pages?
Because we want to keep the layout consistent within one session. Thanks!

Hi Eric,
Thanks for this post. Exactly what we needed!
We are having a problem in the implementation, though. Our test page does not show any conversions from the second experiment onward (we are setting up about 5 experiments). The first experiment is fine.
I know it is hard for you to comment without seeing the code, but can you think of any obvious issues that might cause this? Thanks.

Hard to say what the problem is. I need to write an article about debugging techniques.

Not sure if I understand this correctly, so this may be a silly question. I would like to test two sales funnels – each with two goals – to see how the conversions are for goal 1 and goal 2.

I would like the user to initially be shown content that is either of two landing pages (L1 or L2). If the user gets to L1, then the goals would be S1 and P1. If the user gets to L2, then the goals would be S2 and P2.

Is this even possible? Are the preceding steps how to do this?

Andrew D. – The situation you describe is possible. The key is that the two goals for the two funnels should be instrumented with the same conversion tracking scripts. That is, S1 and S2 conversions use the same script from the first experiment, and P1 and P2 use the same script from the second experiment.

posted by Blogger Eric Vasilik : September 10, 2009 1:05 PM
Hi Erik,
we are running a multi-goal A/B test for 2 landing pages. Everything is working fine except 2 things:
1. For some individual experiments, the control page is showing more visitors than the test page. I understand that the visitor count between pages will not be exactly 50-50, but I would expect only a small deviation from that; however, the difference we see is about 60%. Why would this be?
2. I asked this before, and you replied that it is difficult to say without seeing the code, which makes perfect sense. Our first experiment shows much higher visitor counts than the other experiments. I think it is picking up the visitor counts from the experiments that come after it (in terms of the individual experiment tracking scripts placed after the first one).
Just wondering if I can e-mail the code for you to look at. This is still not resolved and I am under the gun to provide an explanation 🙂 Thanks.

Thank you so much for your quick answer. You have created a wonderful resource here!

As always – great info!

Personally, I prefer to integrate the variation number into GA and just use the GA goals (yes, I need to calculate the results myself).

Keep up the great work!