This document describes the advanced features and implementation considerations for experiments in Google Analytics.
The Google Analytics Content Experiments Framework enables you to test almost any change or variation to a website or app to measure its impact on a specific objective; for example, increasing goal completions or decreasing bounce rates. This allows you to identify changes worth implementing based on the direct impact they have on the performance of your property.
To help you understand how Google Analytics Experiments work, the following sections discuss concepts and considerations for:
- Configuration — creating and configuring experiments.
- Collection and Processing — choosing variations and sending experiment data to Google Analytics.
- Reporting — experiment dimensions and metrics that are available for reporting.
To run a Google Analytics experiment, you first need to consider a few conceptual questions, such as what to test and which experiment variations you plan to show users. Next, you need to decide whether experiment optimization decisions will be managed by Google Analytics or by an external service, and based on that decision, determine the experiment objective you want to optimize. Finally, once you create and run the experiment, you will have an experiment ID and a list of variations with IDs that can be used to serve the experiment to users.
The following concepts need to be addressed before creating a new experiment:
- Defining the Variations
- Setting the Serving Framework
- Defining the Objective Goals/Metrics
- Running the Experiment
Defining the Variations
Defining variations depends on what you want to test. It could be a single element on a website page, an entire page, the text size of an offer on a kiosk screen, or the result set of a database query. It is up to you how to define a variation. The only requirement when creating an experiment in Google Analytics is that you provide a name for each variation.
Google Analytics represents variations as a list, using a 0-based index, where the first element in the list is the "Original". The list of variations for an experiment is always in the same order and cannot be modified once an experiment is running. The index of a variation is used as the variation ID when sending experiment data to Google Analytics.
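To make the indexing concrete, here is a minimal Python sketch of a variations list as described above: the first element (index 0) is the "Original", and each variation's 0-based index is the value later sent as the variation ID. The variation names are hypothetical, chosen only for illustration.

```python
# Hypothetical variations list: index 0 is always the "Original",
# and the list order never changes once the experiment is running.
variations = [
    {"name": "Original"},          # index 0: the control
    {"name": "Larger headline"},   # index 1
    {"name": "Blue button"},       # index 2
]

def variation_index(variations, name):
    """Return the 0-based index of a variation; this index is the
    variation ID used when sending experiment data during collection."""
    for i, variation in enumerate(variations):
        if variation["name"] == name:
            return i
    raise ValueError("unknown variation: %s" % name)
```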
Setting the Serving Framework
Optimization decisions for an experiment can be handled by Google Analytics or externally by you or a service. You can configure this option by setting the servingFramework field when you create an experiment using the Management API. Setting the serving framework also affects whether you need to define an experiment objective, as described below.
The serving framework and use case for each option are:
REDIRECT (default) – You intend to rely on Google Analytics to perform client-side redirects to show users the chosen variation. This is the same serving framework used when experiments are created using the Google Analytics web interface, so the only difference in this case is that you are creating the experiment via the Management API.
API – You will choose variations and handle displaying variations to users, but will still rely on Google Analytics to handle experiment optimizations. For example, this should be used if you want to eliminate client-side redirects but still rely on Google Analytics to run experiments. This is typical when the Management API is used to choose variations server-side in cases where dynamic content is being served to users.
EXTERNAL – You will choose variations, handle experiment optimization, and only report the chosen variation to Google Analytics. For example, this should be used by 3rd-party optimization platforms that want to integrate with Google Analytics for reporting purposes. In this case, the Google Analytics statistical engine will not run.
For additional details on the servingFramework property, see the Experiments API Reference.
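As a rough illustration of how the serving framework fits into experiment creation, the following Python sketch builds a request body of the kind you would pass to the Management API's experiments insert operation. The field names follow the Experiments API Reference, but the specific values (experiment name, variation names, default objective metric) are illustrative assumptions, not a definitive implementation.

```python
# Hypothetical sketch of an experiment resource body for the Management
# API. The servingFramework value determines whether Google Analytics or
# an external service handles optimization decisions.
def build_experiment_body(name, serving_framework="REDIRECT",
                          objective_metric=None):
    if serving_framework not in ("REDIRECT", "API", "EXTERNAL"):
        raise ValueError("unknown servingFramework: %s" % serving_framework)
    body = {
        "name": name,
        "status": "READY_TO_RUN",
        "servingFramework": serving_framework,
        "variations": [{"name": "Original"}, {"name": "Variation 1"}],
    }
    # When Google Analytics manages optimization, an objective is
    # required; EXTERNAL experiments handle the objective themselves.
    if serving_framework != "EXTERNAL":
        body["objectiveMetric"] = objective_metric or "ga:bounces"
    return body
```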
Defining Objective Goals/Metrics
If the servingFramework of an experiment is set to EXTERNAL, then the experiment objective and decisions should be handled externally by you or another service. In this case you are not required to define an objective when creating an experiment.
If the servingFramework is set to REDIRECT, then the responsibility of handling experiment decisions is managed by Google Analytics. In this case you are required to define an experiment objective to optimize.
This objective is what drives the experiment towards a final outcome and is
why it's important that Google Analytics tracking is properly implemented
on the property for experiments to work. If Google Analytics isn't implemented
then it will be impossible to determine when an objective has been reached.
The objective can be to minimize or maximize a goal or a predefined metric such as time on site or pageviews.
If you're using a goal as an experiment objective, note that the goal must exist before it can be used. Also, goals in the experiments web interface are shown with a name and number that follow the pattern Goal Name (Goal (n) Completions), where (n) is the goal number (from 1-20). However, when managing experiments using the API, goals are represented with the metric ga:goal(n)Completions. For example, to use Goal 2 Completions with the API, set the experiment objective as ga:goal2Completions.
For a list of metrics that can be used as experiment objectives, see the Experiments Reference.
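The goal-number-to-metric mapping above is mechanical, so a small helper can produce the API metric name and guard the valid range. This is a sketch of that convention, not part of any official client library:

```python
def goal_objective_metric(n):
    """Return the API metric name for goal n, following the
    ga:goal(n)Completions pattern described above (n from 1-20)."""
    if not 1 <= n <= 20:
        raise ValueError("goal number must be between 1 and 20")
    return "ga:goal%dCompletions" % n
```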
Running the Experiment
Once an experiment is properly configured and running, you will have an experiment ID and a list of variations, each with its own variation ID. Using this information it is now possible to begin serving an experiment and sending data to Google Analytics.
The next step is to start sending data to Google Analytics using a collection API/SDK.
Collection and Processing
As users interact with a property and are exposed to a running experiment, the decision about which variation to show a user depends on whether or not they have been previously exposed to the experiment. If they are returning, show the variation previously chosen; otherwise, if they are new to the experiment, choose and show a variation. Once it's been determined which variation to show the user, the experiment ID and variation ID are sent to Google Analytics along with a hit such as a pageview, event, or transaction.
Finally, for collection and processing to work properly for experiments, Google Analytics has to be implemented on the property so that the experiment objective is measured. This ensures that if a user is exposed to an experiment and at some point reaches the experiment objective, the conversion will be attributed to the correct experiment ID and the variation shown to the user.
The following concepts need to be addressed to measure experiments in Google Analytics:
- Choosing a Variation for New Users
- Choosing a Variation for Returning Users
- Sending Experiment Data to Google Analytics
- Experiment Conversions and Outcomes
Choosing a Variation for New Users
When a new user is exposed to an experiment running on your property, you need to choose a variation (if any) to show the user. The responsibility for this decision depends on which servingFramework is configured for the experiment.
REDIRECT – In this case, Google Analytics makes the decision and uses a multi-armed bandit approach to evaluate and manage experiments.
EXTERNAL – In this case, the Google Analytics statistical engine is disabled and it is your responsibility to evaluate and manage experiment decisions. In other words, any approach can be used to choose variations for users.
In either case, the final outcome of this step is that a variation will be chosen for the user and you will have the variation ID. The next step is to send the experiment and variation ID to Google Analytics.
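When you are responsible for the choice (the EXTERNAL case, or the API case where you select variations yourself), any approach works. One common approach, sketched here in Python as an assumption rather than anything prescribed by Google Analytics, is to hash a stable anonymous client ID: the choice is then deterministic, so the same user always maps to the same variation index without any server-side state.

```python
import hashlib

# Hypothetical deterministic chooser: hash the experiment ID plus an
# anonymous client ID and reduce it modulo the number of variations.
# The returned value is the 0-based variation index.
def choose_variation(client_id, experiment_id, num_variations):
    key = ("%s:%s" % (experiment_id, client_id)).encode("utf-8")
    digest = hashlib.sha256(key).hexdigest()
    return int(digest, 16) % num_variations
```

Because the hash is stable, repeated calls for the same user and experiment always return the same index.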
Choosing a Variation for Returning Users
When users interact with your property and are exposed to an experiment for the first time, a choice is made to decide what to show the user. However, once that variation choice is made for the user it should remain the same for subsequent exposures to the same experiment. For this reason it is necessary to anonymously store experiment details for a user in a secure but accessible location. When the user returns to the property you can retrieve the ID of the variation previously chosen for the user and take the appropriate action.
Since you already have the variation ID for the user, the next step is to send the experiment and variation ID to Google Analytics.
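The store-then-reuse pattern described above can be sketched in a few lines. Here a plain dictionary stands in for whatever anonymous, durable store you actually use (a first-party cookie, or a database keyed by client ID); the class and method names are hypothetical.

```python
# Hypothetical sketch: remember the variation chosen for each
# (user, experiment) pair so returning users see the same variation.
class VariationStore:
    def __init__(self):
        self._store = {}  # stand-in for a cookie or database

    def get_or_choose(self, client_id, experiment_id, choose):
        key = (client_id, experiment_id)
        if key not in self._store:
            self._store[key] = choose()   # new user: pick a variation
        return self._store[key]           # returning user: reuse it
```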
Sending Experiment Data to Google Analytics
For example, if a website has an experiment running on a single page of the website, experiment data needs to be sent to Google Analytics each time a user is exposed to that page, even if they are a returning user. For pageviews on other parts of the site where no experiments are running, it is not necessary to send experiment data to Google Analytics. Keep in mind that returning users should be shown the same variation each session, and that in all cases you need to ensure that Google Analytics tracking is implemented properly to measure when objectives are reached by users.
Once an experiment variation is chosen the following experiment data needs to be sent to Google Analytics:
- Experiment ID — the ID of the experiment the user has been exposed to.
- Variation ID — the index of the chosen variation shown to the user.
The experiment ID and variation ID are sent to Google Analytics as parameters attached to other hits, like pageviews, events, or ecommerce transactions. For details on how to send experiment data along with hits during collection, review the experiments developer guide for the collection API/SDK you're using.
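For instance, with the Measurement Protocol the experiment ID and variation index travel as the `xid` and `xvar` parameters on an ordinary hit. The sketch below builds a pageview payload carrying those parameters; the tracking ID, client ID, and page path are placeholder assumptions, and a real implementation would POST this payload to the Measurement Protocol endpoint.

```python
from urllib.parse import urlencode

# Hypothetical sketch of a Measurement Protocol pageview payload that
# carries experiment data alongside the normal hit parameters.
def pageview_with_experiment(tracking_id, client_id, page,
                             experiment_id, variation_index):
    params = {
        "v": "1",                      # protocol version
        "tid": tracking_id,            # property tracking ID, e.g. UA-XXXX-Y
        "cid": client_id,              # anonymous client ID
        "t": "pageview",               # hit type
        "dp": page,                    # document path
        "xid": experiment_id,          # experiment ID
        "xvar": str(variation_index),  # index of the variation shown
    }
    return urlencode(params)
```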
The processing of experiment data continues until a winning variation is declared by Google Analytics, or the experiment is manually stopped or expires with no clear winner.
Experiment Conversions and Outcomes
The process of determining the outcome of an experiment depends on the servingFramework configured for the experiment:
EXTERNAL – For this case an experiment objective is not required when creating an experiment in Google Analytics, since this should be handled by you or an external service. The outcome of the experiment is not managed by Google Analytics, so it is up to the external service to make decisions on the outcome of the experiment.
REDIRECT – In this case Google Analytics manages the experiment, so it will evaluate the performance of variations and determine the outcome of the experiment. It requires that the experiment have a defined objective. In addition, conversion events and metrics related to the objective need to be measured. For example, if the experiment objective is an event-based goal, then you need to make sure event tracking is properly implemented on the property so that this goal can actually be achieved. This also applies to objective metrics. For example, a transaction objective requires ecommerce tracking to be implemented.
When a goal/objective is reached or completed by a user, Google Analytics will automatically determine, during processing, which active experiments the user has been exposed to and if a match is found, the conversion will be attributed to the experiment and the specific variation shown to the user.
If experiments reports are not being populated with data you should confirm that Google Analytics tracking is properly implemented on your property and that goals and conversions are being measured as expected.
Experiments APIs and Libraries
There are several different APIs and libraries available to help you perform the various operations that are required as part of an advanced experiments implementation:
The Management API can be used to create/insert new experiments, update existing experiments, retrieve a list of experiments, get the details of a single experiment, and delete experiments. The insert, update, and delete operations require the authorized user to be an admin of the property. For more information see the Management API Reference and Developer Guide.
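The experiment operations listed above all address experiments nested under a view (profile) of a web property. As a sketch of that resource hierarchy, the helper below builds the REST path only; the account, property, and profile IDs are placeholders, and in practice you would make these calls through an authorized client such as the google-api-python-client library rather than constructing URLs by hand.

```python
# Hypothetical sketch: experiments live under
# accounts -> webproperties -> profiles in the Management API.
BASE = "https://www.googleapis.com/analytics/v3"

def experiments_path(account_id, web_property_id, profile_id,
                     experiment_id=None):
    path = ("%s/management/accounts/%s/webproperties/%s/profiles/%s"
            "/experiments" % (BASE, account_id, web_property_id, profile_id))
    if experiment_id:
        # A specific experiment, e.g. for get/update/delete operations.
        path += "/" + experiment_id
    return path
```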
As part of the collection process for experiments, you will also need to ensure that Google Analytics measurement is properly implemented on your property.