Experiments

Last Updated: Nov 30, 2023

The Experiments feature gives users the power to create variants of a Page and compare their performance against defined goals. Once an Experiment is finished, the user can then promote the winning variant to become the new main version of the Page.

Through the use of Experiments, you can refine your site's presentation through user interaction, testing updates for relative engagement levels and using that data to help decide which changes best serve both you and your audience.

Note: While the feature is present for all enterprise customers using dotCMS 23.10 or later, it remains inactive until dotCMS manually configures the Analytics App for the customer; this is not configured by default. To enable Experiments and the Analytics App, contact your customer success manager.

Experiments Pane.

Usage

To begin an Experiment, access any Page in one of its viewing modes — such as through the Pages Tool.

On the right-hand menu bar, the button labeled A/B with a beaker icon takes you to the Page's Experiments pane. There, two controls are present at all times.

First, a dropdown allows filtering by Experiment state; check as few or as many states as you wish to view the corresponding Experiments.

Experiments filter.

Secondly, across from the dropdown menu is the Create a New Experiment button; click this and enter a name to begin configuring an Experiment.

Create new experiment.

Each Experiment consists of several component parts: variants, goals, a traffic allocation scheme, and a schedule. Even before an Experiment finishes, a results pane provides summary data visualizations for the data collected so far.

Variants

Every Experiment must have one or two variants defined alongside the original Page. To add a new variant, click the Add New Variant button, then assign it a name.

Variants listing.

Once the named variant appears in the list, you may assign that variant a weight, which determines what percentage of the traffic served the Experiment will randomly receive that variant. By default, all variants are given equal weight, but the user may instead assign custom weight percentages.
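As a rough illustration of how weights drive variant selection (not dotCMS's actual implementation), the following Python sketch performs a weighted random pick; the variant names and weights are hypothetical:

```python
import random

# Hypothetical variants and weights (summing to 100);
# dotCMS assigns equal weights by default.
variants = {"Original": 50.0, "New Hero Banner": 50.0}

def pick_variant(variants: dict[str, float]) -> str:
    """Randomly select a variant in proportion to its weight."""
    names = list(variants)
    weights = list(variants.values())
    return random.choices(names, weights=weights, k=1)[0]

print(pick_variant(variants))
```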

Each variant may also be edited independently; clicking Edit will take you to Edit Mode, treating the variant as an entirely separate Page. You may add new content, modify or remove existing content, or change the Page's layout without affecting the original Page.

Goals

An Experiment must also have a goal: the objective of the Experiment, according to which the results are evaluated.

A goal can be established from a number of available metrics:

Metric | Parameters | Description
Bounce Rate | N/A | A Bounce occurs when the Page in the Experiment is both the first and last Page of the session: the user landed on the Page and did not click a link to navigate further, or perform any other activity.
Exit Rate | N/A | Similar to Bounce Rate; an Exit takes place when the Page in the Experiment is simply the last Page of the session, rather than the last and only one.
Reach Page | A URL or partial-URL pattern | Gauges success according to whether a user visits a particular Page in the course of their session.
URL Parameter | A URL parameter or partial pattern | Tracks whether a user accessed the Page with the appropriate URL query parameter present.

Goals selector.
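For the two pattern-based goals, the evaluation can be thought of as matching each visited URL against the configured pattern or query parameter. The sketch below is purely illustrative, with hypothetical URLs and patterns, and is not dotCMS's goal-evaluation code:

```python
from urllib.parse import urlparse, parse_qs

def reach_page_goal_met(visited_urls: list[str], pattern: str) -> bool:
    """Reach Page: did any URL in the session match the (partial) pattern?"""
    return any(pattern in url for url in visited_urls)

def url_parameter_goal_met(url: str, param: str, expected: str | None = None) -> bool:
    """URL Parameter: was the Page accessed with the given query parameter present?"""
    params = parse_qs(urlparse(url).query)
    if param not in params:
        return False
    return expected is None or expected in params[param]

session = ["/products", "/products/widget", "/checkout/thank-you?utm_campaign=spring"]
print(reach_page_goal_met(session, "/checkout/thank-you"))            # True
print(url_parameter_goal_met(session[-1], "utm_campaign", "spring"))  # True
```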

Traffic Allocation

Traffic allocation is an optional element of an Experiment. It is the percentage of user traffic that will take part in the Experiment.

This works in tandem with the weights described under Variants. When traffic arrives, the traffic allocation determination is performed first; only if the traffic is determined to take part in the Experiment are the variant weights factored in.

In other words, if 50% of traffic is allocated to the Experiment, and the Experiment consists of a 50% weighted split between an original and a variant, then any given visitor has a 25% chance to see the variant instead of the original.

Traffic allocation.
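The same two-stage logic can be expressed as a short simulation. The sketch below uses the hypothetical 50/50 numbers from the example above and is illustrative only:

```python
import random

TRAFFIC_ALLOCATION = 0.50   # 50% of all traffic enters the Experiment
VARIANT_WEIGHT = 0.50       # 50% of Experiment traffic is served the variant

def sees_variant() -> bool:
    """Two-stage decision: allocation first, then variant weight."""
    in_experiment = random.random() < TRAFFIC_ALLOCATION
    return in_experiment and random.random() < VARIANT_WEIGHT

# Over many visitors this converges on 0.5 * 0.5 = 0.25 (25%).
trials = 100_000
share = sum(sees_variant() for _ in range(trials)) / trials
print(f"Share of visitors seeing the variant: {share:.2%}")
```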

Scheduling

Another optional element, scheduling allows the user to define start and end dates and times for the Experiment. You may choose to set a start, an end, both, or neither. Each combination yields slightly different behavior.

Start | End | Result
Set | Set | Allows a maximum Experiment length of 90 days.
Set | Not set | An end date & time will automatically be set to give the Experiment the default length of 14 days.
Not set | Set | A start date & time will automatically be set to give the Experiment the default length of 14 days.
Not set | Not set | The Experiment must be commenced and ended manually.

Scheduling selectors.
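As a rough sketch of how the combinations in the table above play out (using the default and maximum durations described under Configuration Properties below), missing dates can be thought of as being filled in like this; the helper function is hypothetical:

```python
from datetime import datetime, timedelta

DEFAULT_DURATION = timedelta(days=14)  # EXPERIMENTS_DEFAULT_DURATION
MAX_DURATION = timedelta(days=90)      # EXPERIMENTS_MAX_DURATION

def resolve_schedule(start: datetime | None, end: datetime | None):
    """Illustrative only: fill in missing dates per the table above."""
    if start and end:
        if end - start > MAX_DURATION:
            raise ValueError("Experiments may run for at most 90 days")
        return start, end
    if start and not end:
        return start, start + DEFAULT_DURATION   # end auto-set, 14 days later
    if end and not start:
        return end - DEFAULT_DURATION, end       # start auto-set, 14 days earlier
    return None, None  # both unset: start and end the Experiment manually

print(resolve_schedule(datetime(2023, 11, 1), None))
```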

States

Every Experiment is always in one of the following states:

  • Draft
  • Scheduled
  • Running
  • Ended
  • Archived

When you view the Experiments list for any page, by default it will show all Experiments for the Page in any of these states except Archived. You can change the filter on the Experiments list to show or hide any of these states (including Archived).

State indicator.

Actions

Users can perform a number of actions on an Experiment once its mandatory configuration is complete. Actions may move the Experiment into a different state. The actions available for an Experiment can be viewed through the “hamburger” button at the right side of the Experiments list.

Actions menu.

Action | Required State | New State | Description
Edit/View Configuration | N/A | N/A | Allows you to view the configuration of the Experiment, or edit it if the Experiment has not yet begun.
View Results | N/A | N/A | Brings you to the result charts once an Experiment has accrued data.
Start Experiment | Draft / Scheduled | Running | Begins the Experiment immediately, for the default length of 14 days unless otherwise specified.
Schedule Experiment | Draft | Scheduled | Allows scheduling of the Experiment per the Scheduling section above.
End Experiment | Running | Ended | Ends the Experiment. Allows immediate viewing of results, promotion of the winner, etc.
Abort Experiment | Running | Draft | Halts the Experiment and discards all data gathered during its run.
Archive | Ended | Archived | Archives the Experiment; this removes it from the default list and renders its results no longer viewable.
Cancel Scheduling | Scheduled | Draft | Removes scheduling from the Experiment.
Push Publish | N/A | N/A | See the Push Publishing section below.
Add to Bundle | N/A | N/A | See the Push Publishing section below.
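The state-changing actions in this table form a small state machine. The following sketch is illustrative only; the transition map is taken from the table above:

```python
# Maps each state-changing action to (allowed current states, resulting state).
TRANSITIONS = {
    "Start Experiment":    ({"Draft", "Scheduled"}, "Running"),
    "Schedule Experiment": ({"Draft"}, "Scheduled"),
    "End Experiment":      ({"Running"}, "Ended"),
    "Abort Experiment":    ({"Running"}, "Draft"),
    "Archive":             ({"Ended"}, "Archived"),
    "Cancel Scheduling":   ({"Scheduled"}, "Draft"),
}

def apply_action(state: str, action: str) -> str:
    """Return the new state, or raise if the action is unavailable in this state."""
    allowed, new_state = TRANSITIONS[action]
    if state not in allowed:
        raise ValueError(f"{action!r} is not available in the {state} state")
    return new_state

print(apply_action("Draft", "Schedule Experiment"))  # Scheduled
```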

Push Publishing

Experiments are not push published when the Page is pushed. To push an Experiment, you must explicitly push the Experiment itself, using the blue hamburger button at the top of the Experiment.

When you push an Experiment, the state of all Experiments on the Page is synchronized from the sender to the receiver. For example, if you stop one Experiment on the sender, start a new Experiment, and then push the new Experiment, the new Experiment will be pushed and started on the receiver, and the old one will likewise be stopped there.

Push Publishing Dependencies

The Page is a push publishing dependency of the Experiment. If an Experiment is pushed but the Page has not already been pushed, the Page will be pushed with the Experiment. However, the Experiment is not a dependency of the Page, so the Experiment is never pushed unless it's explicitly added to the bundle.

Pushing Experiment Data

The data for an Experiment is shared among all servers the Experiment exists on.

Since the state of the Experiment is always synchronized between the sender and receiver, this means that the Experiment must be running on the sender while it’s running on the receiver.

This means that if a user visits a page with an Experiment on the front end of the sender, they’ll add data to the experiment that’s indistinguishable from data collected from the same page on the receiver — i.e., the datasets merge during synchronization.

Note: This arrangement can result in users on the sender being able to view the data being collected on the receiver in real time, regardless of permissions on the receiver.

Results Pane

The Results Pane provides data visualization in the form of interactive charts and a summary data table, which offer different views of the same data.

Results pane.

Along the top of the Results Pane, you can see the current “winner” of the Experiment, along with its goal, run dates, and the number of sessions recorded so far. These figures can be refreshed via the button immediately to the right.

The default chart, Daily Results, shows the conversion rate for each variant on each day. Hovering your cursor over the chart displays the value at the indicated data point.

Daily Results chart.
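The conversion rate plotted for each day is simply conversions divided by sessions for that day and variant. A minimal sketch with hypothetical numbers:

```python
# Hypothetical daily session and conversion counts for one variant.
daily = [
    {"date": "2023-11-01", "sessions": 240, "conversions": 18},
    {"date": "2023-11-02", "sessions": 310, "conversions": 29},
]

for day in daily:
    rate = day["conversions"] / day["sessions"]
    print(f'{day["date"]}: {rate:.1%} conversion rate')
```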

Bayesian Results

The second chart tab displays Bayesian Results for the original and each variant.

Bayesian inference is an advanced method of analyzing data in which prior beliefs are repeatedly updated with observed data, via the likelihood, to generate an increasingly accurate posterior probability. Bayesian inference can provide a high degree of confidence about which variant will perform better, and can do so in a relatively short time.

Bayesian inference operates from a different philosophical starting point than frequentist inference, which emphasizes long-run frequencies of repeated events with fixed parameters.

Bayesian Results chart.

The Bayesian chart shows curves reflecting probability distributions. In calculating these, the Experiments feature automatically chooses a value for the prior that assumes no knowledge of what the actual conversion rate might be.

In the data table, the two rightmost numeric columns are both calculated along Bayesian lines (see the sketch after this list):

  • Probability to be Best gives the probability that each variant is the one likely to result in the highest conversion rate;
  • Conversion Rate Range specifies a range indicating the lowest and highest values the actual conversion rate is likely to have, with a 95% confidence interval.
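To make these two columns concrete, here is a minimal Monte Carlo sketch, not dotCMS's implementation. It assumes a Beta-Binomial model with a uniform Beta(1, 1) prior (one common choice that assumes no knowledge of the conversion rate), and the conversion counts are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical results: (conversions, sessions) per variant.
results = {"Original": (48, 1000), "Variant A": (63, 1000)}

samples = {}
for name, (conversions, sessions) in results.items():
    # Posterior with a uniform Beta(1, 1) prior: Beta(1 + successes, 1 + failures).
    samples[name] = rng.beta(1 + conversions, 1 + (sessions - conversions), size=100_000)

# Probability to be Best: how often each variant has the highest sampled rate.
stacked = np.column_stack(list(samples.values()))
best_counts = np.bincount(np.argmax(stacked, axis=1), minlength=len(samples))
for name, count in zip(samples, best_counts):
    print(f"{name}: P(best) = {count / stacked.shape[0]:.1%}")

# Conversion Rate Range: central 95% interval of each posterior.
for name, s in samples.items():
    low, high = np.percentile(s, [2.5, 97.5])
    print(f"{name}: conversion rate likely between {low:.2%} and {high:.2%}")
```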

Experiment Data Is Immutable

Due to the nature of the statistical method used, once data is collected for an Experiment, it cannot be removed from that Experiment without fully invalidating or resetting the Experiment (as with the Abort Experiment action). For example, you cannot elect to change the start date after the fact to exclude some test data you generated at the beginning of the Experiment.

If you collect data that you do not wish to include in the Experiment, the only way to remove it is to abort and restart the Experiment.

Configuration Properties

Certain Experiment settings can be adjusted globally by editing configuration properties.

Duration

Experiments are limited to a specific minimum and maximum duration, and there is a default duration that an Experiment will use if a specific end date has not been scheduled. Each of these values may be changed.

Their default values, integers representing a number of days, are as follows:

EXPERIMENTS_MIN_DURATION=8
EXPERIMENTS_MAX_DURATION=90
EXPERIMENTS_DEFAULT_DURATION=14

Auto-Inject

In order for Experiments to track conversions on a Page, a block of JavaScript must be included in the Page. There are two ways to include the code: manually — by editing the Theme, Template, or Page — or automatically via the ENABLE_EXPERIMENTS_AUTO_JS_INJECTION property.

DOT_ENABLE_EXPERIMENTS_AUTO_JS_INJECTION=true

If the ENABLE_EXPERIMENTS_AUTO_JS_INJECTION property is set to true, then the appropriate code will be added to ALL pages on the dotCMS server — even pages which do not have an Experiment on them, and which don’t have any bearing on the conversion for any Experiments.

If this property is set to false (the default value), then the user must manually add their code to both the Page the Experiment is running on, and any other Pages which may be involved in the conversion — for instance, the Page a user is intended to reach with the Reach Page goal.

Note: The auto-inject option is disabled by default; depending on the sorts of code present or expected on a certain Page, automatic injection may in some cases result in errors. Test before enabling.

Lookback Window

The lookback window is the length of time (in days, as an integer) a site visitor will be remembered within the Experiment. A remembered user will be served the same variant they were originally served on their first Experiment visit.

After the lookback window expires, a user visiting the Page may be routed to a different variant than they were the first time.

EXPERIMENTS_LOOKBACK_WINDOW=10
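The effect of the lookback window can be sketched as remembering each visitor's assignment along with a timestamp; the code below is illustrative only and does not reflect dotCMS's actual mechanism:

```python
import random
import time

LOOKBACK_WINDOW_DAYS = 10  # EXPERIMENTS_LOOKBACK_WINDOW
_assignments: dict[str, tuple[str, float]] = {}  # visitor id -> (variant, assigned_at)

def assign_variant(visitor_id: str, variants: list[str]) -> str:
    """Serve the remembered variant while the lookback window lasts; otherwise reassign."""
    now = time.time()
    remembered = _assignments.get(visitor_id)
    if remembered:
        variant, assigned_at = remembered
        if now - assigned_at < LOOKBACK_WINDOW_DAYS * 86_400:
            return variant  # still within the window: same variant as before
    variant = random.choice(variants)  # first visit or window expired: (re)assign
    _assignments[visitor_id] = (variant, now)
    return variant

print(assign_variant("visitor-123", ["Original", "Variant A"]))
```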
