AB tests

The Didomi SDK allows you to run AB tests to compare the consent rates and other metrics of two groups of populations. We support running AB tests through your preferred vendor or through the Didomi platform.

Run AB tests with Didomi

The Didomi SDK allows you to test the performance of a variant of your configuration (notice format, content, etc.). To do so, our SDK will separate your users into two groups:

  • Control group (A): control users will be shown your normal consent notices and the current configuration of your SDK will be applied.

  • Test group (B): test users will be shown the variant of your SDK configuration that you want to test.

To validate whether the outcome of your AB test is positive or negative, observe the metrics (consent rate, bounce rate, etc.) of your test group and compare them with those of your control group. If your test group performs better than your control group, the test is positive and indicates that the configuration applied to your test group is more efficient than the one applied to your control group.
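As a simple illustration of that comparison (the numbers below are made up for the example, not real Didomi metrics), the relative lift of a test group over a control group can be computed as:

```javascript
// Illustrative only: computing the relative consent-rate lift of a
// test group over a control group. All numbers are made up.
function consentRate(consents, users) {
  return consents / users;
}

function relativeLift(controlRate, testRate) {
  return (testRate - controlRate) / controlRate;
}

const controlRate = consentRate(600, 1000); // 60% consent rate in the control group
const testRate = consentRate(165, 250);     // 66% consent rate in the test group
const lift = relativeLift(controlRate, testRate); // ~0.10, i.e. a 10% relative increase
```

A positive lift suggests the test configuration performs better; in practice you would also check that the difference is statistically significant before drawing conclusions.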

How about multivariate tests?

Multivariate tests allow you to create multiple test groups to analyze the performance of multiple variants at once. Didomi does not support multivariate tests at the moment. You can use an AB testing vendor like Google Optimize (a free solution) to create this type of test.

Configuration

To set up your AB test, add an experiment property to your didomiConfig variable. Keep in mind that all properties outside of that experiment property will be applied to your control group as usual. That is the baseline that you will compare your test group results against.

The experiment property allows you to configure the following variables of your test:

  • id: A unique ID for your test, used to distinguish the results of the different tests that you run over time. Use a lowercase string with no spaces and only - (dash) as a special character ([a-z-]). Example: first-test, test-variation-1

  • size: The size of your test group as a percentage between 0 and 1. Your control group size will be 1 - size. We recommend keeping this value between 0.1 (10%) and 0.5 (50%) so that your control group includes 50% to 90% of your users. Example: 0.2 for a test group that includes 20% of your users

  • config: The extra configuration parameters to apply to the test group users. These parameters are merged into your regular configuration (the one applied to your control group) so that you only test a small number of variables at once. Any configuration parameter that is valid for the SDK can be used here. Example: see below

  • startDate: Start date of the test as an ISO 8601 string. When provided, only users that are asked for consent after that start date are included in the test. While not required, a start date is strongly recommended to make sure that your test and control groups only include new users that have not seen the consent UI yet. Example: 2019-03-06T23:38:50Z
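For illustration, a deterministic size-based split of users into the two groups could look like the following sketch. This is a hypothetical implementation, not the actual Didomi SDK logic; bucket and assignGroup are made-up names:

```javascript
// Hypothetical sketch, NOT the actual Didomi implementation:
// hash a stable user identifier into [0, 1) and compare it to `size`,
// so that roughly `size` of users land in the test group and the same
// user always lands in the same group.
function bucket(userId) {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // unsigned 32-bit rolling hash
  }
  return (hash % 1000) / 1000; // deterministic value in [0, 1)
}

function assignGroup(userId, size) {
  return bucket(userId) < size ? 'test' : 'control';
}
```

With size: 0.2, about 20% of user identifiers hash below 0.2 and join the test group; the rest form the control group.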

Keep your test focused

It is important not to test too many variations at once, as you will not be able to attribute the performance of your test group to a specific variable.

For instance, if you change both the notice content and format of your test group (compared to your control group) and see a consent rate increase of 10% in your test group, you will not be able to determine if the consent rate increase comes from the different notice content or format. You will only be able to conclude that changing both the notice content and format had a joint positive effect.

<script type="text/javascript">
  window.didomiConfig = {
    app: {
      apiKey: '<Your API Key>',
      name: 'My website',
      vendors: {
        iab: {
          all: true
        }
      }
    },
    notice: {
      content: {
        notice: {
          // This notice message will be displayed to users in your control group
          en: 'Variation #1 of the message displayed in my consent notice'
        }
      }
    },
    experiment: {
      id: 'test-notice-message', // Unique ID for your test
      size: 0.2, // Test group will include 20% of your users
      startDate: '2019-03-06T23:38:50Z',
      config: {
        // Every property in this config object will be merged into window.didomiConfig for users in your test group
        notice: {
          content: {
            notice: {
              // This notice message will be displayed to users in your test group
              en: 'Variation #2 of the message displayed in my consent notice'
            }
          }
        }
      }
    }
  };
</script>

Keep in mind that users in your test group will see the default configuration for your tag plus the specific configuration for their group (which replaces the default configuration for properties that appear in both). For instance, they will see all the IAB vendors and the message "Variation #2 of the message displayed in my consent notice". Users in your control group will see all the IAB vendors and the message "Variation #1 of the message displayed in my consent notice".
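That merge behavior can be pictured with a small sketch. This is a hypothetical deep merge, not the SDK's actual code; mergeConfig is a made-up name:

```javascript
// Hypothetical sketch, NOT the actual SDK code: deep-merging
// `experiment.config` into the base configuration, with test-group
// values winning whenever a property appears in both.
function mergeConfig(base, overrides) {
  const merged = { ...base };
  for (const key of Object.keys(overrides)) {
    const baseValue = merged[key];
    const overrideValue = overrides[key];
    const bothObjects =
      baseValue !== null && overrideValue !== null &&
      typeof baseValue === 'object' && typeof overrideValue === 'object' &&
      !Array.isArray(baseValue) && !Array.isArray(overrideValue);
    // Recurse into nested objects; otherwise the override replaces the base value
    merged[key] = bothObjects ? mergeConfig(baseValue, overrideValue) : overrideValue;
  }
  return merged;
}

const baseConfig = {
  notice: { position: 'bottom', content: { notice: { en: 'Variation #1' } } }
};
const experimentConfig = {
  notice: { content: { notice: { en: 'Variation #2' } } }
};

const testGroupConfig = mergeConfig(baseConfig, experimentConfig);
// testGroupConfig.notice.content.notice.en is 'Variation #2' (overridden)
// testGroupConfig.notice.position is still 'bottom' (inherited from the base)
```

Properties that only exist in the base configuration (like position above) carry over unchanged to the test group.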

<script type="text/javascript">
  window.didomiConfig = {
    app: {
      apiKey: '<Your API Key>',
      name: 'My website',
      vendors: {
        iab: {
          all: true
        }
      }
    },
    notice: {
      // Users in your control group will see the consent banner at the bottom of their screen
      position: 'bottom'
    },
    experiment: {
      id: 'test-notice-format', // Unique ID for your test
      size: 0.2, // Test group will include 20% of your users
      startDate: '2019-03-06T23:38:50Z',
      config: {
        // Every property in this config object will be merged into window.didomiConfig for users in your test group
        notice: {
          // Users in your test group will see the consent popup instead of the banner
          position: 'popup'
        }
      }
    }
  };
</script>

Keep in mind that users in your test group will see the default configuration for your tag plus the specific configuration for their group (which replaces the default configuration for properties that appear in both). For instance, they will see all the IAB vendors and the pop-in consent notice. Users in your control group will see all the IAB vendors and the banner consent notice.

Analysis

We will provide a dashboard to analyze the results of your AB tests. Send an email to support@didomi.io to request a dashboard, indicating your test ID (the one configured in experiment.id).

The dashboard will indicate the results for both your control group and your test group, allowing you to compare the performance of both groups to measure the impact of your variation.

Run AB tests with your preferred vendor

If you already work with an AB testing solution, you can use it to deploy different variants of the Didomi tag configuration and automatically measure the difference in consent rates.

You create, configure, and analyze your AB test entirely in your AB testing solution: create the variants that you want to test using your vendor's standard process, and use your testing solution to collect and analyze the results of your tests.

Example for comparing two different messages in your consent notice:

  • Create an AB test with two groups: control and test

  • Configure the test group to show a different notice message

  • Add an event to count the number of clicks on the "I agree" button of the consent notice

  • Use the event counter as your AB test objective
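The click-counting step can be sketched with the Didomi web SDK's event queue. In the browser this queue lives on window.didomiEventListeners; a plain object stands in for window below so the sketch is self-contained. The 'notice.clickagree' event name is taken from the Didomi web SDK, and trackAgreeClick is a hypothetical stand-in for your AB testing vendor's tracking call:

```javascript
// Sketch of counting "I agree" clicks through the Didomi event queue.
// `window` is stubbed with a plain object so the sketch is self-contained;
// in a real page you would use the browser's window directly.
const window = {};

let agreeClicks = 0;
function trackAgreeClick() {
  agreeClicks += 1; // replace with your AB testing vendor's event call
}

window.didomiEventListeners = window.didomiEventListeners || [];
window.didomiEventListeners.push({
  event: 'notice.clickagree', // fired when the user clicks "I agree"
  listener: trackAgreeClick
});

// Simulate the SDK firing the event after an "I agree" click
window.didomiEventListeners
  .filter((entry) => entry.event === 'notice.clickagree')
  .forEach((entry) => entry.listener());
```

Your AB testing solution then uses that event counter as the test objective.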

Test complex cases

Some variants might be complicated to create with your AB testing solution. For instance, comparing notice formats (banner vs pop-in) can be complex to configure.

In that case, you can use your AB testing solution to serve two different versions of the didomiConfig configuration object: one specific to your control group and another one specific to your test group. With that setup, you can create variations of any option supported by the Didomi SDK without being limited by your testing solution.
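A minimal sketch of that setup, assuming your testing solution exposes a variant flag before the Didomi tag runs (abTestVariant is a hypothetical name; in the browser the chosen object would be assigned to window.didomiConfig):

```javascript
// Sketch: letting your AB testing solution serve one of two complete
// Didomi configurations. `abTestVariant` is a hypothetical flag set by
// your testing solution before this script runs.
const controlConfig = {
  notice: { position: 'bottom' } // control group: banner at the bottom
};
const testConfig = {
  notice: { position: 'popup' } // test group: pop-in notice
};

const abTestVariant = 'test'; // would be set by your AB testing solution
const didomiConfig = abTestVariant === 'test' ? testConfig : controlConfig;
```

Because each group gets a full configuration object, you can vary any option supported by the Didomi SDK, not just the ones your testing solution can edit visually.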

Measure and analyze your test results

You need to define and measure the metric(s) that you will use to analyze your test results and compare the performance of your test groups. Your AB testing solution should allow you to define which metric(s) you want to use (number of clicks on the "I agree" button, number of page views with consent, etc.) and measure them for you, potentially with the help of your analytics solution.

If that is not possible, or if you want access to all the analytics provided by Didomi for your AB tests, we can set up a custom dashboard to help you track the results of your tests. Reach out to support@didomi.io for more information.