The Didomi SDK allows you to run AB tests to compare the consent rates and other metrics of two population groups. We support running AB tests through your preferred vendor or through the Didomi platform.
The Didomi SDK allows you to test the performance of a variant of your configuration (notice format, content, etc.). To do so, our SDK will separate your users into two groups:
- Control group (A): control users will be shown your normal consent notices, and the current configuration of your SDK will be applied.
- Test group (B): test users will be shown the variant of your SDK configuration that you want to test.
To validate whether the outcome of your AB test is positive or negative, you will observe the metrics (consent rate, bounce rate, etc.) of your test group and compare them with the metrics of your control group. If your test group performs better than your control group, the AB test is positive and indicates that the configuration applied to your test group is more efficient than the one applied to your control group.
To set up your AB test, you need to add an `experiment` property in your `didomiConfig` variable. Keep in mind that all properties outside of that `experiment` property will be applied to your control group as usual; that is the baseline that you will compare your test group results against.
The `experiment` property allows you to configure the following variables of your test:
- A unique ID for your test, used to distinguish the results of the multiple tests that you may run over time. Use a lowercase string with no spaces.
- The size of your test group, as a percentage between 0 and 1. Your control group size will be the remainder (1 minus the test group size).
- The extra configuration parameters to apply to the test group users. These parameters are merged into your regular configuration (the one applied to your control group), so that you only test a small number of variables at once. Any configuration parameter that is valid for the SDK can be used here.
- The start date of the test, as an ISO 8601 string. When provided, only users that are asked for consent after that start date are included in the test.
While this is not required, we strongly recommend including a start date to make sure that your test and control groups only include new users who have not seen the consent UI yet.
Keep in mind that users in your test group will see the default configuration for your tag PLUS the specific configuration for them (which replaces the default configuration for properties that appear in both). For instance, they will see all the IAB vendors and the pop-in consent notice, while users in your control group will see all the IAB vendors and the banner consent notice.
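As a sketch, a configuration with an experiment could look like the following. The `experiment` property comes from this guide, but the sub-property names (`id`, `size`, `startDate`, `config`) and the `notice.position` values are assumptions used for illustration; check them against the SDK reference before deploying:

```javascript
// Hypothetical sketch of an AB test configuration (sub-property names assumed).
window.didomiConfig = {
  // Regular configuration, applied to the control group (A):
  notice: {
    position: 'bottom', // control group sees a banner
  },

  experiment: {
    id: 'notice-format-test', // unique ID: lowercase string, no spaces
    size: 0.5,                // 50% of users go to the test group (B)
    startDate: '2024-01-01T00:00:00.000Z', // only include users asked after this date

    // Overrides merged into the regular configuration for test users only:
    config: {
      notice: {
        position: 'popup', // test group sees a pop-in instead of the banner
      },
    },
  },
};
```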
We provide a dashboard to analyze the results of your AB tests. Send an email to email@example.com to get a dashboard set up, and indicate your test ID (the one configured in your `experiment` property).
The dashboard will indicate the results for both your control group and your test group, allowing you to compare the performance of both groups to measure the impact of your variation.
If you already work with an AB testing solution, you can use it to deploy different variants of the Didomi tag configuration and automatically measure the difference in consent rates.
You create, configure, and analyze your AB test entirely in your AB testing solution: create the variants that you want to test using your vendor's standard process, and use your testing solution to collect and analyze the results of your tests.
Example for comparing two different messages in your consent notice:
1. Create an AB test with two groups: control and test
2. Configure the test group to show a different notice message
3. Add an event to count the number of clicks on the "I agree" button of the consent notice
4. Use the event counter as your AB test objective
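The click-counting step above can be sketched with the Didomi web SDK's event listener queue. The `window.didomiEventListeners` queue and the `notice.clickagree` event name are taken from the Didomi web SDK, but treat them as assumptions and verify them against your SDK version; the dispatch at the end only simulates a click so the snippet is self-contained outside a browser:

```javascript
// Hypothetical sketch: counting "I agree" clicks to feed your AB testing tool.
// In a browser, `globalThis.window` is the real window; elsewhere we stub it
// so the snippet stays runnable on its own.
const window = globalThis.window || { didomiEventListeners: [] };
window.didomiEventListeners = window.didomiEventListeners || [];

let agreeClicks = 0;
window.didomiEventListeners.push({
  event: 'notice.clickagree', // assumed: fired when the user clicks "I agree"
  listener: function () {
    agreeClicks += 1;
    // Forward the event to your AB testing / analytics solution here,
    // e.g. yourAbTool.track('consent_agree'); (hypothetical call)
  },
});

// Simulate the SDK dispatching the event once (stand-in for a real click):
window.didomiEventListeners
  .filter(function (l) { return l.event === 'notice.clickagree'; })
  .forEach(function (l) { l.listener(); });

console.log(agreeClicks); // 1
```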
Some variants might be complicated to create with your AB testing solution. For instance, comparing notice formats (banner vs pop-in) can be complex to configure.
In that case, you can use your AB testing solution to serve two different versions of the `didomiConfig` configuration object: one specific to your control group and another one specific to your test group. With that setup, you can create variations of any option supported by the Didomi SDK without being limited by your testing solution.
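One minimal way to sketch that setup: let your AB testing solution assign the group, then build `didomiConfig` from a base (control) configuration plus test-only overrides. The `getAbGroup` stub and the `notice.position` values are hypothetical stand-ins for your vendor's assignment API and the real SDK options:

```javascript
// Sketch: serving a different didomiConfig per AB group (names assumed).
function getAbGroup() {
  // Stand-in for your AB testing solution's group assignment.
  return 'test';
}

const baseConfig = {
  notice: { position: 'bottom' }, // control group: banner notice
};

const testOverrides = {
  notice: { position: 'popup' },  // test group: pop-in notice
};

// A shallow merge is enough for this sketch; deep-merge nested options
// if your variant only changes part of a nested object.
const didomiConfig = getAbGroup() === 'test'
  ? { ...baseConfig, ...testOverrides }
  : baseConfig;

console.log(didomiConfig.notice.position); // "popup"
```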
You need to define and measure the metric(s) that you will use to analyze your test results and compare the performance of your groups. Your AB testing solution should allow you to define which metric(s) you want to use (number of clicks on the "I agree" button, number of page views with consent, etc.) and measure them for you, potentially with the help of your analytics solution.
If that is not possible, or if you want access to all the analytics provided by Didomi for your AB tests, we can set up a custom dashboard to help you track the results of your tests. Reach out to firstname.lastname@example.org for more information.