Configure a batch export

Tutorial for configuring a batch export destination and configuration.

Batch export is a premium feature that is not enabled for organizations by default. Make sure you reach out to [email protected] to enable this integration before configuring one for your organization.

Introduction

Batch exports consist of two main components:

  • Export configurations define the type of data to export as well as the frequency for delivering the data.

  • Export destinations define the cloud storage (bucket) that will receive the data and the credentials to connect to them.

You can configure an export configuration to use multiple destinations and re-use an export destination in more than one export configuration.

To set up a batch export for your organization, you need to:

  • Configure an export destination to indicate where your data should be sent.

  • Configure an export configuration to schedule the delivery of your data.

Configure an export destination

To configure an export destination, send a POST /exports/destinations request with the name of the destination to create, a description, and the ID of the organization that will host it.

AWS S3

For AWS S3 Bucket (owned by Didomi) destinations, you need to specify your AWS account ID in the destination config:

POST https://api.didomi.io/v1/exports/destinations
BODY
{
  "organization_id": "didomi",
  "name": "S3 destination",
  "description": "Didomi AWS S3 destination",
  "type_id": "aws-s3-didomi",
  "config": {
    "account_id": "0000000000"
  }
}
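As a sketch, the request above can be built with Python's standard library. The Authorization header is an assumption (check the Didomi API documentation for the actual authentication scheme), and the request is constructed but not sent:

```python
import json
import urllib.request

# Hypothetical values: replace with your own organization ID and AWS account ID.
payload = {
    "organization_id": "didomi",
    "name": "S3 destination",
    "description": "Didomi AWS S3 destination",
    "type_id": "aws-s3-didomi",
    "config": {"account_id": "0000000000"},
}

req = urllib.request.Request(
    "https://api.didomi.io/v1/exports/destinations",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Authentication is assumed to be required; the exact scheme
        # (e.g. a bearer token) is documented in the Didomi API reference.
        "Authorization": "Bearer <your-api-token>",
    },
    method="POST",
)

# response = urllib.request.urlopen(req)  # uncomment to actually send the request
```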

The API will return something similar to:

{
  "id": "58g9gnu3b5",
  "organization_id": "didomi",
  "name": "S3 destination",
  "description": "Didomi AWS S3 destination",
  "type_id": "aws-s3-didomi",
  "config": {
    "account_id": "0000000000",
    "bucket_name": "my-didomi-bucket",
    "bucket_access_role_arn": "arn:aws:0000000000:iam:role/my-didomi-bucket-role"
  }
}

Write down the id of the export destination (58g9gnu3b5 in our example) as you will need it in the following step.

The destination's bucket_name is the name of the AWS S3 bucket that was created for you and where your Didomi data will be delivered. You must assume the AWS IAM role returned as bucket_access_role_arn to obtain credentials with read access to that bucket.
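The fields you need from the response can be extracted as below; this sketch reuses the example response from this tutorial, and the role ARN it pulls out is what you would pass to an AWS role-assumption call (for example via the AWS CLI or SDK) to read the bucket:

```python
import json

# Example response from the destination creation request above.
response_body = """
{
  "id": "58g9gnu3b5",
  "organization_id": "didomi",
  "name": "S3 destination",
  "description": "Didomi AWS S3 destination",
  "type_id": "aws-s3-didomi",
  "config": {
    "account_id": "0000000000",
    "bucket_name": "my-didomi-bucket",
    "bucket_access_role_arn": "arn:aws:0000000000:iam:role/my-didomi-bucket-role"
  }
}
"""

destination = json.loads(response_body)
destination_id = destination["id"]  # needed when creating the export configuration
bucket_name = destination["config"]["bucket_name"]  # bucket that receives the data
role_arn = destination["config"]["bucket_access_role_arn"]  # role to assume for read access
```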

GCP storage

If you are configuring a GCP Storage Bucket destination instead, you must set type_id to gcp-storage and specify the GCP bucket name, your project ID, and the JSON key in the config payload:

POST https://api.didomi.io/v1/exports/destinations
BODY
{
  "organization_id": "didomi",
  "name": "GCP destination",
  "description": "Didomi GCP Storage destination",
  "type_id": "gcp-storage",
  "config": {
    "project_id": "didomi-cmp",
    "bucket_name": "didomi-cmp-bucket",
    "key": "{\"type\":\"service_account\",\"project_id\":\"didomi-cmp\"..."
  }
}
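Note that the key field is the service-account JSON serialized as a string, not a nested object. A minimal sketch of building that payload, using a hypothetical service-account key for illustration:

```python
import json

# Hypothetical, truncated service-account key; in practice you would load
# the full JSON key file downloaded from the GCP console.
service_account = {"type": "service_account", "project_id": "didomi-cmp"}

body = json.dumps({
    "organization_id": "didomi",
    "name": "GCP destination",
    "description": "Didomi GCP Storage destination",
    "type_id": "gcp-storage",
    "config": {
        "project_id": "didomi-cmp",
        "bucket_name": "didomi-cmp-bucket",
        # json.dumps embeds the key as an escaped JSON string, as the API expects.
        "key": json.dumps(service_account),
    },
})
```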

Attach your destination to an export configuration

To create an export configuration, send a POST /exports/configs request with the name of the configuration to create, a description, the ID of the organization that will host it, and the ID of the destination that you just created. The data field specifies the type of data to be exported.

The current version of the Didomi export configurations only supports daily incremental exports (1-day frequency).

POST https://api.didomi.io/v1/exports/configs
BODY
{
  "organization_id": "didomi",
  "name": "Didomi export",
  "description": "My daily export for Didomi",
  "data": {
    "users": { "enabled": true },
    "proofs": { "enabled": false }
  },
  "destinations_id": ["58g9gnu3b5"]
}

This will schedule a daily batch export that will deliver the delta updates of the organization's consent data to the configured destination. The first batch will be delivered 24 hours after the creation date of the export configuration.

For more information on batch exports, please visit the Batch export documentation.