Bots (SEO & Performance tools)

In most cases, bots that visit your website to index it (search engines like Google, Bing, etc.) or to evaluate its performance (Google PageSpeed Insights, GTmetrix, etc.) do not need to see the consent collection UI. Displaying a consent notice that bots cannot interact with might even hurt your SEO ranking or performance score, as bots might end up indexing the notice as part of your website.

The Didomi Web SDK comes fully equipped with an option to control its behavior when a bot is visiting your website. You can decide to completely bypass the consent collection and act "as if" consent had been given (load all the third-party vendors, for instance) as bots are not humans and do not need to provide consent.

To indicate that consent should not be collected for bots, set the user.bots.consentRequired property to false in your SDK configuration:

<script type="text/javascript">
window.didomiConfig = {
  user: {
    bots: {
      /**
       * Indicate whether consent is required for bots
       * Defaults to true
       */
      consentRequired: false,
      
      /**
       * Predefined types of bots to identify
       * Defaults to all types supported by the SDK
       * (Optional)
       */
      types: ['crawlers', 'performance'],
      
      /**
       * List of additional regular expressions to match
       * against the User Agent of the visitor.
       * When one of the regular expressions matches, the user
       * is considered to be a bot.
       * This allows you to identify extra bots that the SDK does not
       * know about.
       * Regular expressions must be specified as strings and correctly
       * escaped.
       * (Optional)
       */
      extraUserAgents: [],
    }
  }
};
</script>

The user.bots.types and user.bots.extraUserAgents properties give you finer control over which user agents are identified as bots. The Didomi SDK automatically identifies the most common search engine bots and performance tools.

Example using extraUserAgents

The SDK internally passes each string in the extraUserAgents array to the RegExp constructor to build the regular expressions it matches against the visitor's User Agent. Normal string escape rules apply: special regex characters must be preceded by a backslash (\), and because the patterns are specified as strings, each backslash must itself be doubled (see the JavaScript RegExp documentation for reference).
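To illustrate the matching described above, here is a minimal sketch of how a pattern from extraUserAgents can be compiled and tested against a User Agent string. The isExtraBot helper is hypothetical (not part of the Didomi SDK); it only shows the RegExp-constructor escaping rules in action:

```javascript
// Hypothetical helper illustrating how extraUserAgents strings are
// compiled with the RegExp constructor. Backslashes are doubled in the
// string literal so that, e.g., "\\/" reaches RegExp as the escape "\/".
const extraUserAgents = [
  "Mozilla\\/5.0 \\(compatible; SiteAnalyzerBot\\/5.0; \\+https:\\/\\/www.site-analyzer.com\\/\\)"
];

function isExtraBot(userAgent, patterns) {
  // The visitor is considered a bot as soon as one pattern matches
  return patterns.some((pattern) => new RegExp(pattern).test(userAgent));
}

const botUA =
  "Mozilla/5.0 (compatible; SiteAnalyzerBot/5.0; +https://www.site-analyzer.com/)";
console.log(isExtraBot(botUA, extraUserAgents)); // true
```

A pattern with a single (unescaped) backslash, such as "Mozilla\/5.0", would be rejected or mis-parsed at the string-literal level before ever reaching RegExp, which is why the doubled escaping matters.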

The following is a valid example using the extraUserAgents property:

<script type="text/javascript">
window.didomiConfig = {
  user: {
    bots: {
      consentRequired: false,
      types: ['crawlers', 'performance'],
      // Regular expressions must be specified as strings and correctly escaped
      extraUserAgents: [
        "Mozilla\\/5.0 \\(Windows NT 10.0; Win64; x64; rv:88.0\\) Gecko\\/20100101 Firefox\\/88.0 \\(compatible; MonetoringBot\\/2.1\\)",
        "Mozilla\\/5.0 \\(compatible; SiteAnalyzerBot\\/5.0; \\+https:\\/\\/www.site-analyzer.com\\/\\)"
      ],
    }
  }
};
</script>

Bots categories

By default, if you configure the SDK not to collect consent from bots, all bots are impacted. If you need finer control over which categories of bots require consent, the Didomi SDK groups bots into the following categories and allows you to enable only some of them:

| ID | Category | Description |
| --- | --- | --- |
| crawlers | Crawlers | Bots that index your websites for search engines (Google, Bing, etc.) |
| performance | Performance Tools | Bots that visit your websites for performance reports (Google PageSpeed Insights, GTmetrix, etc.) |
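For example, to bypass consent collection for search engine crawlers only, you could restrict types to crawlers; performance tools would then not be identified as bots and would still be shown the consent notice (a sketch based on the properties documented above):

```html
<script type="text/javascript">
window.didomiConfig = {
  user: {
    bots: {
      // Bypass consent collection for identified bots
      consentRequired: false,
      // Only identify search engine crawlers as bots;
      // performance tools are treated as regular visitors
      types: ['crawlers']
    }
  }
};
</script>
```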
