Bots (SEO & Performance tools)
In most cases, bots that visit your website to index it (search engines like Google, Bing, etc.) or to evaluate its performance (Google PageSpeed Insights, GTmetrix, etc.) do not need to see the consent collection UI. Displaying a consent notice that bots cannot interact with, and that they might end up indexing as part of your website, can even hurt your SEO ranking or performance score.
The Didomi Web SDK comes with an option to control its behavior when a bot visits your website. You can decide to completely bypass consent collection and act "as if" consent had been given (load all the third-party vendors, for instance), since bots are not humans and do not need to provide consent.
Bypass consent collection for bots
To indicate that consent should not be collected for bots, set the user.bots.consentRequired property to false in your SDK configuration:
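A minimal sketch, assuming the configuration is passed through the window.didomiConfig global (adapt it to however you pass your configuration to the Didomi Web SDK):

```javascript
// Sketch: bypass consent collection for bots.
// window.didomiConfig is an assumption about how your page passes
// configuration to the Didomi Web SDK; adjust to your own setup.
window.didomiConfig = {
  user: {
    bots: {
      // Do not require consent from bots: the SDK acts as if consent was given
      consentRequired: false,
    },
  },
};
```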
The user.bots.types and user.bots.extraUserAgents properties give you extra control over which user agents are identified as bots. The Didomi SDK automatically identifies the most common search engine bots and performance tools.
Example using extraUserAgents
The SDK internally uses the RegExp constructor to generate the required regular expressions from the strings provided in the extraUserAgents array. Normal string escaping rules apply, i.e., special regular-expression characters must be escaped with the \ character.
The following is a valid example using the extraUserAgents property:
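The user agent names in this sketch are hypothetical placeholders; note how regular-expression special characters are escaped with a backslash, which must itself be doubled inside a JavaScript string literal.

```javascript
// Sketch: declare additional user agents that should be treated as bots.
// "MyMonitoringBot" and "AcmeAuditBot" are hypothetical names; replace them
// with the user agents you actually want to match.
window.didomiConfig = {
  user: {
    bots: {
      consentRequired: false,
      extraUserAgents: [
        // Plain substring match
        'MyMonitoringBot',
        // Special characters are escaped: the pattern AcmeAuditBot\/1\.0
        // is written with doubled backslashes in the string literal
        'AcmeAuditBot\\/1\\.0',
      ],
    },
  },
};
```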
Bot categories
By default, if you configure the SDK not to collect consent from bots, all bots are impacted. If you want finer control over which categories of bots require consent, the Didomi SDK groups bots into the following categories and lets you enable only some of them, as shown in the sketch after the table:
| Category | Name | Description |
| --- | --- | --- |
| crawlers | Crawlers | Bots that index your websites for search engines (Google, Bing, etc.) |
| performance | Performance Tools | Bots that visit your websites for performance reports (Google PageSpeed Insights, GTmetrix, etc.) |
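For example, assuming user.bots.types restricts which bot categories the bypass applies to, the following sketch skips consent collection for search engine crawlers only, while performance tools are still treated as regular visitors:

```javascript
// Sketch: bypass consent collection for crawlers only.
// Performance tools are not listed in `types`, so they are treated
// like regular users and still see the consent notice.
window.didomiConfig = {
  user: {
    bots: {
      consentRequired: false,
      types: ['crawlers'],
    },
  },
};
```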