In order to strengthen legal certainty regarding the possibility of using automated means for purposes of content moderation, clarity should be provided as to the conditions under which exemptions from liability apply where such means are used by providers of hosting services to examine and process information provided by recipients of their service, including where such use could theoretically be considered to lead to the provider acquiring knowledge or awareness of certain illegal content, or control over information provided by the recipient of the service. That includes, for example, the use of automated means for detecting, identifying, removing or blocking the dissemination of certain types of illegal content, or for removing, disabling or restricting the visibility of content that is incompatible with their terms and conditions, provided that such automated means and their application are limited to what is strictly necessary to achieve the objectives pursued. Automated means include various forms of automated verification and sorting mechanisms. Where the actual use of automated means leads to the provider having awareness or knowledge of illegal content, or where the provider acquires control over information provided by the recipient of the service, such that the role of the provider is no longer merely technical, automatic and passive, the exemption from liability should no longer apply to the provider concerned in respect of the content of which it has knowledge and over which it has control.
Recital 40