Source: European Parliament
Question for written answer E-002018/2025
to the Commission
Rule 144
Sirpa Pietikäinen (PPE)
The detection of child sexual abuse material (CSAM) online is currently regulated by a temporary extension of the ePrivacy derogation[1], which permits internet platforms to detect such material on a voluntary basis.
Given that research shows AI to be the only effective method for identifying new CSAM, why has the Commission not proposed the mandatory use of AI-based detection tools on social media platforms, particularly in situations where end-to-end encryption is not an obstacle?
Submitted: 21.5.2025
- [1] Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41, ELI: http://data.europa.eu/eli/reg/2021/1232/oj).
Last updated: 28 May 2025