Source: European Parliament
The Digital Services Act[1] (DSA) requires providers of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to perform annual assessments of the systemic risks stemming from the functioning, design and use of their services.
The assessment must cover certain risks defined in the DSA, such as the dissemination of illegal content and any actual or foreseeable negative effects on civic discourse and electoral processes.
Moreover, the DSA requires providers of VLOPs and VLOSEs to put in place reasonable, proportionate and effective mitigation measures tailored to those risks.
The DSA leaves it to providers of VLOPs and VLOSEs to choose which mitigation measures to use, provided those measures are reasonable, proportionate, effective and tailored to the specific risks identified.
This evaluation standard applies both to centralised fact-checking and to decentralised initiatives such as community notes.
When implementing such measures, the providers of VLOPs and VLOSEs are also required to consider the impact of those measures on freedom of expression and other fundamental rights.
Adherence to the Code of Conduct on Disinformation[2] may constitute an appropriate risk mitigation measure.
Additionally, the DSA provides recipients of the service with mechanisms to challenge content moderation decisions made by the providers of online platforms.
Whether community notes can be considered content moderation decisions in this sense depends on how a specific provider's community notes system operates.
Relevant factors for this assessment include the provider's involvement in the decision-making process and any restrictions that follow from the community note.
- [1] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act).
- [2] Code of Conduct on Disinformation, https://digital-strategy.ec.europa.eu/en/library/code-conduct-disinformation