The Digital Services Act: Weak Democratic Safeguards on Big Tech
Together with the Civil Liberties Union for Europe (Liberties), EPD analysed the second set of DSA risk assessment and mitigation reports, published in November 2025. After the first set of reports appeared in 2024, we issued recommendations for improvement; this paper examines the progress made in the second set. Comparing both sets of reports allows us to evaluate whether the regulation and civil society feedback influenced these developments. However, we found no meaningful evolution in how online platforms and search engines describe and address threats to civic discourse and electoral processes.
Our key findings:
- Platforms largely fail to reflect on the risks arising from their own structure. Instead, they continue to portray systemic risks as caused by external actors: according to the reports, threats to civic discourse and electoral integrity on platforms stem from disinformation, foreign interference and the like. This framing suggests that the functioning of the platforms themselves is unproblematic, and lays the blame solely on the behaviour of bad actors who misuse them.
- Civic discourse is treated as relevant only when connected to elections. This narrows the true meaning of civic discourse, a pillar that upholds the functioning of our democracies through its role in protecting minority voices, facilitating pluralism and more.
- For external actors like EPD, verifying the effectiveness of the measures platforms take to minimise risks remains difficult. Operators have still not made available the indicators needed to do so. In particular, the functioning of platforms’ recommender systems (algorithms) is very hard for outside actors to assess.
- Platforms remain opaque about whom they consult, internally and externally, for improvements. The reports do not name the stakeholders engaged, nor do they state how, or whether, the outcome of this engagement affects the measures taken.
What are these reports under the DSA?
The EU’s Digital Services Act (DSA) requires Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to identify and mitigate risks to democracy caused by their services, such as threats to civic discourse and electoral processes. These companies must publish their risk assessment findings and the measures they take to mitigate the identified risks. Since 2024, these reports have been published annually.
Why are they so important?
Making these reports public is meant to hold tech companies accountable and transparent about their own design and the risks it poses to users. It is also supposed to give civil society actors insight into how companies safeguard their platforms and the resources they dedicate to this aim.
Cover photo: © cristianstorto on Adobe Stock.