EPD has joined an initiative led by AlgorithmWatch and the European Policy Centre that calls for putting meaningful transparency at the heart of the Digital Services Act. The statement sets out three main recommendations that together can help achieve greater transparency from internet platforms:
1. Binding rules outlining who can directly access data or apply for access, what specific data can be accessed, and how and by whom that data is to be gathered and checked before disclosure:
- Disclosure obligations should differentiate between dominant players and smaller intermediaries, as defined according to indices of annual turnover, market share, user base and/or gatekeeping impact. We propose that the scope of the recommendations be limited to dominant platforms.
- Disclosure obligations should be based on the technical functionalities of the platform service, rather than more ambiguous and politically charged conceptions of harm such as ‘disinformation’, ‘political advertising’, and ‘hate speech’.
- Technical features might include: high-level aggregate audience metrics; advertising and micro-targeting; search features; feeds, ranking and recommendation; and content moderation (including removal but also other measures such as demonetisation or fact-checking).
2. An EU institution with a clear legal mandate to enable access to data and to enforce transparency obligations across the EU27 in cases of non-compliance:
- The institution should act as an arbiter in deciding on requests for confidentiality from the disclosing party (based on, e.g., intellectual property or data protection law). Barriers to gaining access to predefined data should be minimized. The institution should maintain relevant access infrastructures such as virtual secure operating environments, public databases, websites and forums. It should also be tasked with pre-processing disclosed data and periodically auditing disclosing parties to verify the accuracy of disclosures.
- Furthermore, the mandate shall encompass collaboration with multiple EU and national-level competent authorities, such as data protection authorities, cyber-security agencies and media regulators, to minimize the risk of capture or negligence. The legal framework should explicitly outline the different levels of oversight and how they interact. Because trust in government bodies differs widely across Member States, installing tiered safeguards and guarantees of independence is critical. To prevent conflicts of competence and minimize the politicization of the framework, it is advisable that such an institution’s role be limited to that of a ‘transparency facilitator’.
- The institution shall proactively support relevant stakeholders. The freedom of scientific research must be explicitly enshrined. In this spirit, the proposed institution must also proactively facilitate uptake of tools and know-how among stakeholders, including journalists, regulators, academics, and civil society. The institution might also explore engaging the broader European public in the development of research agendas (see e.g. lessons from the Dutch National Research Agenda) or incubating pilot projects that explore the possibility of connecting users and researchers through fiduciary models. Independent centers of expertise on AI/ADM at national level, as proposed by AlgorithmWatch and Access Now, could play a key role in this regard and support building the capacity of existing regulators, government and industry bodies.
3. Provisions that ensure data collection is privacy-respecting and GDPR-compliant:
- Because of the sensitive nature of certain types of data, there are legitimate concerns regarding threats to user privacy. The Cambridge Analytica scandal should serve as a cautionary tale, and any misuse of data by researchers would severely undermine the integrity of a transparency framework.
- It is imperative that the institution uphold the GDPR’s data protection principles, including (a) lawfulness, fairness and transparency; (b) purpose limitation; (c) data minimization; (d) accuracy; (e) storage limitation; and (f) integrity and confidentiality.
- The proposed data access entity should take inspiration from existing institutions like Findata, the Finnish health data framework, which integrates the necessary safeguards (both technical and procedural) for data subjects, including online rights management systems that allow citizens to exercise their data subject rights easily.
- Granular data access should only be enabled within a closed virtual environment controlled by the independent body. As was the case with the Findata framework, it is advisable for the Commission to consider testing key components of the framework in pilot phases.
You can read the full statement here and access more resources on why full transparency is necessary on the AlgorithmWatch website.