Section 230 of the United States Communications Decency Act has, to this day, shielded Big Tech from accountability. Social media platforms, search engines, and increasingly also private messaging and generative AI platforms have, however, moved beyond providing a space for information to making editorial decisions: their recommender and moderation algorithms define what gets visibility, and their training data sets define what speech is allowed.
Citizens and news media depend on Big Tech, with its changing priorities, profit-driven nature, and partly political agenda, to disseminate and access information.
Some jurisdictions, notably the EU and the UK, have taken steps to establish an accountability regime, yet implementation remains insufficient.
Similarly, researchers and NGOs depend on Big Tech to study its impact on society and democracy. Transparency about policies, algorithms, and data access provisions has varied with changing political and economic interests.
This poses great dangers to society, as Big Tech governs a space that should be a public good, one in which transparency and accountability are key principles.
This Policy Manual highlights how the current digital information ecosystem — dominated by Big Tech platforms (very large social media and search engines, and increasingly also AI companies) — has become increasingly captured in ways that undermine media freedom. It underscores the need for democratic state intervention, based on the rule of law, to ensure an enabling environment for independent and pluralistic journalism.
The Manual offers a vision for healthy online information spaces, where the availability and accessibility of public interest information are ensured. It puts forward mitigation measures and key recommendations for States to implement long-term structural reforms and sustained investments to address the distortions in today’s online information ecosystem.
The recommended mitigation measures cover three key areas:
• Visibility of journalism and public interest information online
• Media viability and funding models that support public interest information
• Vigilance, or the online safety of journalists
The core of this Policy Manual lies in the guidance it provides on how to enable healthy information spaces online by freeing the ecosystem from heavily concentrated gatekeeping power, and instead fostering an enabling environment for media freedom in the algorithmic and artificial intelligence (AI) era.
It concludes that for media freedom to be safeguarded, addressing platform-related challenges alone is not sufficient. Instead, it calls for more ambitious structural reforms — to move beyond merely mitigating media dependency and towards building an independent, pluralistic online information and media landscape that can sustain democratic debate and societal resilience.
This publication is part of the project “Healthy Online Information Spaces – SAIFE Renewed”. It was produced in collaboration with the Forum on Information and Democracy.
In today’s digital ecosystem, platforms play a central role as intermediaries in the circulation of information. As part of that role, they implement content moderation systems that include visible and relatively well-known measures: removing posts, temporarily or permanently suspending accounts, and imposing other sanctions that users are generally informed about. These decisions are usually accompanied by access to appeals mechanisms, at least under the terms set by the companies themselves, and are framed as part of compliance with their community guidelines.
The purpose of this investigation is to examine the phenomenon of shadow banning that takes place on digital platforms, identifying specific cases and analyzing their impact on the visibility of media outlets, critical voices, and underrepresented sectors. Additionally, it seeks to assess how transparent platforms are regarding these practices, along with the consequences for democratic participation in online public spaces.
The Gesellschaft für Freiheitsrechte and the Forum on Information and Democracy hosted a workshop on 18 September 2025 to explore how the DSA data access provisions can be leveraged beyond the EU to provide transparency and accountability.