Transparency and accountability of Big Tech

Opaque algorithms and data monetization govern the global information space. The lack of transparency and accountability of Big Tech represents a major democratic deficit.

The liability problem

Section 230 of the US Communications Decency Act has shielded Big Tech from liability to this day. Yet social media platforms, search engines, and increasingly private messaging and generative AI platforms have moved beyond merely providing a space for information to taking editorial decisions: their recommender and moderation algorithms define what gets visibility, and their training datasets define what speech is allowed.

To disseminate and access information, citizens and news media are at the mercy of Big Tech, with its shifting priorities, profit-driven nature, and in part political agendas.

Some jurisdictions, notably the EU and the UK, have taken steps to establish an accountability regime, yet implementation remains insufficient.

Similarly, researchers and NGOs depend on Big Tech to study its impact on society and democracy. Transparency about policies, algorithms, and data access provisions has varied with changing political and economic interests.

This poses great dangers to society, as Big Tech governs a space that should be a public good, one in which transparency and accountability are key principles.


Our work on this theme

Policy brief

OSCE Policy Manual | Safeguarding Media Freedom in the Age of Big Tech Platforms and AI – in partnership with the Forum on Information and Democracy

This Policy Manual highlights how the current digital information ecosystem — dominated by Big Tech platforms (very large social media and search engines, and increasingly also AI companies) — has become increasingly captured in ways that undermine media freedom. It underscores the need for democratic state intervention, based on the rule of law, to ensure an enabling environment for independent and pluralistic journalism.

The Manual offers a vision for healthy online information spaces, where the availability and accessibility of public interest information are ensured. It puts forward mitigation measures and key recommendations for States to implement long-term structural reforms and sustained investments to address the distortions in today’s online information ecosystem.

The recommended mitigation measures cover three key areas:
• Visibility of journalism and public interest information online
• Media viability and funding models that support public interest information
• Vigilance, or the online safety of journalists

The core of this Policy Manual lies in the guidance it provides on how to enable healthy information spaces online by freeing the ecosystem from heavily concentrated gatekeeping power, and instead fostering an enabling environment for media freedom in the algorithmic and artificial intelligence (AI) era.

It concludes that for media freedom to be safeguarded, addressing platform-related challenges alone is not sufficient. Instead, it calls for more ambitious structural reforms — to move beyond merely mitigating media dependency and towards building an independent, pluralistic online information and media landscape that can sustain democratic debate and societal resilience.

This publication is part of the project “Healthy Online Information Spaces – SAIFE Renewed”. It was produced in collaboration with the Forum on Information and Democracy.

Report

OBSERVACOM | “Shadow Banning”: The Subtle and Covert Censorship of the Major Tech Platforms – supported by the Forum on Information and Democracy and Digital Action

In today’s digital ecosystem, platforms play a central role as intermediaries in the circulation of information. As part of that role, they implement content moderation systems that include visible and relatively well-known measures: removing posts, temporarily or permanently suspending accounts, and other sanctions of which users are generally informed. These decisions are usually accompanied by access to appeal mechanisms, at least under the terms set by the companies themselves, and are framed as part of compliance with their community guidelines.

The purpose of this investigation is to examine the phenomenon of shadow banning that takes place on digital platforms, identifying specific cases and analyzing their impact on the visibility of media outlets, critical voices, and underrepresented sectors. Additionally, it seeks to assess how transparent platforms are regarding these practices, along with the consequences for democratic participation in online public spaces.

Insight

Data access beyond the EU: exploring the possibilities of DSA Article 40

The Gesellschaft für Freiheitsrechte and the Forum on Information and Democracy hosted a workshop on 18 September 2025 to explore how the data access provisions of the Digital Services Act (DSA) can be leveraged beyond the EU to provide transparency and accountability.


Latest news on the same theme

Promoting availability and accessibility of public interest journalism focus of new OSCE RFoM and Forum on Information and Democracy policy manual

The OSCE Office of the Representative on Freedom of the Media (RFoM) and the Forum on Information and Democracy (FID) launched today a new policy manual…

Published on October 22, 2025

Regulatory approaches and safeguarding encryption – latest insights from policy discussions on private messaging platforms

On September 11, 2025, the Partnership for Information and Democracy held the third meeting of its workstream on “Strengthening Information Integrity on Private Messaging Platforms”, led…

Published on September 11, 2025

Time for a new approach: A new policy brief calls for digital taxes to fund journalism

At the M20 Summit, an independent initiative focused on integrating media and information integrity into the G20 policy agenda, the Forum on Information and Democracy is…

Published on August 28, 2025

The Forum, alongside 50 NGOs and researchers, urges the European Commission to prioritize the public interest in shaping data access provisions

In the framework of an ongoing consultation process on the Delegated Regulation on data access provided for in the Digital Services Act, the Forum and around…

Published on December 4, 2024