Content Moderation
The information we encounter online shapes our opinions, our health, and our political views. Policymakers face a delicate balance between protecting individuals' access to online speech and mitigating the risks posed by harmful content. BPC's Technology Project works with a wide variety of public- and private-sector stakeholders from across the political spectrum to develop and advance bipartisan policy approaches to content moderation, the practices platforms use to review and remove harmful user content.
Explore Our Work:
Section 230: Are Online Platforms Publishers, Distributors, or Neither?
Bipartisan Policy Center Files Amicus Brief to the U.S. Supreme Court in Support of Respondent Google LLC
Bipartisan Policy Center’s Amicus Brief on Gonzalez v. Google
Summarizing the Amicus Briefs' Arguments in Gonzalez v. Google LLC
Gonzalez v. Google: Implications for the Internet’s Future
Coordinated Influence Operations: Fear, Uncertainty and Doubt
Republican Midterm Agenda: Section 230, Censorship, and Big Tech
The effects of platform openness on app stores and other digital platforms
Summarizing the Section 230 Debate: Pro-Content Moderation vs. Anti-Censorship
Online content regulation beyond the big platforms
Tech Regulation Through Transparency
Implications for Changing Section 230
Content Moderation Around the World
Content Moderation (IML) in the European Union
The Future of Intermediary Liability and Content Moderation
Information Disorder