Republican Midterm Agenda: Section 230, Censorship, and Big Tech

A significant Big Tech overhaul is high on the Republican agenda for 2023. As the midterm elections loom, House Republicans are rolling out their “Commitment to America” agenda, a four-part blueprint for the GOP’s policy vision and legislative priorities in the new Congress should they retake the majority. Republicans have had their eyes on Section 230 reform for years, claiming social media companies overly flag, deplatform, and discriminate against conservative viewpoints—what former President Trump called “selective censorship.” As Republicans plan to take back the House, let’s look at how their forthcoming agenda compares to the Biden Administration’s tech regulation goals and whether there is overlap for bipartisan compromise in 2023.

Understanding the Section 230 Debate

The debate surrounding Section 230 is intertwined with issues of social media liability, free expression, and content moderation. Section 230 is part of the 1996 Communications Decency Act and states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). Simply put, Section 230 says social media platforms cannot be held legally responsible for the content their users post. More than 25 years after its enactment, legislators argue it is time to reexamine and revise those protections for the tech industry, which has found itself in a string of controversies involving privacy concerns, antitrust investigations, the radicalization of extremists, accusations of liberal bias, and the societal consequences of digital harm to children. Indeed, over the last few years, the CEOs of a handful of tech companies have testified repeatedly in congressional hearings. For example, the Senate Homeland Security Committee recently held a hearing on Social Media’s Impact on Homeland Security, where lawmakers from both parties pressed top executives from Meta, Twitter, YouTube, and TikTok about safety, privacy, and moderation concerns. It is clear that Congress wants to rein in “Big Tech.”

Although Section 230 has come under fire from both sides of the aisle, the issue is divided along party lines between pro-moderation and anti-censorship arguments. Democratic lawmakers argue that Section 230 encourages the spread of harmful content while technology companies deflect accountability. On the other side, Republican lawmakers say that Section 230 allows these companies to violate free speech by unfairly censoring conservative viewpoints, such as by deplatforming a sitting Republican President. Nevertheless, for all the talk about repealing or amending Section 230, legislation affecting social media platforms has proved very difficult to pass. The tech industry advocates for the continued importance of the law, arguing that Section 230 facilitates innovation and allows companies to compete. Industry experts are concerned that increased regulation could carry unintended or harmful tradeoffs for privacy protections, cybersecurity, and First Amendment rights, or could stifle American tech entrepreneurship.

On October 3, 2022, the Supreme Court announced that it will, for the first time, hear two cases involving Section 230 and content moderation. The Court has never interpreted the limitations and exceptions of the law, though Justice Clarence Thomas has previously expressed skepticism toward Section 230. The cases ask whether federal law allows tech companies like Google to be sued over algorithmically recommended content. The rulings could have significant implications: opening new doors for future litigation, prompting an influx of legislation by state and federal lawmakers, and fundamentally changing the future of the internet.

The House Republicans’ Big Tech Accountability Platform

On April 15, 2021, as part of the Big Tech Accountability Platform, Republican staff on the Energy and Commerce Committee released a memo outlining proposals to guide the Republican legislative agenda. The legislative roadmap was later used to develop a comprehensive package of discussion draft bills that sought to hold the largest tech companies accountable. The legislative concepts consisted of:

  1. Targeting New Obligations to Big Tech Companies
  2. Limiting the Right to Exclusion
  3. Requiring Reasonable Moderation Practices
  4. Limiting Liability to Protected Speech
  5. Removing Liability Protections
  6. Requiring Appeals Processes
  7. Carving Out Big Tech Companies from Section 230
  8. Reauthorization of Section 230
  9. Content Policies, Processes for Content Decisions, and Appeals Processes
  10. Expanding the Child Online Privacy Protection Act (COPPA)
  11. Addressing Children’s Mental Health
  12. Law Enforcement Collaboration

The Biden Administration’s Technology Policy Platform

On September 8, 2022, senior White House officials held a roundtable listening session with experts to discuss the real-world harms caused by social media. The roundtable led to the most detailed recommendations the Biden Administration has released to date. The White House announced six core principles for reform:

  1. Promote competition in the technology sector
  2. Provide robust federal protections for Americans’ privacy
  3. Protect our kids by putting in place even stronger privacy and online protections for them, including prioritizing safety by design standards and practices for online platforms, products, and services
  4. Remove special legal protections for large tech platforms
  5. Increase transparency about platforms’ algorithms and content moderation decisions
  6. Stop discriminatory algorithmic decision-making

Room for Bipartisan Agreement

Online Platform Transparency

While Republicans and Democrats do not agree on Section 230, there have been bipartisan calls for greater platform transparency. One of the main principles of the Republicans’ Big Tech Censorship and Data Task Force is a transparency framework, which seeks to end “Big Tech’s ability to hide behind vague terms of service.”1 Under Section 230 as it stands, tech platforms police themselves and are not legally required to disclose how they enforce their terms of use. Legislators hope that mandating clear disclosure of platforms’ internal processes and content moderation policies will help society understand what is happening on these platforms, amid heightened scrutiny over how social media companies collect data, curate algorithms, and promote or remove content, and over the impact of those practices. For example, users frequently express confusion over when and why a moderation action or account suspension was taken. Although the 117th Congress is nearing its final days, examples of bipartisan bills addressing transparency include the Platform Accountability and Transparency Act, the Filter Bubble Transparency Act, and the Social Media NUDGE Act. Whether through algorithmic disclosure, audit reports, data sharing, or appeals processes, policymakers are requesting meaningful publicly available information from these platforms that can then be studied by academia, civil society researchers, the media, and the public at large.

Child Online Privacy Protection

Another notable area of consensus between Republicans and Democrats is the need to protect children online. In his first State of the Union address, President Biden stated, “It’s time to strengthen privacy protections; ban targeted advertising to children; demand tech companies stop collecting personal data on our children.”2 Policymakers hope to strengthen privacy protections for children and teens, who are especially vulnerable to harmful content and deceptive targeted advertisements. However, a compromise solution for standards and regulations around children’s online privacy could prove difficult. Two bipartisan privacy bills, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, advanced through the U.S. Senate Commerce Committee but did not receive a vote in the full Senate. Building on and updating existing laws, such as COPPA, may be a good first step toward improving children’s safety online. Recently, the Republicans’ Big Tech Censorship and Data Task Force made a specific proposal to raise the age covered under existing law to account for harms experienced not only by children but also by teens online. Though calls for a comprehensive data privacy law have not been answered, there are still opportunities to improve privacy protections for our youth. We expect bipartisan efforts to address children’s and teens’ online safety to continue through this year and into 2023.

States Take Action

Many U.S. states are responding to stalled federal regulation by taking matters into their own hands. Recently, California Governor Newsom signed two controversial bills, AB 587 and AB 2273, into law, regulating content moderation and strengthening privacy protections for children. Both laws face tremendous backlash from tech industry groups, which will most likely sue on the grounds that the laws violate the First Amendment. Similarly, high-profile laws in Texas and Florida broadly ban “viewpoint” censorship and the deplatforming of political candidates. After lengthy back-and-forth battles between the tech industry and the courts, both laws are currently blocked while petitions in their cases are pending before the Supreme Court. Last month the Texas law (HB 20) was upheld as constitutional by the Fifth Circuit Court of Appeals, conflicting with a decision by the Eleventh Circuit Court of Appeals, which ruled Florida’s law (SB 7072) unconstitutional. The contradictory rulings among the appeals courts make it likelier that the Supreme Court will weigh in. Overall, a wave of U.S. states is moving to target social media platforms on content moderation, children’s privacy, and political censorship; however, many state lawmakers have specific partisan agendas. This momentum from state governments is gaining national attention and could influence federal policymaking.


Regardless of which party wins the midterms, 2023 offers a fresh start for Democrats, Republicans, and the White House to discuss Section 230 reform. Both parties are demanding Big Tech accountability and ultimately aim to foster a safer online environment and protect shared values. Transparency legislation and children’s privacy laws hold real potential to bridge the partisan divide over tech regulation.
