Section 230: Are Online Platforms Publishers, Distributors, or Neither?

“Well, does that constitute publishing? … I don’t know where you’re drawing the line.” — Justice Samuel Alito, oral argument in Gonzalez v. Google LLC, February 21, 2023

Section 230 of the Communications Decency Act and “content moderation” (the removal of harmful user content from online platforms) have become important policy debates in recent years. Should social media platforms be regulated like newspapers, bookstores, or phone companies? Each of these analogies invokes a different body of law, but online platforms do not fit neatly into traditional communication categories. A social media platform may be neither a publisher nor a distributor. This post navigates these nuances because how courts and lawmakers address Section 230 could fundamentally change the future of the internet.

Overview

In the 1990s, at the dawn of the internet, the court in Cubby, Inc. v. CompuServe Inc. found that an online messaging board was a passive distributor of content rather than a publisher. A few years later, in Stratton Oakmont, Inc. v. Prodigy Services Co., another court ruled differently, holding that an online bulletin board was liable as a publisher because its automatic software screening program constituted “editorial control.” These conflicting rulings posed a unique predicament: were internet service providers (ISPs) publishers or distributors? The distinction mattered because, historically, publishers and distributors faced different liability standards. This predicament led to Section 230’s creation.

Section 230(c)(1) states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230’s legal protections were created to encourage internet innovation by shielding providers from an influx of lawsuits over user content. Fast forward: the internet has grown into a thriving place to learn, connect, and share ideas, and a vibrant, competitive free market. Internet businesses like Apple and Microsoft have grown into some of the world’s biggest and most powerful companies.

Section 230 has been the target of increasing public scrutiny over the last few years. When the law was enacted, Congress could not have fathomed the internet as we know it today, and there are concerns that the digital environment is a lawless land with no accountability or oversight. As policymakers, advocates, academics, and industry experts attempt to address the question of Section 230 reform, an accurate understanding of how these liability laws work is essential to the debate.

Differences between Publisher, Distributor, and Internet Service Provider Liability

Let’s dive deeper into three types of legal liability for content providers:

- Publisher liability: Publishers, such as newspapers, exercise editorial control over the content they print and can be held liable for unlawful material (e.g., defamation) even if they did not know it was unlawful.
- Distributor liability: Distributors, such as bookstores and newsstands, do not review the content they carry and are liable only if they knew or had reason to know the content was unlawful.
- ISP liability under Section 230: Online platforms are generally immune from liability for content created by their users, regardless of whether they choose to moderate it.

What About the First Amendment?

The Section 230 debate is often entangled with questions about a user’s right to free speech online. There seems to be a broad misunderstanding about what is and isn’t protected by the First Amendment. The First Amendment prohibits the government from interfering with free speech, but it also protects the rights of private businesses to exercise editorial discretion. Both the Constitution and Section 230 protect social media companies’ right to moderate content as they see fit, and the government cannot force them to suppress or remove protected speech.

Given the internet’s vital role in promoting free expression and advancing the free flow of information, some question whether internet platform companies should be considered essential public goods. Historically, public transportation services and telecommunications companies, like phone networks, have been considered “common carriers.” Under common law, common carriers are open conduits that transport a vast range of information, people, or goods to the public indiscriminately and exercise no control over what they carry. Common carriers are subject to nondiscrimination, or “must-carry,” obligations. If online platforms were regulated as common carriers, companies could not engage in content moderation, even when content violates the platforms’ terms of service. Such regulations have found support from Justice Clarence Thomas and in some U.S. states.

Recent high-profile laws in Texas and Florida broadly prohibit social media platforms from removing content based on political viewpoint. Tech industry lobbying groups argue that both laws are unconstitutional because they violate private companies’ First Amendment right to make their own publishing decisions. If social media companies were prohibited from moderating speech, the implications could be far-reaching, including an influx of dangerous and graphic content. After lengthy legal battles, both laws are currently pending review by the Supreme Court.

The Supreme Court Takes on Section 230

Gonzalez v. Google involves YouTube’s use of targeted recommendations. The platform’s algorithms allegedly amplified terrorist content and radicalized ISIS sympathizers into carrying out the 2015 coordinated terrorist attacks around Paris, which killed Nohemi Gonzalez, among many others. At the heart of the case is whether algorithms actively create or develop content, or merely organize, filter, and distribute it.

Targeted recommendations are essential to how we receive information online. Historically, courts have interpreted the scope of Section 230 immunity very broadly. Most courts have consistently held that tech companies will not lose their immunity under Section 230 for exercising traditional publication activities, such as displaying, removing, or sorting third-party content. However, whether algorithmic amplification counts as a traditional editorial function is up for debate.

The ruling could have serious unintended consequences if it leads to any reinterpretation of Section 230. If the scope of Section 230 is narrowed, platforms could be held to liability standards similar to publishers’, and companies would likely engage in greater censorship to ensure that no piece of online content violates the law. If platforms were instead held to a distributor standard, they may be incentivized to avoid liability by taking a “hands-off,” under-moderation approach that prevents them from gaining knowledge of harmful content, leading to the proliferation of dangerous material. Either way, users could expect significant changes to their experience online.

Next Steps

BPC and other experts have long argued that Section 230 reform should be addressed legislatively through Congress rather than in court. While bipartisan consensus around Section 230 seems unlikely, political gridlock may shake loose as the Gonzalez v. Google case draws national attention.

Whether change comes from Congress or the Supreme Court, Section 230 faces a defining moment. Understanding legal distinctions between publishers, distributors, and internet service providers is a crucial initial step when considering potential reforms to Section 230.
