Section 230 of the Communications Decency Act, passed in 1996 as part of the Telecommunications Act, has become a political lightning rod in recent years. The law shields online platforms from liability for user-generated content while allowing moderation in good faith.
Lawmakers including Sens. Lindsey Graham, R-S.C., and Dick Durbin, D-Ill., now seek to sunset Section 230 by 2027 in order to spur a renegotiation of its provisions. According to reports, the senators are expected to hold a press event before April 11 about a bill that would start a timer on reforming or replacing Section 230. If no agreement is reached by the deadline, Section 230 would cease to be law.
The debate over the law centers on balancing accountability for harmful content with the risks of censorship and stifled innovation. As a legal scholar, I see dramatic potential effects if Section 230 were to be repealed, with some platforms and websites blocking any potentially controversial content. Imagine Reddit with no critical comments or TikTok stripped of political satire.
The law that built the internet
Section 230, often described as “the 26 words that created the internet,” arose in response to a 1995 New York court ruling, Stratton Oakmont v. Prodigy, that penalized a platform for moderating user content. The key provision of the law, (c)(1), states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This immunizes platforms such as Facebook and Yelp from liability for content posted by users.
Importantly, Section 230 does not offer blanket immunity. It does not shield platforms from liability under federal criminal law or for intellectual property infringement, sex trafficking, or cases where platforms codevelop unlawful content. At the same time, Section 230 allows platform companies to moderate content as they see fit, letting them block harmful or offensive content even when that content is protected by the First Amendment.
Some critics argue that the algorithms social media platforms use to feed content to users are a form of content creation and should be outside the scope of Section 230 immunity. In addition, Federal Communications Commission Chairman Brendan Carr has signaled a more aggressive stance toward Big Tech, advocating for a rollback of Section 230’s protections to address what he perceives as biased content moderation and censorship.
Censorship and the moderation dilemma
Opponents warn that repealing Section 230 could lead to increased censorship, a flood of litigation and a chilling effect on innovation and free expression.
Section 230 grants complete immunity to platforms for third-party activities regardless of whether the challenged speech is unlawful, according to a February 2024 report from the Congressional Research Service. In contrast, immunity via the First Amendment requires an inquiry into whether the challenged speech is constitutionally protected.
Without immunity, platforms could be treated as publishers and held liable for defamatory, harmful or illegal content their users post. Platforms could adopt a more cautious approach, removing legally questionable material to avoid litigation. They could also block potentially controversial content, which could leave less space for voices of marginalized people.
MIT management professor Sinan Aral warned, “If you repeal Section 230, one of two things will happen. Either platforms will decide they don’t want to moderate anything, or platforms will moderate everything.” The overcautious approach, sometimes called “collateral censorship,” could lead platforms to remove a broader swath of speech, including lawful but controversial content, to protect against potential lawsuits. Yelp’s general counsel noted that without Section 230, platforms may feel forced to remove legitimate negative reviews, depriving users of critical information.
Corbin Barthold, a lawyer with the nonprofit advocacy organization TechFreedom, warned that some platforms might abandon content moderation to avoid liability for selective enforcement. This would result in more online spaces for misinformation and hate speech, he wrote. However, large platforms would likely not choose this route to avoid backlash from users and advertisers.
A legal minefield
Section 230(e) currently preempts most state laws that would hold platforms liable for user content. This preemption maintains a uniform legal standard at the federal level. Without it, the balance of power would shift, allowing states to regulate online platforms more aggressively.
Some states could pass laws imposing stricter content moderation standards, requiring platforms to remove certain types of content within defined time frames or mandating transparency in content moderation decisions. Conversely, some states may seek to limit moderation efforts to preserve free speech, creating conflicting obligations for platforms that operate nationally. Litigation outcomes could also become inconsistent as courts across different jurisdictions apply varying standards to determine platform liability.
The lack of uniformity would make it difficult for platforms to establish consistent content moderation practices, further complicating compliance efforts. The chilling effect on expression and innovation would be especially pronounced for new market entrants.
While major players such as Facebook and YouTube might be able to absorb the legal pressure, smaller competitors could be forced out of the market or rendered ineffective. Small or midsize businesses with a website could be targeted by frivolous lawsuits. The high cost of compliance could deter many from entering the market.
Reform without ruin
The nonprofit advocacy group Electronic Frontier Foundation warned, “The free and open internet as we know it couldn’t exist without Section 230.” The law has been instrumental in fostering the growth of the internet by enabling platforms to operate without the constant threat of lawsuits over user-generated content. Section 230 also lets platforms organize and tailor user-generated content.
The potential repeal of Section 230 would fundamentally alter this legal landscape, reshaping how platforms operate, increasing their exposure to litigation and redefining the relationship between the government and online intermediaries.

The post “What would happen if Section 230 went away? A legal expert explains the consequences of repealing ‘the law that built the internet’” by Daryl Lim, Professor of Law and Associate Dean for Research and Innovation, Penn State was published on 04/09/2025 by theconversation.com