Section 230, Free Speech, and Moderation: A Plain‑English Guide
I’ve heard several friends call social media “the new town square” — a big public space where anyone can say anything. Is it, though? Facebook, X/Twitter, YouTube, and TikTok aren’t public parks that a city runs. They’re privately owned spaces run to pay dividends to their shareholders. That means the people who own and operate these platforms get to write the rules. They decide what behavior to allow, and kick people out who break those rules. Section 230 doesn’t turn them into government-run town squares. It’s the legal rule that lets them host billions of posts from all of us without being dragged into court for everything we say.
Once you remember that these are private spaces, a lot of the shouting about censorship and free speech starts to sound different. The First Amendment limits what the government can do to your speech. It doesn't restrict what a private company can do in its own living room. Section 230 sits on top of that. It's the rule that says, in most cases, online services aren't legally treated as the speaker of what you post; you are. In the rest of this post, I'm going to translate that rule into plain English and show how courts have actually applied it. Then I'll get to what some politicians want to change about it.
The One-Sentence Version
A simple definition of Section 230 is that it’s a 1996 law that generally says websites and apps aren’t legally responsible for what their users post — the users are. The law applies to everything from social media and forums to comment sections and marketplaces. If someone misrepresents something on Facebook Marketplace, Facebook is not responsible for that. If someone lies about you in an online forum, that forum is not responsible for that.
The Actual Language, De-Jargoned
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” is the critical portion. A “provider or user” can be a website, an app, an Internet service provider (ISP), or even people who share posts. An “interactive computer service” is any service that lets multiple users access a server. That includes social networks and forums. An “information content provider” is the person who actually writes or uploads the content.
What Section 230 Does in Practice
You see material online all the time that you agree with, disagree with, or partially agree with. You see information that is useful, frivolous, true, and completely fabricated. All of it flows under the protection of Section 230. If you're running for office in a town and your opponent posts something defamatory about you on a social network, you can't sue the network. You need to sue the opponent. Platforms and users can often share, host, or leave up content without automatically becoming legally responsible for it. Now, if you pass something along that is not true, you may be responsible for that. The platform is not.
Why Congress Created It
Section 230 is the brainchild of Republican Chris Cox and Democrat Ron Wyden. It's a rare bipartisan effort with a dual purpose: protect and encourage the growth and usefulness of the emerging Internet, while holding the right people accountable for published content. It's part of the 1996 Communications Decency Act, which, among other things, addressed the expanded availability of pornography on the emerging web. Wyden and Cox wanted to ensure that, while we protect minors from content intended for adults, we don't force platform owners to become content police. Their concern was that making platform owners responsible for the content published on their platforms would result in over-moderation. This would bring us an Internet that was "safe," but also very boring and essentially useless. Platform owners, fearing backlash in the form of lawsuits, would be pushed to scrub their content of anything remotely offensive, controversial, or not of universal interest or acceptance. Section 230 reduces the chilling effect of potential lawsuits on online speech.
The Two Big Pillars of Section 230
When we think of content moderation on our favorite platform, we generally consider two things: removing content, and not removing it. It’s important to know that both actions are protected.
Pillar 1
Pillar 1 (230(c)(1)) protects a platform when it doesn't remove third-party content. The platform is not treated as the publisher of what people post, the way a magazine or newspaper is. You are one party, the platform is another, and everyone who is neither you nor the platform is a "third party." For example, Floors Done Right is a website. If Jenny, Joe, John, or Jane posts on the Floors Done Right forum that the correct color for floors is green, that opinion belongs strictly to Jenny, Joe, John, or Jane. It is not a position taken by Floors Done Right or the owners of the website. If you believe that the correct color for floors is blue, you don't get to sue Floors Done Right for hosting incorrect information in its forum. Floors Done Right is responsible for its own content in its informational sections, but not for what people post in the forum.
Pillar 2
Pillar 2 (230(c)(2)) grants protection when a platform does decide to remove or restrict content in good faith. This covers spam, hate speech, nudity, and the like, but it also covers things that the platform owner may simply consider offensive. So yes, that platform that removed your post about a politician was well within its rights, even if you don’t think you violated any published terms of service or community standards.
Many people would like to see one pillar or the other strengthened or weakened, but that misses the key idea. Wyden and Cox weren't trying to make it easier or harder for platforms to promote or stifle either Republican or Democratic opinions. They were trying to ensure that platforms could act independently without having to walk the line between "no rules whatsoever" and facing constant lawsuits. They wanted to protect the Internet and its users, not political interests. Codifying the rights of platform owners to encourage speech on a medium that was going to become increasingly important, regardless of the party in power, helped make that possible.
What Section 230 Does Not Do
At first blush, Section 230 may appear to widen the lawless Wild West even further. It doesn't. The law does not protect platforms from federal criminal law, intellectual property claims, or certain sex-trafficking laws; those are carved out from the civil-suit protections that Section 230 does offer. The important thing is that a platform is still responsible for obeying every law that applies to it.
Section 230 also does not remove all accountability whatsoever. Speakers are accountable for their speech on any platform. Any person who posts harmful or illegal content can be sued civilly or prosecuted criminally. The distinction is that the responsibility for the content itself lies with the person who created it and put it up there. The user becomes the “publisher” of the content.
Here’s an example that takes us out of social media, because social media is really a very small part of the Internet. You’re reading this on my blog. My blog is a WordPress blog, hosted on Bluehost. Neither WordPress nor Bluehost can be sued civilly for anything I write, nor for your comments on my posts. I am the publisher. However, if I were to post illegal material, Bluehost could be criminally prosecuted if they didn’t take action after they knew about it. Deliberately looking the other way would be their crime, not the act of selling me hosting services. Section 230 would protect Bluehost against civil suits. It would not protect me as the content publisher from either criminal prosecution or civil suits.
How Section 230 Affects Your Everyday Internet Use
You probably don't realize how much you benefit from Section 230. It's a big part of what makes today's Internet such a useful tool. Section 230 is the reason you can post reviews, comments, and social posts without every single site needing a huge legal team to pre-approve each word. It's also why sites can have community guidelines, block trolls, and remove harmful content without automatically admitting legal responsibility. The reason some Facebook groups approve members before letting them in, and can kick them out for misbehaving, points back to Section 230. It's why there are some Reddit communities you probably don't want to venture into, but others with truly great information. It's also why you can find user-generated information about products that helps you make informed purchases. Without the Section 230 protections, the people who run these sites might not feel comfortable offering them.
The Big Fights About Changing It
Politicians from both parties criticize Section 230, but often for opposite reasons: some cite too much moderation, others too little. Some reform proposals would make platforms more liable for certain harms by conditioning the law's protections on certain moderation practices. The concern is that these changes could affect not just tech giants like Facebook and Twitter/X, but one-person shops like this blog. We also need to be aware of the Law of Unintended Consequences. Here's what I mean.
Some reform bills say that platforms only deserve Section 230 protection if they follow "best practices" for finding harmful content like child abuse material. For encrypted services, that's a problem: end-to-end encryption means that only the sender and receiver can read a message, so the only way to find that content at scale would be to scan messages or build in backdoors.
That leaves a trade-off. We can keep strong encryption (the backbone of the privacy that makes online transactions secure) and risk losing Section 230 protection, along with facing more lawsuits. Or we can weaken encryption so platforms can scan everything and keep the legal shield. This is a truly bad position to put anyone in. Civil liberties groups and technical experts warn that it would place pressure, quietly but definitely, on companies to weaken or drop strong encryption. That would harm security and privacy for everyone, not just the bad guys. This is why many security experts say that changing Section 230 the wrong way could endanger encrypted communications, even if the law never mentions encryption by name. I'm all in favor of catching sex traffickers and child pornographers. However, there is no way to weaken encryption just for law enforcement. Weakening encryption weakens all encryption.
What’s Next for Section 230?
Court interpretation of law is always ongoing, legislation moves with new parties in control, and AI throws monkey wrenches into everything. In the U.S. Congress, we see active efforts to sunset the section unless lawmakers add some reforms to it. Court cases at the Supreme and circuit court levels bring other aspects into the conversation, like the algorithms some platforms use to encourage user engagement or to keep people on the platform. Then we bring in Artificial Intelligence, which courts have ruled isn’t a “person,” but generates content at the behest of the prompter. All of these situations have implications for our daily use of the Internet, so they’re going to be worth watching.
Your Turn
Okay: the platform isn't you, moderation is protected, and crimes and intellectual-property issues are exceptions. Any change to Section 230 risks changing the whole user-generated World Wide Web, small blogs like mine included. Take some time to think about which problems you'd like to see handled by better enforcement of existing laws, and which might call for changing Section 230 itself. Let me know what you think by leaving me a comment. I apply a very light hand to moderating comments 🙂. (But I don't tolerate spam.)
If you want to read more, here are a few links, as well as one link on the “keep it” side and one on the “repeal it” side.
47 USC 230: Protection for private blocking and screening of offensive material
30 Years of Section 230: Why We Still Need It for a Safer Internet – Internet Society
Rep. Jimmy Patronis' PROTECT Act seeks to repeal Section 230
My photography shops are https://www.oakwoodfineartphotography.com/ and https://oakwoodfineart.etsy.com, my merch shop is https://www.zazzle.com/store/south_fried_shop.
Check out my New and Featured page – the latest photos and merch I’ve added to my shops! https://oakwoodexperience.com/new-and-featured/
Curious about safeguarding your digital life without getting lost in the technical weeds? Check out ‘Your Data, Your Devices, and You’—a straightforward guide to understanding and protecting your online presence. Perfect for those who love tech but not the jargon. Available now on Amazon:
https://www.amazon.com/Your-Data-Devices-Easy-Follow-ebook/dp/B0D5287NR3

