CDA 230 and the Ejusdem Generis Rule
We don't need to "reform" Section 230; we just need to apply it exactly as written.
Imagine you read an article in which an astronomer talked about a phenomenon observed in “planets, asteroids, and other large objects.” And then you heard someone say “a bus is a pretty large object, so this phenomenon applies to buses too.”
If you can understand why that’s obviously a ridiculous thing for them to say, you have an intuitive grasp of an important rule of legal interpretation known as Ejusdem Generis. The Latin name means “of the same kind,” and it’s typically stated formally like this: when a general continuation [i.e., “and other examples”] follows a list of specific examples, the continuation must be interpreted as referring only to other examples of the same kind as the things the specific examples were referring to. (So in the example above, a bus doesn’t work because it’s not a celestial body like a planet or an asteroid, but a comet would fit the requirements.)
This rule is typically used by courts in the interpretation of laws and contracts, to keep parties from abusing general continuations by turning them into infinitely-scoped catchalls that can apply to whatever somebody might dream up in a frivolous (or malicious) legal proceeding.
Hold onto that concept. It’ll be relevant soon.
What is CDA 230?
There’s been a lot of talk over the past few years about a law referred to as “CDA 230” (or simply “section 230”) and what, if anything, should be done with it given that so many Big Tech companies are using it as legal cover to censor conservatives. This has provoked a wide variety of opinions on the subject. So let’s look at what it is and where it came from.
Back in 1996, Congress passed the Communications Decency Act, a common-sense piece of legislation that imposed the same basic anti-obscenity rules upon internet communications that television had been subject to for decades. (Contrary to what some have said, it did not ban pornography on the Internet. What it did was require anyone distributing pornography to take verification measures to make sure that it was not available to children, much like a clerk in a physical store might check the ID of someone attempting to buy pornographic movies or magazines.) It also imposed some basic rules regarding excessively violent content and harassing communications, the latter not too different from existing laws regarding harassing phone calls (and remember that in the mid-90s, virtually all home internet connections ran over phone lines anyway).
It passed through Congress with overwhelming bipartisan consensus in both houses, and was signed into law by President Clinton. And why shouldn’t it have? It made perfect sense.
So of course Leftist activists decided it had to go.
The ACLU immediately — literally the very same day — sued with the spurious theory that 1) adults have a First Amendment right to access porn if they want to, 2) age verification might prevent some legitimate adult somewhere from getting at the porn they want to see, 3) and therefore the whole thing has to be thrown out, in the name of free speech or whatever. Oh, and also the law’s terms were “vague” and “not well defined,” despite having well-understood legal meanings elsewhere in statute and case law.
Unfortunately, they managed to persuade the Supreme Court of this nonsense. Read the decision sometime; it’s a real piece of work. It contains such gems as “the interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit” of blocking minors from accessing porn — conspicuously ignoring sociological research that was pretty well-established already in 1997 and has only gotten more conclusive since — and “[c]ommunications over the Internet do not ‘invade’ an individual’s home or appear on one’s computer screen unbidden. Users seldom encounter content ‘by accident.’ ” (Yes, popups and spam both existed back then, and anyone who knew anything about technology could already see they were problems that were getting worse, not better, as time went on.)
So the Supreme Court issued a disastrous ruling, flying in the face of multiple established precedents saying that it was legitimate for the government to regulate obscenity, and torched the anti-obscenity provisions of the CDA. One of the parts it left intact, though, was section 230 of the CDA, which was about private action rather than government regulation.
Section 230 came about as a response to some concerning case-law precedents from the early days of the Internet. There had been two notable court cases involving online discussion forums and defamation liability. In the first, Cubby v. CompuServe, somebody posted libelous material on an unmoderated CompuServe forum, and the injured party sued CompuServe (not the poster). The case got thrown out: because the forum was unmoderated, CompuServe had no knowledge of the material, so it made no sense to hold them responsible for it.
In the second, Stratton Oakmont v. Prodigy, somebody posted libelous material on a Prodigy forum and the injured party sued Prodigy, but this time they won. Because Prodigy moderated its forums, the argument went, anything the mods didn’t take down Prodigy had officially approved of, and thus assumed legal liability for, much as the editor of a newspaper assumes legal liability for the content of the articles they approve.
So when faced with one decision that said “unmoderated content = no liability” and another that said “moderated content = full liability,” forum operators have a really ugly perverse incentive not to have any mods at all: that leads to even more bad content, but the bad content isn’t the forum administrator’s problem as long as they don’t try to fix it. This is clearly a terrible outcome, so Section 230 was created to fix it. There are several points to it, but here’s the relevant stuff:
The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
It is the policy of the United States … to promote the continued development of the Internet and other interactive computer services and other interactive media
No provider or user of an interactive computer service shall be held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected
(Direct quotes from the law, not paraphrased.)
That’s Not What It’s Supposed To Do
So let’s recap. Section 230 was created as a part of an anti-obscenity law, but it wasn’t meant to deal with obscenity. Consistent with Congress’s chronic inability to stay on topic, it was meant to deal with spurious defamation cases where people who had been defamed by a third party went after forum administrators rather than the person who actually defamed them. Except that the language doesn’t talk about defamation at all; it talks about protecting forums whose moderators take down material that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable,” in the middle of a law regulating material that is obscene, lewd, lascivious, filthy, excessively violent, harassing, etc. But then the pieces of the law that regulated said material were judicially-opinioned out of existence. Confused enough yet?
Then about 20 years later, under immense Internet pressure to Do Something™ after Donald Trump’s electoral victory, social media companies started censoring mainstream conservative opinions that extreme-left activists called “hateful” and “harmful.” When called on it, they pointed to Section 230 and claimed they had legal immunity for their actions because the content they took down was “otherwise objectionable,” abusing a law designed to immunize platforms against spurious defamation charges (which is not what anyone is actually claiming here). In response, a lot of Republicans in Congress have been saying that because Section 230 is providing legal cover for Leftists at Big Tech companies to censor us, we should throw it out.
Not so fast. Remember the Ejusdem Generis rule? It tells us that the general wording “otherwise objectionable” must be interpreted only as including things “of the same kind” as the specific examples given. And what kind of things did Section 230 use as its specific examples?
Content that was regulated under the CDA. Content that is considered inherently offensive and has a long history of being subject to regulation due to its harmful-per-se nature. “Political opinions I don’t like” are not part of this “same kind.” Indeed, the CDA elsewhere specifically forbids material from being classified as obscene or objectionable on the basis of “political or religious content,” and Section 230 itself states as its rationale for existing that the Internet provides “a forum for a true diversity of political discourse” and we want to actively encourage more of that.
All of this is antithetical to the politically-motivated censorship that Leftists dishonestly invoke CDA 230 to justify.
Just Follow The Text
Section 230 has done a lot of good. By rectifying the disastrous precedents set by the earlier defamation cases, it’s made it possible for free communication to exist online. Any site that runs a forum, any site that runs a comments section, any site (like Substack, Medium, or Blogger) that provides a platform to let end-users publish their own material, all of that owes its existence to CDA 230. Getting rid of it would be disastrous; we don’t want to throw the baby out with the bathwater.
What is needed is to interpret it exactly as written. The principle of Ejusdem Generis is crystal clear here: Big Tech companies have no claim whatsoever to Section 230 protection for politically-motivated censorship.
We don’t need to get rid of Section 230. We just need to actually enforce it. And rolling back Reno v. ACLU would really help as well.