Executive Order on Social Media Threatens Property Rights and Free Speech
Today’s Executive Order on Section 230 liability protections for online platforms violates the First Amendment and property rights of social media companies, contradicts the most relevant case law, and ignores the actual language of Section 230 of the Communications Decency Act. Here are some examples from the text, with brief comments:
“The emergence and growth of online platforms in recent years raises important questions about applying the ideals of the First Amendment to modern communications technology.”
It really doesn’t, but the Order is rife with confused First Amendment pontificating anyway. These large tech platforms are private property. The First Amendment protects citizens against the suppression of speech by the government, which, ironically, is just what this executive order would institute on the platforms. Just as you do not have the right to hold a political rally in your neighbor’s front yard without permission, neither does an online poster have the right to insist that Twitter, Facebook, or YouTube host all content.
“…these platforms function in many ways as a 21st-century equivalent of the public square.”
The courts beg to differ. The Ninth Circuit Court of Appeals said as much in its February 2020 decision in Prager University v. Google: “Despite YouTube’s ubiquity and its role as a public-facing platform, it remains a private forum, not a public forum subject to judicial scrutiny under the First Amendment.” That decision quoted a recent Supreme Court case that held, “merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints.” Strangely, this most relevant case law was absent from the Order while less relevant cases were cited.
“[The Section 230 (c) (2)] provision does not extend to deceptive or pretextual actions restricting online content or actions inconsistent with an online platform’s terms of service.”
Section 230 says nothing about platforms having liability protection only for moderation actions consistent with their own terms of service. There is no basis for this standard in the law as currently written.
Section 230 protects platforms from lawsuits over third-party posts by recognizing that these hosts do not create or supply that content the way a newspaper or the content’s author does. This acknowledgement of the fundamental difference between traditional publishers and online platforms enables the mostly permissionless, user-driven Internet experience we enjoy today.
Section 230’s liability protections apply even if a platform has content standards or acts to remove material that violates those rules, a point that needed clarifying when the law was passed in 1996. Without this tenet, platforms would have to vet the enormous volume of user-generated content before displaying it because of the increased liability they would assume, which would be prohibitively time-consuming and expensive.
“When an interactive computer service provider removes or restricts access to content and its actions do not meet the criteria of subparagraph (c)(2)(A), it is engaged in editorial conduct … [and it] forfeits any protections from being deemed a ‘publisher or speaker. …’”
Subparagraph (c)(2)(A) of Section 230 says that a platform enjoys liability protection when it acts to restrict access to content that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” “Otherwise objectionable” is the catch-all term that grants broad immunity to platforms. That phrase is so all-encompassing that it would be absurd to think it couldn’t include misleading statements, unsubstantiated claims, opinions, or outright lies.
The law as written makes it almost impossible for a platform to lose its liability protection for third-party content. Deciding when that has happened would be a subjective, politicized mess for any regulator; objectively determining what falls outside the scope of “otherwise objectionable” would be nearly impossible.
For the reasons discussed above and others, courts are unlikely to uphold the Order’s legality. But in the meantime, the Order will be a detriment to these online platforms and their users. The approach is bad for property rights, bad for free speech, and bad for innovation in an industry that’s more critical now than ever.
(This commentary refers to the draft version of the Order.)