Reynaldo Gonzalez sued Google under the Anti-Terrorism Act over the death of his daughter in the 2015 ISIS attacks in Paris. His claim is that YouTube didn't merely host ISIS recruitment videos (hosting being clearly protected under Section 230) but affirmatively recommended that content to viewers via algorithms keyed to users' other interests. The lower courts found that Section 230 still shielded YouTube from liability, but the U.S. Court of Appeals for the Ninth Circuit expressed some hesitation about that conclusion, leaving the question sufficiently open for Supreme Court review.
Section 230(c)(1) reads:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
This is plain language: a digital platform is not responsible for content posted by a third party. If you post something on Twitter, you are legally responsible for that content and Twitter is not. That rule holds even if Twitter is actively removing other third-party content that, for whatever reason, it doesn't want to host (see the First Amendment). But Gonzalez challenges the lower courts' interpretation that this liability shield also applies when a platform's algorithms recommend that content to a viewer.
Algorithmic recommendation is so ubiquitous, and so fundamental to shaping the character of platforms and their users' experience, that it's hard to see how it could be carved out of Section 230. The vast majority of social media platforms do not operate as dumb pipes; they curate their content. Hosting and organizing expressive content is the basis of their revenue stream and intrinsic to their business model.
A reversal of the lower court decisions in Gonzalez would incentivize platforms to remove (and decline to recommend) far more third-party content. That's especially wild because the Court might also agree to take a case resolving the split between the Eleventh Circuit, which struck down a Florida law that would prevent platforms from taking down certain kinds of content, and the Fifth Circuit, which upheld a similar Texas law. Those state laws, driven by concerns about discrimination against conservative third-party content, largely ignore the existence of Section 230.
Making platforms liable for third-party content while letting states restrict what content platforms are legally able to remove makes little sense. Let’s hope that principles prevail over politics as the Court renders its decisions.