A blueprint for digital censorship in the US?

Internal documents from the Center for Countering Digital Hate (CCDH), revealed by journalist Matt Taibbi, showed that the group’s primary focus throughout much of 2024 was to “Kill Musk’s Twitter.” The NGO may not have succeeded in squashing Elon Musk or X ahead of the US election, but elected representatives looking to preserve free speech stateside may find another of the group’s priorities problematic.

Taibbi’s reporting shows that “progress towards change in the USA and support for STAR” was ranked fourth on CCDH’s monthly agenda throughout the past year. The STAR framework from CCDH consists of four elements: “Safety by design,” “Transparency,” “Accountability to democratic and independent bodies,” and “Responsibility for companies and their senior executives.”

The group has already successfully used that framework to influence policy, shaping the UK’s Online Safety Act and the EU’s Digital Services Act. Those laws have since been used, among other things, to target Elon Musk’s social media platform X for hosting “hate speech” and “misinformation.”

Happily, the First Amendment presents constitutional hurdles to implementing the CCDH’s STAR framework here in the US. But that may not have stopped some from trying.

The Kids Online Safety Act (KOSA) is a bill pending in Congress that would create a “duty of care” for online platforms, put the Federal Trade Commission in charge of enforcement, and likely trigger age verification for all users, not just minors. Advocates of the measure say it will keep children safer online, while opponents worry it will have harmful consequences for free speech. KOSA passed the Senate with overwhelming support earlier this year. Its companion bill is still making its way through the House, with questionable chances for success.

But before the House takes further action on the bill, lawmakers should know that CCDH’s STAR framework may well have served as the blueprint for much of KOSA.

A side-by-side comparison of the CCDH’s STAR framework and the House version of KOSA reveals remarkable similarities. Not only do the two documents share similar content in their recommendations, but they also present those recommendations and requirements in the same order.

Perhaps the STAR framework was merely an inspiration for KOSA, but it’s not unreasonable to conclude that it served as the bill’s specific structural blueprint. Either way, it’s easy to imagine KOSA being used to suppress speech in the same ways that other laws built on the STAR framework have been used in other countries.

Keeping kids safe online is a noble and important goal, but giving government bureaucrats control over speech is not the American way. US lawmakers would be wise to consider how similar schemes have been used in other countries before voting for versions of them here.

The STAR framework and KOSA, side by side

STAR: Reorient the Product Design Process: enshrine duties of care to address online harms. (Section 3, Safety by Design, p. 7; 3c, Policy Solutions, p. 10-12)
KOSA: Sec. 102. Duty of Care.

STAR: Set standards for safe product design. [S]tandards should require services to enable their strongest privacy settings and content filters by default.
KOSA: Sec. 102(a). PREVENTION OF HARM TO MINORS.—A high impact online company shall create and implement its design features to reasonably prevent and mitigate the following harms to minors [. . .]. Sec. 103(a). SAFEGUARDS FOR MINORS.—(1) SAFEGUARDS.—[. . .] [A] covered platform shall provide [. . .] the following: [. . .] (B) Limit design features, such as infinite scrolling, auto playing, rewards or incentives [. . .], notifications, badges, push alerts, and other design features [. . .]. (3) DEFAULT SAFEGUARD SETTINGS FOR MINORS.—A covered platform shall provide that [. . .] the default setting for any safeguard described under paragraph (1) shall be the option available on the platform that provides the most protective level of control that is offered [. . .].

STAR: Mandate risk assessments and mitigation plans. Risk assessments involve systemic evaluation of the effects social media platforms have on users’ health and wellbeing. Risk mitigation plans should clearly enumerate steps to address the identified risks.
KOSA: Sec. 105. TRANSPARENCY. Sec. 105(a). [N]ot less frequently than once a year, a covered platform shall issue a public report that describes the reasonably foreseeable risks of harms to minors and assesses the prevention and mitigation measures taken to address such risks based on an independent, third-party audit [. . .].

STAR: Independent audits by third parties and regulators. Risk audits enable an added layer of scrutiny of a digital service’s design. For audits to be effective, trusted third parties must have enough information to understand the inner workings of a product.
KOSA: Sec. 105(e). COOPERATION WITH INDEPENDENT, THIRD-PARTY AUDIT.—[. . .] [A] covered platform shall— (1) provide or otherwise make available to the independent third-party conducting the audit all information and material in the possession, custody, or control of the platform that is relevant to the audit; (2) provide [. . .] all network, systems, and assets relevant to the audit; and (3) disclose all relevant facts to the independent third-party conducting the audit [. . .].

STAR: Empower Users: restrict the usage of manipulative design features, including deceptive engagement patterns, also known as ‘dark patterns.’
KOSA: Sec. 103(e)(2). DARK PATTERNS PROHIBITION.

STAR: Incentivize features that encourage healthier forms of engagement. Social media platforms can nudge users into taking breaks and reconsidering abusive posts. These features should become an integral part of the user experience.
KOSA: Sec. 103(a)(1)(B). Limit design features, such as infinite scrolling, auto playing, rewards or incentives [. . .], notifications, badges, push alerts, and other design features [. . .]. Sec. 103(b). PARENTAL TOOLS.—REQUIREMENTS.—The parental tools provided by a covered platform under paragraph (1) shall include— [. . .] (C) [R]estrict time spent on the covered platform by the minor. DEFAULT TOOLS.—A covered platform shall provide that [. . .] the tools required under paragraph (1) shall be enabled by default.