House Energy and Commerce Subcommittee Hosts Less-than-Festive Parade for Big Tech Accountability


The House Energy and Commerce Committee’s Subcommittee on Consumer Protection and Commerce met on Tuesday, March 1, for a legislative hearing on “Holding Big Tech Accountable: Legislation to Protect Online Users.” The hearing coincidentally fell on Mardi Gras, and the subcommittee included five pieces of legislation in its King Cake:

  • Banning Surveillance Advertising Act of 2022 (H.R. 6416)
  • Algorithmic Accountability Act of 2022 (H.R. 6580)
  • Cooperation Among Police, Tech, and Users to Resist Exploitation Act (H.R. 6755)
  • Increasing Consumers’ Education on Law Enforcement Resources Act (H.R. 6786)
  • Digital Services Oversight and Safety Act of 2022 (H.R. 6796)

Much like a parade on Fat Tuesday, the hearing featured a procession of floats for various issues: the online drug market float, the cyberbullying float, the facial recognition float, and the Section 230 float. The hearing was less than festive, but two issues stood out.

The Disinformation Float

In most parades, at least one float blares the song “Don’t Stop Believin’” by Journey on repeat. And the hearing was certainly no exception. The subcommittee’s emphasis on disinformation was at max volume. Subcommittee Chair Jan Schakowsky (D-IL) said in her opening statement:

For example, Russian state-owned media is targeting Spanish speakers around the globe with disinformation … when it comes to what is going on in the invasion of Ukraine. Previously, social media abetted a genocide in Myanmar and a deadly insurrection on January 6th.

Disinformation tailored specifically to Spanish speakers was a recurring topic of discussion, with Facebook receiving the most notable criticism. Rep. Tony Cardenas (D-CA) said:

Thanks to whistleblowers like Frances Haugen, we know that Facebook directed 87 percent of their investments on combatting misinformation to English language content in spite of the fact that only 9 percent of Facebook users are English speakers. What a disparity.

There are conflicting reports on the percentage of English-speaking Facebook users. In 2020, Facebook indicated that less than two thirds of its users speak a language other than English. One social media management platform claims that English-speaking users represent 56 percent of Facebook’s audience.

Other data points also reveal why English-language content may garner more investment on the content moderation front. Most importantly, Facebook’s revenue from advertisements and other services overwhelmingly flows from the U.S. and Canada. In the fourth quarter of 2020, the average revenue per user from the U.S. and Canada dwarfed that from Europe, the Asia-Pacific region, and the rest of the world.

Considering the outpouring of criticism directed at Facebook for its handling of the COVID-19 pandemic and the U.S. presidential elections, it’s no surprise that the company directed considerable resources toward English-language content.

Shortcomings in Facebook’s foreign-language content moderation are also thought to have contributed to ethnic violence in Myanmar. Two lawsuits, one filed in California seeking $150 billion and another filed in London, assert Facebook’s liability for the tragedy. Naturally, there are questions surrounding the suits’ viability. It’s also worth considering how Myanmar’s history of ethnic violence influenced recent events.

Rep. Lori Trahan (D-MA) offered her recently introduced legislation, the Digital Services Oversight and Safety Act of 2022 (H.R. 6796), as a solution. She said:

A handful of U.S. companies have become monopolies. They’ve optimized their platforms solely for ad revenue and in turn they’ve become breeding grounds for the spread of weaponized disinformation, hate speech, and content that harms our children. 

The bill would establish the Bureau of Digital Services Oversight and Safety within the Federal Trade Commission (FTC) and create reporting requirements for risk assessment and risk mitigation. Within 18 months of enactment, large covered platforms would be required to provide the FTC with impact assessments associated with content moderation decisions and algorithms.

H.R. 6796 would also require the FTC to hire an additional 500 employees, including 80 “technologists,” 80 “sociotechnical experts,” and 15 constitutional lawyers. Some committee members indicated that the legislation would increase the size of the FTC by nearly one-third, an arguably large expansion of government.

But the proposed FTC beef-up is modest compared to investments made by private enterprise. Facebook has reportedly spent more than $13 billion on safety and security efforts since 2016 and now has over 40,000 employees and contractors assigned to content moderation. Even so, some reports say that Facebook makes over 300,000 content moderation mistakes a day. It’s hard to imagine the FTC having technical knowledge or expertise that Facebook lacks.

The challenges involved with content moderation are sure to continue and evolve. Facebook is taking steps to close the purported language divide in combating disinformation, but it is not an easy task. What is the best way to accomplish this? No one knows. But it’s clear that the federal government is not the entity best suited to do so.

The Surveillance Advertising Float

Surprisingly, New Orleans prohibits nearly all display and use of advertising during Mardi Gras parades. That means no branded beads, koozies, or stadium cups can be thrown to attendees. Fortunately, the hearing’s discussion of surveillance advertising did not take quite so strict an approach. However, no cookies were thrown from this float.

Surveillance advertising, also called targeted or behavioral advertising, directs personalized ads to specific users based on prior online activity and other personal information. This is often done using cookies. The word “surveillance” does sound scary and nefarious, but it’s difficult to pinpoint the exact harm that legislators are attempting to remedy or prevent. Ads can be annoying. That’s why New Orleans banned the practice on Mardi Gras floats: the ads distracted from the cultural and sometimes religious significance of the holiday. To quote the 2010 film The Social Network: “Ads aren’t cool.”
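For readers unfamiliar with the mechanics, here is a minimal sketch of how a third-party cookie lets an ad network link one user’s visits across unrelated sites. The AdTracker class, the site URLs, and the rest are hypothetical inventions for illustration; real ad networks are vastly more elaborate. It is only a conceptual picture of the tracking this kind of legislation targets.

```python
# A minimal, hypothetical sketch of cross-site tracking via a third-party
# cookie. AdTracker and the site URLs are illustrative, not a real API.
from collections import defaultdict
from uuid import uuid4


class AdTracker:
    """Stands in for an ad network whose scripts load on many websites."""

    def __init__(self):
        # Maps each cookie ID to the pages where that cookie was seen.
        self.profiles = defaultdict(list)

    def serve_ad(self, cookie_id, page_url):
        # First visit anywhere: assign the browser a unique cookie ID.
        if cookie_id is None:
            cookie_id = str(uuid4())
        # Every later visit to any participating site sends the same cookie
        # back, so the network keeps appending to a single profile.
        self.profiles[cookie_id].append(page_url)
        return cookie_id


tracker = AdTracker()
cid = tracker.serve_ad(None, "news.example/politics")
cid = tracker.serve_ad(cid, "shop.example/running-shoes")
cid = tracker.serve_ad(cid, "travel.example/new-orleans-hotels")

# One cookie now keys a cross-site interest profile, which is the raw
# material for a "personalized" ad.
print(tracker.profiles[cid])
```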

The concerns surrounding behavioral advertising are nothing new. However, privacy between consenting parties is not something for Congress to legislate. Rep. Anna Eshoo (D-CA) waded into the hearing to speak about her legislation, the Banning Surveillance Advertising Act of 2022 (H.R. 6416). 

My bill does go after the root of the social media problem, which is a toxic business model. Critics say there can’t be an Internet economy without surveillance ads. They’ve really poured it on. … But I view it another way. And I think so does DuckDuckGo, because it’s a counterexample. 

The search engine DuckDuckGo was heralded throughout the hearing for its privacy-based business model and presented as what should become the standard. The company should be commended for its splash in the search engine space. Founded in 2008, DuckDuckGo describes itself as a “privacy technology company” that offers “seamless protection from surveillance ads by blocking trackers.” It has since become the fourth-largest search engine in the U.S., handling over 3 billion searches a month globally.

Rep. Eshoo is correct on one point: DuckDuckGo is a “counterexample.” Most importantly, it’s an example of a free-market solution. It provides an alternative, and consumers choose to use it. The playground game Duck, Duck, Goose has a purpose: choice. The Banning Surveillance Advertising Act of 2022 would eliminate that choice and make every search engine like DuckDuckGo. And what would be the point of playing Goose, Goose, Goose?

This Mardi Gras parade was not the jolliest of celebrations to occur this past Fat Tuesday. And, perhaps fortunately, it was not the most consequential. But it does signal lawmakers’ ever-increasing focus on large technology platforms and their business models.