The Danger in Blurring the Private and Public Boundaries with Government Regulation


The recent decisions of many technology companies to remove users and customers from their platforms have deeply divided Americans. Many feel censored and discriminated against, fearing their opinions are vanishing from public discourse, while others see these removals as not only justifiable but long overdue. What both sides should worry about most is the harm that codifying these private choices into regulation would bring to all Americans. There's real danger in blurring the private and public boundaries with government regulation.

If these moves to ban individuals and unpopular ideas signal to the next Congress and administration that "big tech" will accept regulation in exchange for favorable treatment, the result will be rules that entrench incumbent market leaders and erect barriers to entry for nascent tech companies; innovation will slow and U.S. consumers will suffer. The more users' frustration grows over content moderation, app store distribution, and cloud hosting decisions, the greater the financial incentive to provide alternatives.

This is how market processes produce solutions to problems like those many now perceive as discrimination. But locking the industry into regulations that established tech firms can afford to comply with (and very likely agree with anyway) means alternatives cannot emerge, which is precisely why established firms have an incentive to seek regulation. Their willingness to work with oppressive regimes in other countries suggests they will do whatever regulators ask of them.

The most immediate regulatory threat to one potential market solution, new social media platforms with alternative content moderation standards, is the repeal of Section 230 of the Communications Decency Act, which shields platforms from liability for content their users post. Contrary to what many advocates of repeal assume, platforms would still be free to take down any content they wished because of their own First Amendment right not to be forced to carry speech. If anything, removing Section 230's liability shield would incentivize platforms to take down more content, not less, to avoid the risk of costly legal battles. This heavy-handed approach would also harm third-party review sites like Yelp and TripAdvisor, which the law currently protects.

Other regulatory threats ahead may involve expanded antitrust enforcement, privacy regulations at both the state and federal levels, new data laws, and the revival of net neutrality rules. All of these make market solutions more difficult and less frequent. They exchange temporary market imperfections for permanent regulatory stagnation.

It would, however, be a mistake to assume that the future of speech online necessarily looks like current models. There is significant potential for innovators to disintermediate current big players using distributed technology solutions such as open-source self-hosted platforms (like Mastodon). Many such solutions use blockchain and cryptocurrency. Congress and regulators should make sure not to strangle these technologies in their infancy with heavy-handed regulation (such as treating cryptocurrency as a security). Innovators can use these technologies to provide media solutions that preserve the free exchange of ideas while giving users control over moderation.
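
To make that architectural point concrete, here is a minimal sketch, in Python, of user-controlled moderation on a federated network. Everything in it is hypothetical: the Post and ClientFeed names, the example servers, and the filter rules are illustrations, not the API of Mastodon or any real platform. The idea it shows is simply that when filtering happens in the user's own client, moderation becomes a per-user choice rather than a platform-wide mandate.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Post:
    author: str
    server: str  # the federated instance the post came from
    text: str

# A moderation rule is just a predicate: True means "show this post."
Rule = Callable[[Post], bool]

@dataclass
class ClientFeed:
    # Each user picks their own rules; no central authority applies them.
    rules: List[Rule] = field(default_factory=list)

    def add_rule(self, rule: Rule) -> None:
        self.rules.append(rule)

    def view(self, timeline: List[Post]) -> List[Post]:
        # A post appears only if every rule this user chose approves it.
        return [p for p in timeline if all(rule(p) for rule in self.rules)]

# Two users, one shared timeline, two different feeds.
timeline = [
    Post("alice", "serverA.example", "market update"),
    Post("bob", "serverB.example", "SPAM buy now"),
]

strict = ClientFeed()
strict.add_rule(lambda p: "SPAM" not in p.text)           # keyword filter
strict.add_rule(lambda p: p.server != "serverB.example")  # block an instance

permissive = ClientFeed()  # no rules: sees everything

print([p.author for p in strict.view(timeline)])      # ['alice']
print([p.author for p in permissive.view(timeline)])  # ['alice', 'bob']

The design choice matters for the policy argument: in this model there is no single moderation decision for regulators to mandate or forbid, because each user's client enforces only the rules that user selected.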

Unfettered market forces are far more capable of producing solutions and options than politicians and bureaucrats are. Let’s hope that point doesn’t get lost in the political anger of the moment.