Facebook Leaks Are Hardly Newsworthy


The frenzied media coverage of Facebook document leaks seems to confuse hosting disagreeable content with the platform being the cause of humanity’s ills. Revelations that Facebook employees worry about the company’s products, that the company is aware of those products’ downsides, and that it hasn’t solved every problem aren’t terribly newsworthy.

Even the most casual student of history can attest that humans had already stumbled on radicalization, calls for ethnic cleansing, racist language, misinformation, and hateful depictions of violence well before the rise of “TheFacebook” in 2004. Has the social media site, on balance, really made all of those problems worse? How does that balance against the platform’s role in facilitating millions of people expressing themselves, the unprecedented flow of useful information, and the countless friendships made among these same fallen humans? As with all forms of technology, the question isn’t whether there are good and bad aspects of the rise of social media, but who is ultimately responsible for improving the ratio.

For those who value liberty and individual rights, that answer cannot be a government regulatory one. The First Amendment’s explicit prohibition of government regulation of speech is one of the very foundations of this country. A quick survey of government abuses throughout history and around the world today confirms the timeless wisdom of not letting the fox guard the henhouse.

So to what extent is Facebook responsible for preventing the darker side of its 2.89 billion human users from reaching the platform? That scale is daunting and makes patrolling the platform perfectly an impossibility. And then there are the difficulties of deciding what is harmful and what is important to chronicle, what is opinion and what is misinformation, or what is humorous and what is hate speech. In the same way that Netflix is struggling with Dave Chappelle and questions about balancing profits, artistic liberty, and internal employee values, Facebook faces that conundrum countless times a day, with 510,000 comments and 136,000 photos uploaded to the platform every 60 seconds. Content moderation at scale is hard. Really hard.

If few are surprised to learn that Facebook is aware of these challenges, it might be news to many that the company spends so much time, energy, and money studying the problems. But that’s not a bad thing. It isn’t morally different from the auto industry crash-testing cars to reduce the number of drivers killed in accidents every year. Trying to solve problems and improve its products is one way a company succeeds and innovates; it’s one of the benefits of a market system.

The media coverage has highlighted emails from individual employees as evidence of internal strife at the company. But it’s reasonable that employees within Facebook would have varying levels of concern about the problems amplified on the platform. That’s to be expected within a company of more than 60,000 employees, each tasked with focusing on different aspects of the business. Robust discussion and passionate opinions suggest concern, not scandal.

And Facebook considering profits in its calculations is hardly scandalous. A firm’s primary duty is to deliver profits to its shareholders. Facebook does not exist as a charitable entity; profits are its primary signal that what it’s doing is or is not working. Value to shareholders is a social benefit in and of itself. To relegate profits below other goals puts those benefits, the survival of the company, and those whose livelihoods depend on it at risk.

The media cycle has revealed that Facebook has problems, works hard to solve them, doesn’t always succeed, and takes making money seriously. That sounds a lot like every other major corporation in America. The real difference is that Facebook is one of a handful of companies in the regulatory crosshairs of both political parties at both the state and federal levels. Regulation that stifles Facebook’s ability to innovate its way out of problems is the real danger to consumers online.