Monday, October 19, 2020

Facebook's Dilemma


This week's New Yorker carried an article by Andrew Marantz whose main thrust was that Facebook is not doing a good job of moderating its content.  The result is that all sorts of people and groups who, in the view of many experts the reporter interviewed, should not be able to use the electronic megaphone of Facebook are allowed to do so.  The list of such offenders is long:  the white-nationalist group Britain First; Jair Bolsonaro, "an autocratic Brazilian politician"; and of course, the President of the United States, Donald Trump. 


Facebook has an estimated 15,000 content moderators working around the world, constantly monitoring what its users post and taking down material that violates what the company calls its Implementation Standards.  Some decisions are easy:  you aren't allowed to post a picture of a baby smoking a cigarette, for example.  But others are harder, especially when the people doing the posting are prominent figures who are likely to generate lots of eye-time and thus advertising revenue for the company. 


The key to the dilemma that Facebook faces was expressed by former content moderator Chris Gray, who wrote a long memo to Facebook CEO Mark Zuckerberg shortly after leaving the company.  He accused Facebook of not being committed to content moderation and said, "There is no leadership, no clear moral compass."


Technology has allowed Facebook to achieve what in principle looks like a very good thing:  in the words of its stated mission, "bring the world closer together."  Unfortunately, when you get closer to some people, you wish you hadn't.  And while Zuckerberg is an unquestioned genius when it comes to extracting billions from a basically simple idea, he and his firm sometimes seem to have an oddly immature notion of human nature.


Author Marantz thinks that Facebook has never had a principled concern about the problem of dangerous content.  Instead, what motivates Facebook to take down posts is not the content itself, but bad publicity about the content.  And indeed, this hypothesis seems to fit the data pretty well.  Although the wacko-extremist movement billing itself as QAnon has been in the news for months, Facebook allowed it a presence until only last week, when public pressure on the company mounted to an apparently intolerable level. 


Facebook is a global company operating in a bewildering number of cultures, languages, and legal environments.  It may be instructive to imagine a pair of extreme alternatives that Facebook might pursue instead of its present muddle of Implementation Standards, which makes nobody happy, including the people it bans. 


One alternative is to proclaim itself a common carrier, open to absolutely any content whatsoever, and attempt to hide behind the shelter of Section 230 of the Communications Decency Act of 1996.  That act gives social-media companies fairly broad protection from being held liable for what their users post.  If you had a complaint about what you saw on Facebook under this regime, Facebook would tell you to go sue the person who posted it. 


The problem with this approach is that, unlike a true common carrier such as the old Ma Bell, which couldn't be sued for what people happened to say over its telephone network, Facebook makes more money from postings that attract more attention, whether that attention is directed at something helpful or something harmful.  So no matter how hard the company tried to say it wasn't its problem, the world would know that by allowing neo-Nazis, pornographers, QAnon clones, terrorists, and whatever other horrors would come flocking onto an unmoderated Facebook, it would be profiting thereby.  It is impossible to keep one's publicity skirts clean in such a circumstance.


The other extreme Facebook could try is to drop the pretense of being a common carrier altogether and start acting like an old-fashioned newspaper, or probably more like thousands of newspapers.  A twentieth-century newspaper had character:  you knew pretty much what kinds of stuff you would see in it, what point of view it took on a variety of questions, and what range of material to expect in both the editorial and the advertising sections.  If you didn't like the character one paper presented, you could always buy a competing paper, since up to the 1960s at least, most major metropolitan areas in the U. S. supported at least two dailies. 


The closest thing the old-fashioned newspaper had to what is now Facebook was the letters-to-the-editor section.  Nobody had a "right" to have their letter published.  You sent your letter in, and if the editors decided it was worth publishing, they ran it.  But it was carefully selected for content and mass appeal.  And not just anything got in.


Wait a minute, you say.  Where in the world would Facebook get the tens of thousands of editors they'd need to pass on absolutely everything that gets published?  Well, I can't answer all your questions, but I will present one exhibit as an example:  Wikipedia.  Here is a high-quality, dynamically updated encyclopedia with almost no infrastructure, subsisting on the work of thousands of volunteers.  No, it doesn't make money, but that's not the point.  My point is only that instead of paying a few thousand contract workers to subject themselves to the psychological tortures of the damned in culling out what Zuckerberg doesn't want to show up, Facebook could go at it from the other end. 


Start by saying that nothing gets posted on Facebook unless one of our editors has passed judgment on it.  When the nutcases and terrorists of the world see their chances of posting reliably dwindle to zero, they'll find some other Internet-based way to cause trouble, never fear.  But Zuckerberg will be able to sleep at night knowing that instead of paying thousands of people to pull weeds all the time, he's started with a nice sterile garden and can plant only the flowers and vegetables he wants to.  And he'd still be able to make money.


The basic problem Facebook faces is that it is trying to be moral with close to zero consensus on what moral means.  At least if the company were divided up into lots of little domains, each with its own clearly stated and enforced standards, you would know more or less what to expect when you logged into it, or rather, them. 


Sources:  The article "Explicit Content" by Andrew Marantz appeared on pp. 20-27 of the Oct. 19, 2020 issue of The New Yorker.
