My late colleague, Neil Postman, used to ask about any new proposal or technology, “What problem does it propose to solve?”
When it comes to Facebook, that problem was maintaining relationships over vast time and space. And the company has solved it, spectacularly. Along the way, as Postman would have predicted, it created many more problems.
Last week, Facebook revealed the leaders and first 20 members of its new review board. They are an august collection of some of the sharpest minds who have considered questions of free expression, human rights, and legal processes.
They represent a stratum of cosmopolitan intelligentsia quite well while offering some semblance of global diversity. These distinguished scholars, lawyers, and activists are charged with generating high-minded deliberation about what is fit and proper for Facebook to host. It’s a good look for Facebook, as long as no one looks too closely.
What problems does the new Facebook review board propose to solve?
In an op-ed in The New York Times, the board’s new leadership declared: “The oversight board will focus on the most challenging content issues for Facebook, including in areas such as hate speech, harassment, and protecting people’s safety and privacy. It will make final and binding decisions on whether specific content should be allowed or removed from Facebook and Instagram (which Facebook owns).”
Only in the narrowest and most trivial of ways does this board have any such power. The new Facebook review board will have no influence over anything that really matters in the world.
It will hear only individual appeals about specific content that the company has removed from the service, and only a fraction of those appeals. The board can’t say anything about the toxic content that Facebook allows and promotes on the site. It will have no authority over advertising or the massive surveillance that makes Facebook ads so valuable. It won’t curb disinformation campaigns or dangerous conspiracies. It has no influence on the sorts of harassment that regularly occur on Facebook or on (Facebook-owned) WhatsApp. It won’t dictate policy for Facebook Groups, where much of the most dangerous content thrives. And most important, the board will have no say over how the algorithms work, and thus over what gets amplified or muffled; the algorithms are the real power of Facebook.
This board has been hailed as a grand experiment in creative corporate governance. Kate Klonick, a law professor at St. John’s University and the scholar most familiar with the process that generated this board, said, “This is the first time a private transnational company has voluntarily assigned a part of its policies to an external body like this.”
That’s not exactly the case. Industry groups have long practiced such self-regulation through outside bodies, with infamously mixed results. But there is no industry group to set standards and rules for Facebook. One-third of humanity uses the platform regularly. No other company has ever come close to having that level of power and influence. Facebook is an industry, and thus an industry group, unto itself. What is unprecedented, though, is that Facebook ultimately controls the board, not the other way around.
We have seen this movie before. In the 1930s the Motion Picture Producers and Distributors of America, the forerunner of the Motion Picture Association of America (MPAA), under the leadership of former U.S. Postmaster General Will Hays, instituted a strict code that prohibited major Hollywood studios from showing, among other things, “dances which emphasize indecent movements.” The code also required that “the use of the [U.S.] flag shall be consistently respectful.” By the 1960s, American cultural mores had broadened and directors demanded more freedom to display sex and violence. So the MPAA abandoned the Hays Code and adopted a ratings system that evolved into the one familiar to American moviegoers today (G, PG, PG-13, R, NC-17).
One reason the MPAA moved from strict prohibitions to consumer warnings was that American courts had expanded First Amendment protection for films, limiting how local governments could censor them. But all along, the MPAA practiced an explicit form of self-regulation: a cartel that represented the interests of the most powerful studios, policed behavior, and represented the industry as a whole to regulators and the public.