Facebook’s decision to refer the matter to the oversight board — which is intended to function as a “Supreme Court” reviewing the company’s most controversial content moderation decisions — reflects the polarizing opinions on how the world’s largest social network should handle former president Donald Trump. Facebook indefinitely suspended Trump’s accounts after his rhetoric incited a mob of his supporters to violently storm the U.S. Capitol. Trump’s accounts will remain suspended until the oversight board makes a decision.
“We believe our decision was necessary and right,” Nick Clegg, Facebook’s communications and policy chief, wrote in a blog post yesterday. “Given its significance, we think it is important for the board to review it and reach an independent judgment on whether it should be upheld.”
Facebook’s decision to outsource one of the toughest content moderation questions to the board is controversial.
The board — which is funded by Facebook — is a response to criticism that CEO Mark Zuckerberg and the company’s executives have too much power over what people see online. Punting the question of Trump’s account to the body ensures more experts are involved, but it also insulates the company’s top executives and engineers from some of the public relations fallout of a decision that is sure to generate strong backlash.
Facebook spent years developing a board of roughly 40 outside experts from around the world — ranging from a former prime minister to a Nobel Peace Prize winner. The outside panel has the power to review and overturn the company’s content decisions, but until now it was unclear whether Facebook would seek the board’s guidance on key issues influencing American politics. The company was widely criticized last year for saying the board would not be reviewing decisions during the U.S. presidential election.
Some experts saw Facebook’s move as a positive step, underscoring the board would have real power over content moderation at the company and Facebook would turn to its guidance for the very thorny questions it was established to address.
“Before the board stepped in, the bottom line on Trump’s account was simply that [Zuckerberg] would decide what to do,” wrote Evelyn Douek, a lecturer on Law and S.J.D. candidate at Harvard Law School, and affiliate at the Berkman Klein Center for Internet & Society in an article for Lawfare. “If you believe that ‘Mark decides’ is a bad governance model for the future of speech online — regardless of whether Mark occasionally happens to decide correctly — this referral is good news.”
But others were skeptical of Facebook’s move, seeing it as the company punting responsibility.
One of the company’s critics said it may ultimately allow Facebook to reinstate Trump, despite advocates’ calls to permanently suspend him from the platform as Twitter did.
“Given the court-like structure and quasi-legal approach of the Facebook oversight board, I would expect them to favor a reinstatement of Trump,” Roger McNamee, a former Facebook investor who has become a prominent critic, told my colleague Elizabeth Dwoskin in an interview. “That outcome would provide Facebook with the perfect cover to position the Trump ban as an appropriate response to a clear and present danger, not a policy to be applied broadly to world leaders.”
The board’s decision is so much bigger than just Trump’s account.
The oversight board launched last year to make the final calls on some of the social network’s most controversial content moderation decisions. And the way it handles the Trump dilemma could have implications for how future decisions are made at Facebook over political leaders’ accounts. It also could impact how much the social network takes real-world events occurring off its platform into account when making decisions.
“The platform’s decision to ban Trump was more a result of the events at the Capitol and surrounding political context than anything the president had posted directly prior to the suspension,” Douek wrote. “Taking context into account is the only way to effectively evaluate speech — but how should Facebook assess that context, and when does context require suspending an account whose posts on the platform itself might technically remain within the rules?”
It also could influence how Facebook handles the accounts of other political leaders, though the board’s policy recommendations are not binding.
The new board is a major experiment in content moderation.
It could have an impact on how other companies create systems to determine what speech stays on their platforms, and the options users have to appeal those decisions. It also could influence regulatory debates in Washington and abroad about the future of content moderation.
The board will have 90 days to review the decision, and Trump will have the opportunity to submit a statement about how the company handled the suspension.
Our top tabs
A top lawmaker has asked the FBI to probe Parler’s role in the Jan. 6 riot at the Capitol.
House Oversight Committee Chairwoman Carolyn B. Maloney has asked FBI Director Christopher A. Wray to investigate the Capitol riot and the role of the alternative social media network Parler, Tom Hamburger and Craig Timberg report. The committee, according to Maloney (D-N.Y.), will begin formally investigating Parler and similar sites.
“I am going to get to the bottom of who owns and funds social media platforms like Parler that condone and create violence,” she told our colleagues in an interview.
In a statement, Parler Chief Operating Officer Jeffrey Wernick said “like other social-media platforms, we have been cooperating and will continue to cooperate with law-enforcement efforts to identify and prosecute those individuals responsible for organizing and carrying out the shameless Jan. 6 attack on the Capitol. Parler welcomes Rep. Maloney’s call to have the Federal Bureau of Investigation conduct a robust examination of our policies and actions.”
Meanwhile, a federal judge dealt Parler a legal setback when she denied the company’s request for a preliminary injunction against Amazon, its former web services provider. (Amazon founder and chief executive Jeff Bezos owns The Washington Post.)
“To be clear, the Court is not dismissing Parler’s substantive underlying claims at this time,” wrote Judge Barbara Jacobs Rothstein. “Parler has fallen far short, however, of demonstrating, as it must, that it has raised serious questions going to the merits of its claims, or that the balance of hardships tips sharply in its favor.”
Biden named Rebecca Kelly Slaughter acting chairwoman of the FTC, and Jessica Rosenworcel acting chairwoman of the FCC.
The moves at two of the country’s top technology regulators signal the political shift underway in Washington, as Democrats intend to undo a series of Trump’s deregulatory moves, my colleague Tony Romm reports. But Slaughter and Rosenworcel will still face early obstacles because new vacancies at the Federal Trade Commission and Federal Communications Commission may leave each agency deadlocked at two Democrats and two Republicans.
Slaughter has served as Democratic commissioner at the FTC since 2018. She inherits the agency’s lawsuit against Facebook, which alleges the tech giant broke federal antitrust laws. Slaughter has called for tougher enforcement on Silicon Valley and swifter action from the government watchdog so that other tech companies are deterred from committing privacy or competition missteps in the future.
“The threats to consumer privacy are growing. They impact our most vulnerable citizens more than most, and they demand new solutions,” Slaughter said in a September 2019 speech that illustrated her views about the agency’s responsibility to penalize wrongdoers. “My hope is that the ‘near future’ brings renewed action on this front across the board, from the FTC, Congress, advocates and industry, and I feel both humbled and privileged to get to take part in this effort.”
Rosenworcel has served as a Democratic commissioner at the FCC for the past eight years. She is a fervent supporter of net neutrality and has called for swift action to address the digital divide.
“If you want evidence this is not right, it’s all around us,” Rosenworcel said earlier this month. “There are people sitting in parking lots using free Wi-Fi signals because they have no other way to get online. There are students who fall in the homework gap because they lack the high-speed service they need to participate in remote learning.”
Facebook and Google ran advertisements of merchandise for extremist groups, despite bans.
Merchandise associated with the Three Percenters, a far-right group, was advertised on the two platforms, Jeremy B. Merrill reports. Spokespeople for the two companies told The Markup that the ads violated their rules and said they had been removed. But the ads raise questions about the companies’ methods for finding rule-violating ads. One Google ad appeared as recently as Jan. 8, two days after the Capitol riot in Washington where Three Percenters patches were spotted.
“This is a perfect example of the ways in which large technology companies are directly enabling violent extremist organizations,” Senate Intelligence Committee Chairman Mark R. Warner (D-Va.) told the outlet, while noting that the issue “has received insufficient attention from policymakers.”
Daybook
– Jan. 26: The libertarian Cato Institute hosts a discussion on Section 230.
– Jan. 26-27: Sen. Amy Klobuchar (D-Minn.), Rep. Anna G. Eshoo (D-Calif.) and Rep. John Katko (R-N.Y.) speak at State of the Net’s annual Internet policy conference.
– Jan. 27: Bill and Melinda Gates discuss the coronavirus pandemic.
Before you log off
Stephen Colbert has a new drama to keep us entertained during the pandemic: