Facebook’s Oversight Board upheld the social network’s decision to ban former president Trump four months after the Capitol riot



Long-awaited ruling has implications for how social media companies govern speech by public officials and other powerful people

Facebook CEO Mark Zuckerberg testifies before a House Financial Services Committee hearing on Capitol Hill in Washington in 2019. (Andrew Harnik/AP)
By Elizabeth Dwoskin and Cat Zakrzewski (TWP)

Facebook’s Oversight Board on Wednesday upheld the social network’s decision to ban former president Trump four months after the Capitol riot, but also gave Facebook six months to review the decision.

Facebook banned Trump indefinitely following the Jan. 6 attack on the U.S. Capitol, citing posts that it said encouraged violence. The binding ruling by the 20-member Oversight Board, which is largely independent and funded by the social network, could set the stage for a new political era online, reshaping the way speech by public officials and other powerful people is moderated by social media companies.

“The Board has upheld Facebook’s decision on January 7, 2021, to restrict then-President Donald Trump’s access to posting content on his Facebook page and Instagram account,” the board wrote. “However, it was not appropriate for Facebook to impose an ‘indefinite’ suspension. It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.”

Critics have argued that Facebook should have banned Trump at different points throughout his presidency, saying that his inflammatory language and frequent promotion of misinformation — about the coronavirus in particular — constituted an abuse of his office and of Facebook’s own community standards. But chief executive Mark Zuckerberg felt strongly that politicians should be given wide latitude because their speech was in the public interest.

The last straw came on Jan. 6, when Trump’s comments on Twitter appeared to encourage the Capitol insurrection, Zuckerberg said, and the company said it would suspend him indefinitely.

Facebook referred its decision about Trump to the Oversight Board shortly afterward. The board, which is less than a year old and had yet to decide a case at the time, was first conceived by Zuckerberg as a way to outsource the thorniest content moderation decisions without having the government intervene.

Over the past few months, members spanning time zones from Taiwan to San Francisco connected on videoconference calls to pore over more than 9,000 public comments on the matter, including from Trump himself, according to people familiar with the board who spoke on the condition of anonymity to discuss the process.

The much-anticipated decision by the Oversight Board is an experiment in the policing of political speech online — and the first major test of a new system Facebook put in place to essentially front-run the possibility of the government stepping in to play that role, experts say.

“This doesn’t begin and end with Donald Trump,” said Nathaniel Persily, a Stanford Law School professor. “They’ve got all kinds of elections coming up around the world.”

If the board is viewed as a success, it could also become a template for new laws governing social media companies, said Rep. Ro Khanna (D-Calif.). He said that Congress should consider requiring social media companies of a certain size to have their own independent board focused on these decisions.

“You could see Congress requiring that kind of regulation in social media companies, recognizing that there is a public dimension to the digital town halls that they’ve created,” he said.

Under U.S. law, social media platforms are not held legally responsible for policing unwanted or even much illegal content on their services, with some exceptions for copyright issues and child pornography. But in recent years, Silicon Valley has dealt with a series of crises over enabling disinformation and the spread of extremism from both domestic and international forces, and the blowback has forced the companies to invest significantly in content moderation. That investment picked up in 2020, when the companies launched stronger policies aimed at combating misinformation surrounding the election and the coronavirus.

The crises have also led to new regulatory scrutiny around the world — especially in Washington, where Democrats have promised to use their new powers to update existing antitrust laws, crack down on misinformation and pass federal privacy legislation. The Oversight Board’s decision comes as Facebook is the target of a landmark Federal Trade Commission lawsuit, which focuses on the company’s practice of buying up rivals.

Zuckerberg and Facebook executives publicly floated the idea of creating the Oversight Board in 2018 as lawmakers around the world mulled new ways to regulate Facebook. The company faced broad criticism that it lacked accountability for content decisions that had wide-reaching social consequences, and that there were no checks on Zuckerberg’s power to determine what could be said on a service that had become the public square for billions of people.

“You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” Zuckerberg told Vox in a 2018 interview.

Facebook then embarked on a months-long process collecting feedback on how to design the board and consulting more than 2,000 people in 88 countries. It released the rules and selected its first members in 2020. The board was a lightning rod for controversy during its formation, as Facebook’s critics warned its authority was too limited and that the company’s role in picking board members compromised its independence.

The board issued its first decisions in late January, a week after Facebook announced it would refer the high-profile Trump case. The initial round of decisions — which touched on alleged hate speech, coronavirus misinformation and references to dangerous organizations — signaled that the board would demand greater clarity from Facebook about its policies, as well as transparency. Before Wednesday’s decision, the board had overturned Facebook’s decisions six times, upheld them twice, and was unable to complete a ruling once.
