Facebook is building an oversight board. Can that fix its problems?

25 June 2019

On a recent Wednesday afternoon in late May, roughly 30 Facebook Inc. employees gathered at the company’s Menlo Park, California, headquarters to talk about sexual harassment.

The group was there to consider a single, controversial Facebook post: an unsubstantiated list of more than 70 academics accused of predatory behavior, which also encouraged people to submit more “sexual harassers” to the list. The Facebook employees were asked to decide: Should the post remain up?

In reality, the group had no authority to determine the post’s fate; that had been settled years earlier by Facebook’s content moderators, who chose to leave it up. The employees were instead gathered for a role-playing exercise, the latest in a series of simulations Facebook is running globally on its way to creating a new Content Oversight Board that will review controversial decisions made by the company’s content moderators. If someone believes their post was removed in error, or the general public takes issue with a post that was allowed to remain, the board may step in and provide a final ruling. The list of creepy academics is the kind of post the board may one day review.

For more than two hours, the group grappled with the list, taking notes on floor-to-ceiling whiteboards. Were the allegations credible? How many people saw the post? How many people reported it? What did Facebook’s content policies stipulate?

One employee posed a question to the group right before they adjourned. “These are evolving situations, right?” said the employee, whom Bloomberg agreed to keep anonymous as part of observing the session. “[Pretend] one week later, two weeks later, someone on that list commits suicide. A week later another person commits suicide. Do we take it down? Do we say, no, we decided to keep it up?” In the end, the group voted overwhelmingly that the list should remain up, 22 votes in favor to 4 against, though few employees seemed fully confident in their decision.

In a world where Facebook is deemed much too powerful, and where the company is constantly criticized by some for taking down too much and by others for taking down too little, the new Oversight Board represents a potential solution to one of Facebook’s thorniest problems: its control over global speech. This new board, which doesn’t yet exist, will make content decisions for a global network of 2.4 billion people, making it a de facto Free Speech Supreme Court for one of the biggest communities on the internet.

It undoubtedly comes with challenges. The board’s independence will almost certainly be an issue, and it’s unlikely the board will move at the speed necessary to keep up with the internet’s viral tendencies. But Facebook is on an elaborate listening tour in hopes of turning this Supreme Court vision into a reality that people can trust.

The idea for Facebook’s Supreme Court originated with Noah Feldman, an author and Harvard law professor who pitched the concept of a “Supreme Court of Facebook” to Chief Operating Officer Sheryl Sandberg in January 2018. (Feldman is also a columnist for Bloomberg Opinion.) Feldman’s pitch outlined the need for an independent, transparent committee to help regulate the company’s content decisions. It was passed along to Chief Executive Officer Mark Zuckerberg, and Facebook ultimately hired Feldman to write a white paper about the idea and stay on as an adviser. The idea was first floated publicly on a podcast Zuckerberg did with Vox’s Ezra Klein, where he mentioned an independent appeals process “almost like a Supreme Court.”

It’s been more than a year since that podcast, and more than seven months since Zuckerberg formally announced plans to build an Oversight Board, yet the company is still trying to settle on its fundamental structure. Basic questions, such as how many members the board should have, how those members should be picked, and how many posts it will review, remain unresolved.

Facebook’s tentative plan is outlined in a draft charter. The company will create a global 40-person board made up of people appointed by Facebook. It’s unclear how many content cases the board will review, though Facebook envisions each case being reviewed by three to five members. Once a decision is made, it’s final; the ruling board members will then write a public explanation and could even suggest that Facebook tweak its policies.

About the only thing that has been decided is that the board should be independent. Critics have slammed Facebook for having too much control over what people are allowed to share online. For years, conservative politicians and media personalities have accused the company of bias against conservative ideas and opinions. Facebook co-founder Chris Hughes criticized Zuckerberg’s power in a recent New York Times op-ed, saying that it was “unprecedented and un-American.”

The board is intended to take on some of that power. Zuckerberg has promised these decision makers will be free of influence by Facebook and its leaders, though getting to true independence will be the company’s first big challenge.

“It’s all well and good for people on the outside to kind of prescribe that, yeah, Facebook needs to cede some of its power to outsiders,” said Nate Persily, a Stanford law professor and expert in election law. “But when you start unpacking how to do that, it becomes extremely complicated very fast.”

Persily has already seen a version of the board come together. At Stanford, he just completed a two-month course with a dozen law school students who created their own version of the Facebook Oversight Board. The class presented its findings to Facebook employees at the end of May, suggesting that the board be much larger than the 40 part-time members Facebook outlined in its draft charter.

“If they’re going to do any reasonable slice of the cases that are going to go through the appeals process, it’s going to have to be much larger or it’s going to have to be full-time,” Persily said.

These kinds of suggestions are why Facebook says it has been running simulations with academics, researchers and employees all over the world. Each serves as an elaborate survey. Since the start of the year, Facebook has hosted board simulations in Nairobi, Mexico City, Delhi, New York City, and Singapore.

It has also opened the process to public feedback. During a recent open comment period, Facebook received more than 1,200 proposals from outside individuals and organizations with recommendations on what the company should build. Responses came from established groups like the media advocacy organization Free Press, as well as from concerned individuals in Argentina, France and Israel. Others, like the Electronic Frontier Foundation and the Bonavero Institute of Human Rights at Oxford, provided Facebook with input through their own papers and blog posts.

The Bonavero Institute summarized its suggestions in a 13-page report, which included everything from different ways Facebook could pick cases for the board to review, to recommendations on how the board should be compensated. Both the EFF and the Bonavero Institute hammered home the importance of keeping the board independent.

“But our biggest concern is that social media councils will end up either legitimating a profoundly broken system (while doing too little to fix it) or becoming a kind of global speech police, setting standards for what is and is not allowed online whether or not that content is legal,” Corynne McSherry, EFF’s legal director, wrote on its blog. “We are hard-pressed to decide which is worse.”

Facebook is expected to publicly release a new report with findings from its simulations later this week.

Achieving real independence will be tricky, given that Facebook plans to appoint the initial board members, who will serve three-year terms. It will also pay them, though through a trust. The plan is then for the board to self-select its replacement members as terms expire. The idea is that, while Facebook may appoint the initial group, future generations of the board will be free of Facebook’s influence.

“It isn’t just the people who we’re picking, but the process in which we’re picking them,” said McKenzie Thomas, a Facebook program manager helping lead the Oversight Board project. She emphasized the importance of having the board self-select its own replacement members as a key element of its independence. “This is a starting off point,” she added.

Then there’s the speed problem. It’s unrealistic to expect that the board’s decisions will happen with the speed necessary to police the internet. That means the board will likely serve more as a post-mortem – a way to review decisions that have already been made, and if needed, issue a ruling that could impact how future posts are handled by moderators.

It won’t, however, be a very efficient way to police Facebook in the moment, which is when content can usually cause the most damage. Facebook’s virality can mean that troubling content reaches millions of people in a matter of hours, if not minutes. The board won’t be needed for decisions on extreme violence, like the video the shooter in New Zealand livestreamed of his killing spree; Facebook already has strict policies in place for that kind of material. But borderline content, such as a post that may be hate speech or merely a strong opinion, could remain up for weeks until the board gets to it.

“One of the things we need to figure out is…what is a version of a more urgent [board] session?” said Brent Harris, director of governance and global affairs at Facebook. “Does that make sense, and what does that look like?”

Kate Klonick, a professor at St. John’s Law School, has written extensively about free speech, including an op-ed about Facebook’s oversight board in the New York Times. She is observing the board’s creation for a law journal article she’s writing and has already spent a week embedded with the company.

The board, she says, may not move quickly enough to solve all of Facebook’s content problems, but at least it should provide an outside voice so that Facebook alone isn’t responsible for free speech rules online. “I see this [oversight board] as a solution for maybe that problem,” she added, “and unfortunately, not for the problem of the outrage machine.”

In a best-case scenario, Klonick thinks Facebook’s oversight board could inspire similar organizations at other private companies. But she’s also prepared for an alternative outcome.

“Part of me is terrified [and] totally not delusional to the fact that...there’s just a really big chance that this just flops,” she said.

 

Bloomberg

