Facebook on Tuesday unveiled its blueprint for an independent oversight board to review the company's decisions about the posts, photos and videos it takes down or leaves online, responding to a wave of criticism that inconsistent policies have undermined the platform.
The roughly 40-person panel is supposed to function as the social media giant's version of a "Supreme Court," serving as the final word for Facebook users who want to appeal the company's moderation decisions. It will also offer recommendations for how the tech giant should tackle problematic content in the future.
"We are responsible for enforcing our policies every day and we make millions of content decisions every week," chief executive Mark Zuckerberg said in a post. "But ultimately I don't believe private companies like ours should be making so many important decisions about speech on our own."
By shifting responsibility away from top executives and engineers, however, Facebook may be able to distance itself from the criticism it has faced over its decisions, which have fueled calls for government regulation.
While the Oversight Board detailed Tuesday is still in its early stages, its ultimate test will be its ability to navigate and interpret Facebook's thicket of rules to reach decisions in real time - while revealing more about how Facebook actually arrives at its conclusions about content in the first place. That means locating the ever-elusive line between free expression and harmful speech and serving as a court of sorts for a global community with differing needs and varying visions of what the Web should look like.
The release of the charter comes a day before Facebook will join its peers from Google and Twitter at a hearing on Capitol Hill to probe how social media contributes to real-world violence. For years, lawmakers have pressured Silicon Valley to take a more proactive role to stop the spread of white supremacy, detect violent threats in real time and combat falsehoods, including manipulated online video - and Facebook announced Tuesday it had tightened its rules and tools to spot and remove hate speech.
Facebook has long maintained detailed policies to combat and remove harmful speech, including attacks on the basis of race or religion, terrorist propaganda and disinformation. But the company often has struggled to implement and enforce such rules uniformly, resulting at times in the viral spread of harmful content - or accusations that its executives and engineers are biased.
To help better navigate these and other political pressures globally, Zuckerberg in November first sketched out his vision for an "independent body" that would serve as a check on the human reviewers and artificial-intelligence tools that vet the posts uploaded by its community of 2.2 billion users.
The charter released Tuesday outlines new oversight at Facebook meant to address allegations of unfairness on a global scale. The company aims to have a board of "likely" 40 members representing different regions of the world, each serving a three-year term. Facebook said Tuesday it intends to select a few members to start. Those members will then choose the remaining members, all of whom will be overseen by an independent trust Facebook plans to establish to handle logistical matters such as the budget.
The roster of members has not yet been announced. But it is likely to be one of the most closely watched elements of the entire endeavor, given the difficulty of establishing a truly globally representative body, said Kate Klonick, a fellow at The Information Society Project at Yale Law School who's studying Facebook's work.
"How do you pick one person from the U.S. who represents all of the U.S., and should there be one person from the U.S.?" asked Klonick. Still, she praised Facebook for a "massive commitment of resources" toward trying to figure it out.
Users who disagree with Facebook's content decisions can appeal to the company, and if they still don't like the resolution, may then appeal to the board. Facebook has committed that board decisions are binding and will be implemented quickly. The company said it'll also send cases to the board for automatic, expedited review if there's a potential for "urgent real world consequences," according to the charter released Tuesday. It expects to take its first cases from users in 2020.
Facebook's new review board will have the ability to recommend broader changes beyond the content it has been asked to study, such as additional content takedowns or new business practices. But it'll be up to the social-networking company to decide if and how it will implement them - though Facebook pledged to publish its detailed reasoning about the decisions it makes.
"This charter brings us another step closer to establishing the board, but there is a lot of work still ahead," Zuckerberg said. "We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope and potentially include more companies across the industry as well."
The stakes grew after the deadly massacre in Christchurch, New Zealand, when some users uploaded videos of the attacks in ways that evaded tech companies' censors. Others in the U.S. faulted Facebook earlier this year for refusing to take down a manipulated video of House Speaker Nancy Pelosi that made it appear she was drunk.
Globally, regulators have issued an ultimatum to Facebook, Google and Twitter, threatening to hold the companies directly liable for the decisions they make and the content they allow online unless they improve their platforms. At the same time, though, the companies have also faced immense criticism for flagging and removing content they should not have. In the United States, Silicon Valley has faced additional attacks from Republicans, who claim their policies result in the suppression of conservatives' speech online.