Facebook and Instagram users could ask the board to review their cases.
Facebook on Tuesday unveiled more details about the likely workings of a new independent board that'll oversee content-moderation decisions, outlining the appeals process users would follow to request an additional review of takedowns. Users of Facebook and its Instagram photo service can ask the board to review their case after appealing to the social network first. You'll have 15 days to fill out a form on the board's website after Facebook's decision.
Requesting a review by the board doesn’t mean the body will automatically hear your case. You’ll have to explain why you think Facebook made the wrong call and why the board should weigh in. You’ll also have to spell out why you posted the content and explain why Facebook’s decision could affect other users.
Facebook will also get to refer cases to the board. Users will receive a notice explaining whether the board decided to review a given case. The company explained this process in proposed bylaws that still need to be approved by the board.
The creation of the independent content-moderation board could help clarify how Facebook decides which posts it leaves up or pulls down, and could lead to policy changes. The additional appeals process might also help Facebook fend off critics, some of whom have alleged the company censors certain users and groups. Facebook also faces criticism for allowing politicians to spread false information.
“This board we’re creating is a means to hold ourselves accountable and provide for oversight on whether or not we’re making decisions that are principled according to the set of standards and values that we’ve set out,” Brent Harris, Facebook’s director of governance and global affairs, said during a conference call Tuesday.
The social network has rules barring certain content, including hate speech, nudity and human trafficking. But users sometimes disagree with how those rules are applied. Facebook has reversed some of its decisions in the past, especially amid public scrutiny. In 2016, the company removed an iconic Vietnam War photo of a girl fleeing a napalm attack because it violated the social network's rules on nudity. It reinstated the image amid an outcry, citing the photo's historical importance.
Facebook users typically receive a notification that includes an option to appeal when the company removes their content. If the appeal isn't successful, users will now be able to ask the new board to review their case. Facebook will be required to reinstate removed content if the board sides with the user.
Users who submit an appeal will receive a reference identification number if their content is eligible for review by the board. Eligible content includes Facebook and Instagram posts, videos, photos and comments that the company took down. The process will eventually be expanded to groups, ads, events and other content, including information rated “false” by fact-checkers and content left up on the platform. Facebook didn’t specify when this would happen.
Facebook expects the board to make a decision and for the company to take action on the ruling in roughly 90 days.
Harris said he expects the board to initially review dozens of cases every year but noted that the decisions could affect Facebook's 2.4 billion users, especially if the social network ends up changing its policies.
Users will also be able to choose whether they want details that could identify them included in the board's final decision. On Tuesday, Ranking Digital Rights, a nonprofit that promotes freedom of expression and privacy, called on Facebook to provide more clarity, including on how it'll protect the privacy of users who don't consent to releasing identifiable information. The board's decisions will be published on its website if approved for release.
Fay Johnson, a Facebook product manager who focuses on transparency and oversight, said the company is trying to make it clear to users that the board’s decisions will be public. “There really will be a value added to what the board speaks to, even if the specific information about the person who’s posting the content is not included in the draft decision,” she said.
Facebook also named Thomas Hughes, former executive director at Article 19, a nonprofit focused on freedom of expression and digital rights, to lead the board’s administrative staff. When Hughes led Article 19 in 2018, he called on Facebook to be more transparent about the content it removed and improve the appeals process for users.
"This is, it goes without saying, an enormous undertaking and it will take us a few months before we are ready," Hughes said.
The board is expected to be made up of 40 members and will likely start hearing cases this summer. Facebook announced its plans to create a content oversight board in 2018.