Friday, February 22, 2019

Why Facebook's effort to fight hate speech with external experts won't work


Facebook's new effort to bring outside experts into its content review process promises to be complicated and possibly contentious, if discussions this week at a meeting in Singapore are any indication.

Over the course of two days, 38 academics, non-profit officials and others from 15 Asian countries who were invited to a Facebook workshop wrestled with how a proposed "external oversight board" for content decisions might function.

The gathering, the first of a half-dozen planned for cities around the world, produced one clear recommendation: the new board must be empowered to weigh in not only on specific cases but also on the policies and processes behind them.


Facebook has long faced criticism for doing too little to block hate speech, incitements to violence, bullying and other types of content that violate its "community standards."
In Myanmar, for example, Facebook for years took little action while the platform was used to encourage violence against the Rohingya minority.

But the company also draws fire for not doing enough to defend free speech. Activists accuse the company of taking down posts and blocking accounts for political or business reasons, an allegation it denies.

Facebook CEO Mark Zuckerberg unveiled the idea of an independent oversight board last November and a draft charter was released in January.

"We want to find a way to strengthen due process and procedural fairness," Brent Harris, director of global affairs and governance at Facebook, said at the opening of the Singapore meeting. A Reuters reporter was invited to observe the proceedings on the condition that the names of participants and some details of the discussions not be disclosed.

Facebook's initial plan calls for a 40-person board that would function as a court of appeal on content decisions, with the power to issue binding rulings on specific cases.
But as attendees peppered Facebook officials with questions and worked through issues such as how the board would be chosen and how it would select cases, they repeatedly came back to questions of policy. Rulings on individual postings would mean little if they were not linked to the underlying content review procedures, many attendees said.

Hate speech policies were a big focus of discussion. Many attendees said they felt Facebook was often too lax and blind to local circumstances, but the company has held firm to the concept of a single set of global standards and a deliberate bias towards leaving content on the site.
