Facebook has said it will go through an independent third-party audit to validate the numbers and metrics it publishes in its Community Standards Enforcement Report.
The company is issuing a Request for Proposal (RFP) to external auditors to conduct an independent audit of these metrics.
"We hope to conduct
this audit starting in 2021 and have the auditors publish their assessments
once completed," said Vishwanath Sarang, Technical Programme Manager,
Integrity.
Facebook said it has been working internally with auditors to assess how the metrics it reports can be audited most effectively.
"Transparency is only
helpful if the information we share is useful and accurate. In the context of
the Community Standards Enforcement Report, that means the metrics we report
are based on sound methodology and accurately reflect what's happening on our platform,"
Sarang said in a blog post on Tuesday.
"We want people to be
confident that the numbers we report around harmful content are accurate, so we
will undergo an independent, third-party audit, starting in 2021, to validate
the numbers we publish in our Community Standards Enforcement Report,"
added Guy Rosen, VP Integrity at Facebook.
Over a year ago, Facebook worked with international experts in measurement, statistics, law, economics and governance to provide an independent, public assessment of whether the metrics it shares in the enforcement report provide accurate and useful measures of Facebook's content moderation challenges.