The body created by Facebook to review content moderation decisions warned Thursday that user-generated fact-checks could harm people living under repression or conflict if they are introduced worldwide.
Facebook parent Meta announced last year that it would end its use of external fact-checkers in the US.
That scheme had employed third parties including AFP to expose misinformation.
Instead, Meta said it would ask ordinary users to verify controversial claims in a system known as "community notes", aping methods on X and other social networks.
If rolled out worldwide, that scheme "could... pose significant human rights risks and contribute to tangible harms," Meta's Oversight Board said in a Thursday advisory.
That was especially true in "repressive human rights regimes, in particular electoral contexts and in ongoing crisis and conflict situations," it added.
AFP was one of 23 organisations whose public comments were accepted by the Oversight Board as it prepared its advisory.
The independent board is often referred to as Instagram and WhatsApp owner Meta's "supreme court", ruling on moderation decisions and advising on policy.
Created and voted on by ordinary social media users, community fact-checks generally rely on independent journalism to back up their claims.
This is difficult or impossible in repressive regimes, the board noted.
During conflicts, some groups may be cut off from access and unable to weigh in with their side of the story, it added.
The board recommended that community notes should not be introduced where there is active fighting or widespread obstacles to getting online.
Free media and civil society are also needed for ordinary people to fact-check claims in the midst of elections.
Without them, "the program risks publishing misleading notes", the board said.
And in some parts of the world, "malicious actors have repeatedly demonstrated the ability to coordinate large numbers of accounts to promote deceptive information" and could do so via Meta's community notes, it added.
"This risk will become more acute as artificial intelligence facilitates the scaled creation and operation of these networks," the board warned, suggesting that Meta rule out countries with histories of disinformation campaigns.
Other factors to take into account included language barriers and political polarisation.
The board urged Meta to test for "risks related to contributor anonymity, coordinated disinformation campaigns and gaming of the system, language representation and contributor participation" before launching community notes in a country.
It should also grant outside researchers access to data on the scheme.