Meta Platforms' independent Oversight Board said on Thursday that Facebook should not have taken down a newspaper report about the Taliban that it considered positive, backing users' freedom of expression and saying the tech company relied too heavily on automated moderation.
Meta found that the post, which reported the Taliban's announcement that schools and colleges for women and girls in Afghanistan would reopen in March, violated Facebook's policies because it "praised" entities deemed to "engage in serious offline harms."
The company limited the newspaper’s access to certain Facebook features after taking down the post.
The newspaper appealed the decision, after which the post was referred to a special moderation queue; however, it was never reviewed, according to the Oversight Board.
The Oversight Board said Meta's decision to remove the post was inconsistent with Facebook's policies, which allow reporting on such organizations, and the company reversed its decision after the board selected the case.
“The Board found that Meta should better protect users’ freedom of expression when it comes to reporting on terrorist regimes,” the Oversight Board said.
“By using automated systems to remove content, Media Matching Service banks can amplify the impact of incorrect decisions by individual human reviewers,” it added.
Meta’s Oversight Board, which includes academics, rights experts and lawyers, was created by the company to rule on a small slice of thorny content moderation appeals, but it can also advise on site policies.