Meta, Facebook to drop fact-checkers: What does this mean for social media?

Meta, the owner of Facebook and other social media platforms, will implement major changes to its content moderation policies, founder Mark Zuckerberg announced this week in a video titled, “More speech and fewer mistakes”.

Among the changes, Meta’s use of fact-checking organisations will be abolished and the group will switch instead to a system of community notes – similar to that used by the X platform.

The move, revealed on Tuesday, comes as tech executives brace for the arrival of incoming US President Donald Trump, whose right-wing supporters have long decried online content moderation as a tool of censorship.

So why is this happening now and will it lead to more misinformation?

What did Zuckerberg announce about Meta this week and why?

In a video posted to his social media platforms, Zuckerberg explained that Meta plans to scrap fact-checking in favour of a system of community notes, which users can attach to others’ posts that they believe contain misleading or false information. Meta plans to roll the system out in the coming months.

“Our system attached real consequences in the form of intrusive labels and reduced distribution. A programme intended to inform too often became a tool to censor,” Zuckerberg said in the video.

While this policy will extend to all subject matters, Zuckerberg singled out the issues of “gender and immigration” in particular.

Is Meta also moving operations to Texas? Why?

Meta plans to relocate its content moderation teams from California to Texas, hoping the move will “help us build trust” while creating “less concern about the bias of our teams”. Some experts see the move as politically motivated and say it could have negative implications for how political content is handled on Meta’s platforms.

“This decision to move to Texas is born out of both some practicality and also some political motivation,” said Samuel Woolley, founder and former director of propaganda research at the University of Texas at Austin’s Center for Media Engagement, speaking to the digital news outlet The Texas Tribune.

“The perception of California in the United States and among those in the incoming [presidential] administration is very different than the perception of Texas,” he added.

Zuckerberg appears to be following in the footsteps of Musk, who shifted Tesla’s headquarters to Austin, Texas in 2021. In an X post in July, Musk also expressed interest in moving his other ventures, X and SpaceX, from California to Texas, citing California’s recently enacted SAFETY Act, signed by Governor Gavin Newsom, which bars schools from requiring teachers to notify parents when a student asks to be recognised by a “gender identity” that differs from their sex.

How has content moderation on Meta platforms worked until now?

Until now, Meta’s social media platforms, such as Facebook and Threads, have used third-party fact-checking organisations to verify the accuracy of content posted to each platform.

These organisations evaluate content and flag misinformation for further scrutiny. When a fact-checker determines a piece of content to be false, Meta substantially limits its distribution so that it reaches a far smaller audience. However, third-party fact-checkers do not have the authority to delete content, suspend accounts or remove pages from the platform. Only Meta can remove content from its platforms that violates its Community Standards and Ads policies. This includes, but is not limited to, hate speech, fraudulent accounts and terrorist-related material.
