Meta CEO Mark Zuckerberg announced a series of major changes to the company’s moderation policies and practices Tuesday, citing a shifting political and social landscape and a desire to embrace free speech, NBC News reports.
Zuckerberg said that Meta will end its fact-checking program with trusted partners and replace it with a community-driven system similar to X’s Community Notes.
The company is also making changes to its content moderation policies around political topics and undoing changes that reduced the amount of political content in user feeds, Zuckerberg said. The changes will affect Facebook and Instagram.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. More specifically, here’s what we’re going to do. First, we’re going to get rid of fact checkers and replace them with community notes similar to X, starting in the U.S.,” Zuckerberg said in a video.
He pointed to the election as a major influence on the company’s decision and criticized “governments and legacy media” for allegedly pushing “to censor more and more”: “The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech. So we’re gonna get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms… We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes. Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship.”
Beyond the end of the fact-checking program, Zuckerberg said the company will be eliminating some content policies around hot-button issues including immigration and gender, refocusing the company’s automated moderation systems on what he called “high severity violations,” and relying on users to report other violations.
Facebook will also be moving its trust and safety and content moderation teams from California to Texas: “We’re also going to tune our content filters to require much higher confidence before taking down content. The reality is that this is a trade-off. It means we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”