Meta would like to introduce its next fact-checker, one who will spot falsehoods, write convincing corrections and warn others about misleading content.
It's you.
Meta CEO Mark Zuckerberg announced Tuesday that the company is ending many of its moderation efforts, including third-party fact-checking and content restrictions. Instead, he said, the company plans to hand over fact-checking duties to the public under a model called Community Notes. This model, popularized by X, allows users to leave fact-checks and corrections on social media posts.
The announcement signals the end of an era of content moderation and the adoption of looser guidelines that Zuckerberg acknowledged would increase the amount of false and misleading content on the world's largest social network.
“This is going to fail spectacularly,” said Alex Mahadevan, director of MediaWise, the Poynter Institute's media literacy program, who has studied Community Notes on X. “Users would be left to hold themselves accountable.”
Such a development would have been unimaginable after the 2016 and 2020 presidential elections, when social media companies cast themselves as reluctant warriors on the front lines of a war on misinformation. Falsehoods that spread during the 2016 presidential election set off a public backlash and an internal debate over the companies' role in spreading so-called fake news.
The companies responded by spending millions of dollars on content moderation: paying third-party fact-checkers, building complex algorithms to limit harmful content, and rolling out one warning label after another to slow the spread of falsehoods. The measures were seen as necessary to restore public confidence.
The effort succeeded to some extent. Fact-checking labels were effective at reducing belief in falsehoods, researchers found, though less so among conservative Americans. But the effort also made the platforms, and Mr. Zuckerberg in particular, political targets of President-elect Donald J. Trump and his allies, who argued that content moderation amounted to censorship.
Now the political environment has changed. With Mr. Trump set to take control of the White House and the regulators who oversee Meta, Mr. Zuckerberg has pivoted to repairing relations with Mr. Trump: dining at Mar-a-Lago, adding a Trump ally to Meta's board of directors and donating $1 million to Mr. Trump's inaugural fund.
“The recent elections feel like a cultural tipping point toward once again prioritizing speech,” Mr. Zuckerberg said in a video announcing the changes.
Mr. Zuckerberg's bet on Community Notes in place of professional fact-checkers was inspired by a similar experiment at X, where the billionaire owner, Elon Musk, handed the company's fact-checking duties over to users.
X now asks ordinary users to identify falsehoods and write corrections or add context to social media posts. The exact details of Meta's program are unclear, but on X, notes are initially visible only to users who sign up for the Community Notes program. Once a note receives enough votes rating it as valuable, it is appended to the post for everyone to see.
“The dream for social media platforms is fully automated moderation where, one, they don't have to be responsible, and two, they don't have to pay anyone,” said Mr. Mahadevan, the MediaWise director. “So Community Notes is an absolute dream for these people. They've been trying to design a system that basically automates fact-checking.”
Mr. Musk, another Trump ally, was an early champion of Community Notes. He quickly expanded the program after laying off most of the company's trust and safety team.
Research shows that Community Notes works to dispel some viral falsehoods. The approach works best for topics on which there is broad consensus, researchers have found, such as misinformation about coronavirus vaccines.
On that topic, the notes “emerged as an innovative solution, pushing back with accurate and reliable health information,” said John W. Ayers, associate director of innovation in the division of infectious diseases and global public health at the University of California San Diego School of Medicine, who wrote a report on the subject in April.
But misleading posts on politically divisive subjects often go unchecked, because users with differing political views must agree on a fact-check before it can be publicly appended to a post. MediaWise research shows that fewer than 10 percent of user-drafted Community Notes end up being published; the figure is even lower for sensitive topics like immigration and abortion.
Researchers also found that while the majority of X posts receive most of their traffic within the first few hours, a Community Note can take days to be approved and become visible to everyone.
Since its debut in 2021, the program has drawn interest from other platforms. YouTube announced last year that it was starting a pilot project allowing users to submit notes beneath misleading videos. The usefulness of those fact-checks is still being assessed by third-party evaluators, YouTube said in a blog post.
Although Meta's existing moderation tools seemed overwhelmed by the flood of false and misleading content, researchers found the interventions to be fairly effective. A study published last year in the journal Nature Human Behaviour showed that warning labels, like those Facebook used to flag false information, reduced belief in falsehoods by 28 percent and cut how often the content was shared by 25 percent. Right-wing users were far more distrustful of fact-checks, the researchers found, but the interventions still reduced their belief in false content.
“All of the research shows that the more speed bumps there are, essentially the more friction there is on a platform, the less spread of low-quality information you get,” said Claire Wardle, associate professor of communication at Cornell University.
Researchers believe that community fact-checking can be effective when paired with in-house content moderation efforts. But Meta's hands-off approach could prove risky.
“Community-based approaches are one piece of the puzzle,” said Valerie Wirtschafter, a fellow at the Brookings Institution who has studied Community Notes. “But they can't be the only thing, and they certainly can't just be rolled out as a wholesale solution without any tailoring.”