
Meta wants X-style community notes to replace fact checkers

Chris Vallance

Senior technology correspondent

Meta owner Mark Zuckerberg (image: Getty Images)

As fires tore through large parts of Los Angeles this month, so did fake news.

Social media posts spread wild conspiracy theories about the fires, with users sharing misleading videos and wrongly identifying innocent people as looters.

It has brought into sharp focus a question that has dogged social media for as long as it has existed: what is the best way to contain and correct the incendiary sparks of misinformation?

It’s a debate that Mark Zuckerberg, CEO of Meta, has been at the center of.

Shortly after the January 6, 2021, Capitol riots, which were fuelled by false claims of a rigged US presidential election, Mr. Zuckerberg gave testimony to Congress. The billionaire boasted of Meta's "industry-leading fact-checking program".

He noted that the program drew on 80 "independent third-party fact checkers" to curb misinformation on Facebook and Instagram.

Four years on, that system is no longer something he boasts about.

"The fact-checkers have just been too politically biased and have destroyed more trust than they have created, especially in the US," Mr. Zuckerberg said earlier in January.

Taking their place, he said, would be something entirely different: a system inspired by X's "community notes", in which users rather than experts adjudicate on accuracy.

Many experts and fact-checkers have questioned Mr. Zuckerberg’s motives.

"Mark Zuckerberg was clearly pandering to the incoming administration and to Elon Musk," Alexios Mantzarlis, director of the Security, Trust and Safety Initiative at Cornell Tech, told the BBC.

Mr. Mantzarlis is deeply critical of Meta's decision to axe its fact-checkers.

But like many experts, he also makes another point that may have been lost in the storm of criticism Meta faces: that, in principle, community notes-style systems can be part of the solution to misinformation.

Birdwatching

Adopting a fact-checking system inspired by Elon Musk's platform was always going to raise hackles. The world's richest man is regularly accused of using his X account to amplify misinformation and conspiracy theories.

But the system predates his ownership of the platform.

"Birdwatch", as it was then known, began in 2021 and took inspiration from Wikipedia, which is written and edited by volunteers.

Mark Zuckerberg announced the changes in an online video (image: Meta)

Like Wikipedia, community notes rely on unpaid contributors to correct misinformation.

Contributors rate corrective notes under false or misleading posts and, over time, some earn the ability to write them. According to the platform, this group of contributors now numbers around one million.

This kind of system allows platforms to "get more fact checks, more contributions, faster," argues Mr. Mantzarlis, who once ran a crowd-sourced fact-checking project himself.

One of the main attractions of community notes-style systems is their ability to scale: as a platform's user base grows, so does the pool of volunteer contributors (if you can convince them to take part).

According to X, community notes produce hundreds of fact checks a day.

By contrast, Facebook's expert fact-checkers may manage fewer than 10 per day, suggests an article by Jonathan Stray of the UC Berkeley Center for Human-Compatible AI and journalist Eve Snyder.

And one study suggests community notes can be of good quality: an analysis of 205 notes about Covid found 98% were accurate.

A note appended to a misleading post can also cut its viral spread by more than half, X maintains, and research suggests it increases the chance that the original poster will delete the tweet by 80%.

Keith Coleman, who oversees community notes for X, argues that Meta is replacing its fact-checking operation with a more capable program.

"Community notes actually cover a far broader range of content than previous systems," he said.

"That is rarely mentioned. I see stories that say 'fact-checking program ends,'" he said.

"But I think the real story is, 'Meta replaces existing fact-checking program with an approach that can scale to cover more content, respond faster and be trusted across the political spectrum.'"

Fact-checking the fact checkers

But of course, Mr. Zuckerberg did not simply say that community notes were a better system; he actively criticized fact-checkers, accusing them of "bias."

In doing so, he was echoing a long-held belief among American conservatives that big tech is censoring their views.

Others argue that fact-checking inevitably risks suppressing controversial viewpoints.

Silkie Carlo, director of UK civil liberties group Big Brother Watch, which campaigned against YouTube's alleged censorship of a video featuring David Davis MP, told the BBC that allegations of big tech bias have come from across the political spectrum.

Centralized fact-checking by platforms risks "stifling valuable reporting on controversial content," she said.

But Baybars Orsek, managing director of Logically Facts, which provides fact-checking services to Meta in the UK, argues that professional fact-checkers can target the most serious misinformation and identify emerging "harmful narratives."

Community-based systems alone lack the "consistency, objectivity and expertise" to address the most harmful misinformation, he wrote.

Many experts and researchers dispute the allegations of bias levelled at professional fact-checking. Some argue that fact-checkers had simply lost the trust of many conservatives.

That trust, Mr. Mantzarlis argues, was deliberately undermined.

"Fact-checkers started to become arbiters of truth in a substantial way, which troubled partisans and politically motivated people, and suddenly the attacks on them were weaponized," he said.

Trust the algorithm

The solution X uses in an attempt to keep community notes trusted across the political spectrum is to take a key part of the process out of human hands, relying instead on an algorithm.

The algorithm is used to select which notes are displayed, and it is designed to ensure they are found helpful by a broad range of users.

In very simple terms, according to X, a note is only published once contributors who have tended to disagree in their past ratings both rate it as helpful.

The result, X says, is that notes are viewed favorably across the political spectrum. This is confirmed, according to the platform, by regular internal testing, and some independent research also supports that view.
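To illustrate the general idea, here is a minimal, hypothetical sketch in Python. It is not X's published algorithm (the real system models raters and notes jointly from the full rating history); it simply shows the "bridging" principle described above: a note is only published when raters who have tended to disagree with one another both find it helpful. All names and thresholds here are invented for illustration.

```python
# Illustrative sketch only: a toy "bridging" check in the spirit of
# community notes. Group labels and the 0.7 threshold are hypothetical,
# not taken from X's actual system.

from statistics import mean


def should_publish(note_ratings, rater_group, threshold=0.7):
    """Publish a note only if raters from *both* viewpoint groups,
    whose past ratings have tended to diverge, rate it helpful.

    note_ratings: dict of rater_id -> 1 (helpful) or 0 (not helpful)
    rater_group:  dict of rater_id -> "A" or "B"
    threshold:    minimum average helpfulness required from each group
    """
    scores = {"A": [], "B": []}
    for rater, rating in note_ratings.items():
        scores[rater_group[rater]].append(rating)

    # Require ratings from both groups, then agreement from both.
    if not scores["A"] or not scores["B"]:
        return False
    return mean(scores["A"]) >= threshold and mean(scores["B"]) >= threshold


groups = {"u1": "A", "u2": "A", "u3": "B", "u4": "B"}

# Rated helpful by both groups: the note is shown.
print(should_publish({"u1": 1, "u2": 1, "u3": 1, "u4": 1}, groups))  # True

# Rated helpful by only one "side": the note is withheld.
print(should_publish({"u1": 1, "u2": 1, "u3": 0, "u4": 0}, groups))  # False
```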

Meta says its community notes system will require agreement between people with a range of viewpoints to help prevent biased ratings, "just like they do on X."

But that broad agreement is a high bar to clear.

Research indicates that more than 90% of proposed community notes are never shown.

That means accurate notes may never be displayed.

But according to X, showing more notes would undermine the aim of displaying only those notes that a broad range of users find helpful, and doing so would reduce trust in the system.

"More bad stuff"

Even after the fact-checkers are gone, Meta will still employ thousands of moderators who remove millions of pieces of rule-breaking content every day, such as graphic violence and child sexual exploitation material.

But Meta is relaxing its rules around some politically divisive topics such as gender and immigration.

Mark Zuckerberg acknowledged that the changes, designed to reduce the risk of censorship, meant the platform was "going to catch less bad stuff."

This, some experts argue, was the most worrying aspect of Meta's announcement.

The co-chair of Meta's Oversight Board told the BBC there were "huge problems" with what Mr. Zuckerberg had announced.

So what happens from here?

Details of Meta's new plans to tackle misinformation are scarce. In principle, some experts believe community notes systems could be useful, but many also feel they should not be a replacement for fact-checkers.

Community notes are a "fundamentally legitimate approach," writes Professor Tom Stafford of the University of Sheffield, but platforms still need professional fact-checkers too, he believes.

"Crowd-sourcing can be a useful component of [an] information moderation system, but it should not be the only component."
