
Fact-checkers warn that Meta users may now be left to wade through hate and misinformation on their own.

A “Speed Bump in the Way of False Information”: Fact-Checkers React as Meta Ends Its Program

Meta has rolled back the fact-checking and moderation policies it put in place after revelations that the platform had been exploited by influence operations designed to sway elections and promote violence.

Zuckerberg blamed the “legacy media” for forcing Facebook to implement moderation policies after the 2016 election. “After Trump first got elected in 2016 the legacy media wrote non-stop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming arbiters of truth, but the fact checkers have just been too politically biased and have destroyed more trust than they’ve created.”

Jesse Stiller, managing editor at Check Your Fact, a Meta fact-checking partner with a 10-person newsroom, tells WIRED the organization had no warning. “This was totally unexpected and out of left field for us. We weren’t aware this decision was being considered until Mark dropped the video overnight.”

Poynter owns PolitiFact, one of the fact-checking partners Meta works with in the US. Angie Drobnic Holan worked as an editor at PolitiFact before being promoted to her current position at the International Fact-Checking Network (IFCN). What makes the fact-checking program effective, Holan says, is that it serves as a “speed bump in the way of false information”: flagged content typically has a screen placed over it telling users that fact-checkers found the claim questionable and asking whether they still want to view it.

That process covers a broad range of topics, from false information about celebrities dying to claims about miracle cures, Holan notes. The program launched in 2016 amid public concern that social media could amplify false rumors, such as claims that the pope had endorsed Donald Trump.

“A Bending of the Knee to Trump”: Critics Warn of a Race to the Bottom

Nina Jankowicz, CEO of the American Sunlight Project, called the announcement a bending of the knee to Trump and an attempt to catch up to Musk in his race to the bottom. A Syracuse University professor said the implications will be widespread.

Twitter launched its community moderation program, called Birdwatch at the time, in 2021, before Musk took over. Musk, who helped bankroll Trump’s campaign and is now set to lead the incoming administration’s new “Department of Government Efficiency,” leaned into Community Notes after slashing the teams responsible for content moderation at Twitter. Hate speech — including slurs against Black and transgender people — increased on the platform after Musk bought the company, according to research by the Center for Countering Digital Hate. (Musk then sued the center, but a federal judge dismissed the case last year.)

Meta also says it will lift a number of restrictions on topics like immigration, gender identity, and gender that are the subject of frequent political discourse and debate.

Scientists and environmental groups are wary of the changes at Meta, too. Kate Cell, senior climate campaign manager at the Union of Concerned Scientists, said in an email that Mark Zuckerberg’s decision to abandon fact-checking and the correction of misinformation will make it easier for anti-scientific content to spread on Meta platforms.

“I think this is a terrible decision … disinformation’s effects on our policies have become more and more obvious,” says Michael Khoo, a climate disinformation program director at Friends of the Earth. He suggests attacks on wind power are an example.

Khoo also likens the Community Notes approach to the fossil fuel industry’s marketing of recycling as a solution to plastic waste. In reality, recycling has done little to stem the tide of plastic pollution flooding into the environment, since the material is difficult to reprocess and many plastic products are not really recyclable. The strategy also puts the onus on consumers to deal with a company’s waste. Tech companies, he argues, likewise need to own the disinformation problem their platforms create.

Meta Loosens Rules on Political Issues and Culture-War Topics as Fact-Checking Partners Scramble

In a post on Meta’s official website, chief global affairs officer Joel Kaplan said the decision was made to allow more topics to be openly discussed on the company’s platforms. The change will first apply in the US.

Kaplan said Meta would allow more speech by lifting restrictions on topics that are part of mainstream discourse and by focusing enforcement on illegal and high-severity violations.

In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content returning to people’s feeds as well as posts on other issues that have inflamed the culture wars in the US in recent years.

Ahead of last year’s high profile elections across the globe, Meta was criticized for taking a hands-off approach to content moderation related to those votes.

Over time, Kaplan wrote, too much content was being fact-checked that people could reasonably understand as legitimate political speech and debate.

WIRED reported last year that dangerous content had flourished on the platform, while anti-government militias had used Facebook to recruit new members.

In an attempt to remove perceived bias, Meta’s trust and safety team will move from California to Texas, which is also home to X. Zuckerberg said that basing the work in places where there is less concern about the bias of Meta’s teams will help the company build trust and promote free expression.

Lead Stories has a diverse revenue stream and most of its operations are outside the US, but its founder, Alan Duke, believes the decision will still have an impact on the organization. “The most painful part of this is losing some very good, experienced journalists, who will no longer be paid to research false claims found on Meta platforms,” Duke says.

The news organizations that had worked with Meta to fight misinformation are scrambling to figure out how the change will affect them.

Meta partners with dozens of fact-checking organizations and newsrooms across the globe, 10 of which are based in the US, where Meta’s new rules will first be applied.

The news that Meta would no longer use their services was announced in Kaplan’s blog post. Meta will instead adopt X-style Community Notes, which let users flag content they believe is inaccurate or requires further explanation.

Alan Duke, the founder and editor in chief of fact-checking site Lead Stories, says he heard the news the same way as everyone else. “No advance notice.”