Facebook Corrections Found to Reduce Misinformation 

February 15, 2022 by Reece Nations
An illustration of the Facebook logo, on May 9, 2016. Facebook won a court case in China against Zhongshan Pearl River Drink Factory for using the name face book. (Jaap Arriens/NurPhoto/Sipa USA)

WASHINGTON — Fact-checks added to posts on a Facebook news feed were found to reduce users’ acceptance of misinformation on the platform, according to a new study published in the Journal of Politics.

The study, conducted by assistant professors Ethan Porter of George Washington University and Thomas Wood of Ohio State University, simulated Facebook news feeds and administered a survey to gauge users’ accuracy in identifying misinformation. When the respondents were exposed to misinformation and subsequent fact-checks, the researchers found their ability to identify untrue posts improved.

The experiment was administered through two large and nationally representative population samples, and the respondents in one group were free to ignore the fact-checked information just as they could on the genuine platform. Porter told The Well News the findings were consistent among both conservative and liberal respondents.

“There have been widespread concerns about misinformation on Facebook and those concerns have prompted interest in potential solutions,” Porter said. “And one possible solution is fact-checking, correcting posts that are false on the platform and then maximizing the reach of those who see corrections. We wanted to know if such a solution would work at all on correcting false beliefs generated by misinformation.”

The study’s first phase showed users randomized, apolitical “placebo” content alongside multiple posts containing misinformation and factual corrections. The false content featured claims that had actually circulated on Facebook about former President Donald Trump, climate activist Greta Thunberg and Rep. Ilhan Omar, D-Minn., among others.

Respondents were asked to rate the truthfulness of the posts on a one-to-five scale without seeing a fact-check. The second phase explored whether respondents were more, less or equally likely to believe the fake posts after those posts were blurred and flagged with corrective information.

Users’ belief in misinformation was consistently reduced when they were confronted with a fact-check. Porter said the possible outcomes of exposure to fact-checks were an increase in accuracy, a decrease in accuracy known as the “backfire effect,” or no change either way.

“You might be especially concerned about that possibility when it comes to Facebook or social media, given that people simply have a lot on their plate at any one time,” he said. “They have a lot of stuff in their newsfeed, so it’s not clear that the corrections will make a difference. We didn’t observe either backfire or inertness. Instead, we observed people becoming more accurate as a result of this simulation of a Facebook news feed.”

Porter and Wood coauthored a separate study on the so-called backfire effect in December 2017, describing how some fact-checks could lead users to believe false information more strongly than they had previously. Conservative respondents in that study became more convinced of the U.S. military’s justifications for the Iraq invasion when presented with information maintaining that no weapons of mass destruction were present.

The findings of Porter’s latest study contradict previous research that found issuing corrections on social media to be ineffective at reducing misinformation. Even when users could ignore the corrective information, the researchers found they trusted the fact-checks more often than not.

“What Facebook needs to do is make sure that people who see misinformation see fact-checks,” Porter told The Well News. “Right now, Facebook’s policy is actually quite bizarre. When people promote, share or write up misinformation — if that misinformation is widely trafficked — Facebook will underwrite the cost of a fact-check by an outside fact-checking organization.”

Porter continued, “But if you or I stumble upon that misinformation shared by somebody else in our newsfeed, Facebook does not compel us to then see the subsequent fact-check which is paid for. So, I think the fix is pretty straightforward. If you see misinformation that is later fact-checked, you should then see the fact-check after you log on to Facebook.”

Reece can be reached at reece@thewellnews.com