Facebook Corrections Found to Reduce Misinformation 

February 15, 2022 by Reece Nations
An illustration of the Facebook logo, on May 9, 2016. Facebook won a court case in China against Zhongshan Pearl River Drink Factory for using the name face book. (Jaap Arriens/NurPhoto/Sipa USA)

WASHINGTON — Fact-checks added to posts on a Facebook news feed were found to reduce users’ acceptance of misinformation on the platform, according to a new study published in the Journal of Politics.

The study, conducted by assistant professors Ethan Porter of George Washington University and Thomas Wood of Ohio State University, simulated Facebook news feeds and administered a survey to gauge users’ accuracy in identifying misinformation. When the respondents were exposed to misinformation and subsequent fact-checks, the researchers found their ability to identify untrue posts improved.

The experiment was administered to two large, nationally representative samples, and respondents in one group were free to ignore the fact-checked information, just as they could on the genuine platform. Porter told The Well News the findings were consistent among both conservative and liberal respondents.

“There have been widespread concerns about misinformation on Facebook and those concerns have prompted interest in potential solutions,” Porter said. “And one possible solution is fact-checking, correcting posts that are false on the platform and then maximizing the reach of those who see corrections. We wanted to know if such a solution would work at all on correcting false beliefs generated by misinformation.”

The study’s first phase exposed respondents to randomized, apolitical “placebo” content along with multiple posts containing misinformation and accompanying factual corrections. The fabricated posts drew on false claims that had actually circulated on Facebook about former President Donald Trump, climate activist Greta Thunberg and Rep. Ilhan Omar, D-Minn., among others.

Respondents were first asked to rate the truthfulness of the posts on a one-to-five scale without seeing a fact-check. The second phase examined whether respondents were more, less or equally likely to believe the fake posts after the posts were blurred and flagged with corrective information.
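
As an illustration of the study’s core comparison, the short sketch below (in Python, using made-up ratings; it is not the authors’ actual analysis code) shows how the average shift in belief might be computed when the same false posts are rated with and without a fact-check label.

    # Illustrative sketch only, with hypothetical data -- not the authors' analysis code.
    # Each respondent rates the accuracy of a false post on a one-to-five scale,
    # once without a fact-check and once after the post is blurred and flagged.
    from statistics import mean

    ratings_without_correction = [4, 3, 5, 4, 2, 4, 3, 5]  # hypothetical ratings
    ratings_with_correction = [2, 2, 3, 1, 2, 3, 2, 2]     # hypothetical ratings

    # A negative difference means respondents rated the false posts as less
    # accurate after seeing the correction -- the pattern the study reports.
    effect = mean(ratings_with_correction) - mean(ratings_without_correction)
    print(f"Mean rating without correction: {mean(ratings_without_correction):.2f}")
    print(f"Mean rating with correction:    {mean(ratings_with_correction):.2f}")
    print(f"Average change in belief:       {effect:+.2f}")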

Users’ belief in misinformation was consistently reduced when they were confronted with a fact-check. Porter said the possible outcomes of exposure to fact-checks were an increase in accuracy, a decrease in accuracy known as the “backfire effect,” or no change either way.

“You might be especially concerned about that possibility when it comes to Facebook or social media, given that people simply have a lot on their plate at any one time,” he said. “They have a lot of stuff in their newsfeed, so it’s not clear that the corrections will make a difference. We didn’t observe either backfire or inertness. Instead, we observed people becoming more accurate as a result of this simulation of a Facebook news feed.”

Porter and Wood coauthored a separate study on the so-called backfire effect in December 2017, which described how some fact-checks could lead users to believe false information more strongly than they had previously. Conservative respondents in that study were found to be more convinced of the U.S. military’s justifications for the Iraq invasion when they were presented with information maintaining that no weapons of mass destruction were present.

The findings of Porter’s latest study run counter to previous research that found issuing corrections on social media was ineffective at reducing misinformation. Even when users could ignore the corrected information, the researchers found they trusted the fact-checks more often than not.

“What Facebook needs to do is make sure that people who see misinformation see fact-checks,” Porter told The Well News. “Right now, Facebook’s policy is actually quite bizarre. When people promote, share or write up misinformation — if that misinformation is widely trafficked — Facebook will underwrite the cost of a fact-check by an outside fact-checking organization.”

Porter continued, “But if you or I stumble upon that misinformation shared by somebody else in our newsfeed, Facebook does not compel us to then see the subsequent fact-check which is paid for. So, I think the fix is pretty straightforward. If you see misinformation that is later fact-checked, you should then see the fact-check after you log on to Facebook.”

Reece can be reached at [email protected]

