Congress Tries to Take Down Social Media Disinformation
WASHINGTON — Social media experts described organized disinformation campaigns on social media as a threat to democracy during a congressional hearing Thursday.
Confusion created by the disinformation is leading some Americans to perceive threats to their health, safety or political leadership where none exist, witnesses told the House Intelligence Committee.
“Domestic disinformation now runs rampant,” said Nina Jankowicz, a fellow at the Wilson Center, a Washington, D.C.-based public policy foundation on international issues.
Previously the threat came primarily from abroad, as countries such as Russia, China and Iran sought to use social media to advance their interests, according to witnesses at the hearing. They cited Russia's efforts in 2016 to support the presidential candidacy of Donald Trump while using propaganda to discredit his opponent.
Although the foreign threat continues, U.S.-based actors are now adding to the disinformation, Jankowicz said.
“It does our adversaries’ work for them,” she said.
Members of Congress are considering options to counter the disinformation, such as legislation that would penalize Facebook, Twitter and YouTube for allowing threatening or illicit information to be posted on their sites.
Other options would restrict the companies' advertising content or use tax incentives to encourage the internet companies to remove disinformation.
Disinformation refers to false or misleading information that is spread deliberately to deceive.
However, members of the House Intelligence Committee also said they wanted to avoid measures for controlling disinformation that could amount to censorship.
Adam Schiff, D-Calif., chairman of the House Intelligence Committee, acknowledged that social media companies were making progress in blocking disinformation, particularly after the criticism they endured for allowing Russians to exploit them during the 2016 election.
“Social media companies bear some responsibility but the private sector alone” could not be blamed for all the disinformation, Schiff said.
He hinted at the possibility of new legislation but gave no details.
Facebook, Twitter and YouTube rely primarily on sophisticated algorithms to identify and block false or misleading information, though they acknowledge that some disinformation still slips through the net of their computer code.
Schiff mentioned white supremacists and promoters of QAnon, a far-right conspiracy movement, as examples of what he called a “pernicious” threat propagated through social media.
The QAnon conspiracy theory alleges that a group of Satan-worshiping pedophiles is running a global child sex-trafficking ring and plotting against Trump. The theory commonly asserts that Trump is planning a day of reckoning known as “The Storm,” when thousands of members of the group will be arrested. There is no evidence for any part of the theory.
Rep. Denny Heck, D-Wash., said that when he saw similar disinformation on the Internet, he thought, “How could anyone believe that?”
Recent examples of disinformation include false claims of election interference and conspiracy theories asserting that the COVID-19 pandemic is a sham.
A Facebook page called The Other 98% reported in August that mailboxes were being blocked by unknown persons to prevent mail-in voting. The post collected 39,000 likes and comments and reached 18 million viewers, according to CrowdTangle, a Facebook-owned tool for analyzing social media.
The media insights company Zignal Labs reported that nearly a fourth of last month's references to voting by mail in television, print and online news were inaccurate.
Another false rumor spreading on Facebook claims that a so-called “deep state” of government insiders is interfering with the election by inventing the coronavirus pandemic.