Social Media Faces Calls for New Regulation as It Is Blamed for Inciting Extremism
WASHINGTON — Few lawmakers or witnesses at a Senate hearing Thursday disputed that the government should crack down on social media companies like Facebook; the debate was over how it should be done.
The Senate is concerned that social media amplifies threats of domestic extremism and dangerous misinformation.
“It’s simply not enough for companies to pledge that they will get tougher on harmful content,” said Sen. Gary Peters, D-Mich., chairman of the Homeland Security and Governmental Affairs Committee. “Those pledges have gone largely unfulfilled for several years now.”
Much of the problem lies in the complex algorithms that social media companies use to attract and retain users, a practice the industry calls “engagement.”
The companies track the content users choose, based on the keywords they type and the icons they select. Their algorithms then use those signals to predict and serve similar content and advertising, which also increases the companies’ revenue.
However, market research shows that incendiary content expressing extremist ideas attracts the most users, thereby driving the most online traffic and generating the most ad revenue for Facebook, Twitter, YouTube and other companies.
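The keyword-tracking and probability-based recommendation loop described above can be sketched as a toy recommender. Everything here, the scoring rule and the sample posts alike, is hypothetical and for illustration only; no company’s actual ranking algorithm is public, and real systems use learned models rather than raw keyword counts.

```python
from collections import Counter

def rank_posts(user_keywords, posts):
    """Toy engagement ranking: score each post by how often its keywords
    match the keywords this user has previously clicked, then sort so the
    highest-scoring (most 'engaging') posts come first. A real platform
    would use learned probabilities, not raw counts."""
    history = Counter(user_keywords)
    def score(post):
        return sum(history[word] for word in post["keywords"])
    return sorted(posts, key=score, reverse=True)

# Hypothetical sample data, invented for this sketch.
user_keywords = ["election", "election", "vaccine"]
posts = [
    {"id": 1, "keywords": ["sports"]},
    {"id": 2, "keywords": ["election", "fraud"]},
    {"id": 3, "keywords": ["vaccine", "election"]},
]

ranked = rank_posts(user_keywords, posts)
print([p["id"] for p in ranked])  # → [3, 2, 1]
```

The sketch shows why such systems can amplify incendiary material: whatever a user has clicked most is exactly what gets ranked highest the next time, a feedback loop with no notion of whether the content is harmful.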
As a result, social media has been blamed for inadvertently encouraging violence and political radicalization.
Examples mentioned at the hearing included the Jan. 6 insurrection at the U.S. Capitol, misinformation that persuaded some people not to get COVID-19 vaccines and the Oct. 27, 2018, white supremacist shooting at a Pittsburgh synagogue that killed 11 members of the congregation.
Several senators mentioned foreign-based national security risks, such as Islamic extremists who use Facebook to organize attacks.
“Today foreign actors continue to try to weaponize social media,” said Sen. Rob Portman, R-Ohio. “China and Russia use these platforms to try to influence Americans.”
At the same time, he acknowledged that filtering harmful content from among billions of social media users worldwide would be difficult, particularly when most of them are sharing harmless information about family and friends.
“This is not an easy issue for government to be involved in,” Portman said.
One proposed solution would amend Section 230 of the Communications Decency Act, which gives social media companies immunity from liability for content their users post. It says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Supporters of an amendment say it would force the companies to act more responsibly.
Facebook Chief Executive Mark Zuckerberg has told Congress that his company tries to filter hateful, sexual and violent content. He also said removing the liability shield could subject social media companies to endless lawsuits, possibly driving them out of business.
Other regulatory interventions discussed by lawmakers would require the companies to filter content more aggressively or would establish a digital code of conduct, with fines for violations.
As it stands now, social media makes destructive content a lesser priority because “the algorithms are economically driven,” said Karen Kornbluh, director of digital innovation for the German Marshall Fund of the United States, a public policy foundation.
Other senators and witnesses urged the Senate to look before it leaps into what could be ill-advised regulation.
Nathaniel Persily, a Stanford Law School professor of cyber policy, suggested that lawmakers more deeply review the algorithms social media companies use to filter content before imposing new requirements on them.
“Only if we can get access to this data can we regulate intelligently,” Persily said.
So far, social media companies have kept their algorithms secret as proprietary business information.
Sen. Mitt Romney, R-Utah, expressed concern that too much regulation would trample free speech rights of social media companies and their users.
“I don’t know how government does that consistent with the First Amendment,” Romney said.
The Senate hearing caps a series of setbacks for Facebook this month.
At a different Senate hearing, a former Facebook product manager testified that the social media giant puts profits over social responsibility. She placed most of the blame on Zuckerberg, who holds 55% voting control over the company.
In addition, the District of Columbia’s attorney general announced he was trying to compel Zuckerberg to testify in a consumer protection lawsuit.
The District of Columbia’s lawsuit accuses Facebook of failing to protect its users’ personal information before political consulting firm Cambridge Analytica mined the website for data to help Donald Trump’s 2016 presidential campaign, which had hired the firm.
The lawsuit says Zuckerberg participated in corporate decisions that allowed the firm to obtain data about Facebook users most likely to support Trump, despite the fact it violated their privacy.
Tom can be reached at [email protected].