Senators Charge Addiction Drives Social Media Platforms’ Business Models

April 28, 2021 by Victoria Turner

WASHINGTON – Even after the call-to-arms of the Kenosha Guard militia group had been flagged on Facebook 450 times, it was not taken down because it did not “meet the standards” for removal, said Sen. Dick Durbin, D-Ill., at a Senate hearing on Tuesday. The call was answered last August by Kyle Rittenhouse, who drove from Illinois to Wisconsin and shot two protesters to death with an AR-15.

According to Durbin, chairman of the Senate Judiciary Committee, the “consequences” – from Kenosha to January’s Capitol storming – of the algorithms embedded into social media platforms “have never been clearer.” These algorithms have not only changed the way people engage and what they watch, buy and read, but “can drive people to a self-reinforcing echo chamber of extremism,” he said at a subcommittee hearing on the impact the platforms have in shaping the choices, opinions and actions of everyday Americans. 

Even though witnesses from social media platforms Facebook, Twitter and Google’s YouTube listed their efforts to mitigate the spread of disinformation or harmful content, the lawmakers slammed their business models, which they charged are primarily based on addiction.

“The business model is addiction. Money is directly correlated to the amount of time that people spend on the site,” said Sen. Ben Sasse, R-Neb. 

The “power” of these algorithms “to manipulate social media addiction,” especially of teenagers, is “something that should terrify each of us,” said Sen. Marsha Blackburn, R-Tenn. 

This is not the first time the platforms have testified before Congress on how the algorithms embedded in their sites affect their users, with lawmakers questioning whether the platforms can properly police themselves absent federal regulation. 

Yet, as the world transitions into a digital economy, “existential threats” are growing exponentially faster than the country’s ability to “mitigate or respond” to them, said Tristan Harris, former design ethicist at Google and now president of the Center for Humane Technology. For every 200 billion daily posts on Facebook’s WhatsApp, he noted, fact-checkers review only 100.

One particular threat he mentioned was China’s rise in the world’s transition into a “digital society.” China, he explained, has a “digital closed society” of surveillance, censorship, thought-control, behavior modification and the like. 

The U.S., on the other hand, has a digital “open” society, he said, one whose culture is “constantly immersed in distractions” and “unable to focus on our real problems,” like rising suicide rates from cyberbullying, pandemic mis- and disinformation, civil rights and national security. If China were to fly a fighter jet over the U.S., the Defense Department would shoot it down. “But if they try to fly an information bomb, they are met with a white-gloved algorithm” from one of the platforms asking which ZIP code they’d like to target, he claimed.

The U.S. prizes free speech above most other values, he said, but this comes at a cost when free speech is turned into “a Frankenstein monster that spins out blocks of attention virally,” personalized in different ways to different people in order to “outrage” them. That personalization, Harris explained, leads each user into their own “rabbit hole of reality.”

The problem, he said, is not just the business model based on engagement, but the design model of the platforms themselves, which have turned their users into “yellow journalists” who generate “attention production” by letting loose their “five minutes of moral outrage,” which users then share with one another. By relying on algorithms rather than human editors to sort content for the user, he added, the platforms have created a “values-blind process” that allows harms to pop up “in all the blind spots.”

And “value blindness destroys our democracy faster than people are essentially raising the alarms,” he said. 

This is not to say that the platforms have not been trying their best to curtail these issues, he said, but they are “trapped” by their algorithms and end up favoring content that is “addicted, outraged, polarized, narcissistic and disinformed.” 

The more “extreme” the content, the more “likes” and “follows” a user will get, and the more the user will continue to engage in this “attention treadmill,” he explained, because “if it bleeds, it leads.” 
