Senators Charge Addiction Drives Social Media Platforms’ Business Models
WASHINGTON – Even after the call-to-arms of the Kenosha Guard militia group had been flagged on Facebook 450 times, it was not taken down because it did not “meet the standards” for removal, said Sen. Dick Durbin, D-Ill., at a Senate hearing on Tuesday. Two protesters were later shot and killed with an AR-15 by Kyle Rittenhouse, who drove from Illinois to Wisconsin last August to answer the call.
According to Durbin, chairman of the Senate Judiciary Committee, the “consequences” – from Kenosha to January’s Capitol storming – of the algorithms embedded into social media platforms “have never been clearer.” These algorithms have not only changed the way people engage and what they watch, buy and read, but “can drive people to a self-reinforcing echo chamber of extremism,” he said at a subcommittee hearing on the impact the platforms have in shaping the choices, opinions and actions of everyday Americans.
Even though witnesses from social media platforms Facebook, Twitter and Google’s YouTube listed their efforts to mitigate the spread of disinformation or harmful content, the lawmakers slammed their business models, which they charged are primarily based on addiction.
“The business model is addiction. Money is directly correlated to the amount of time that people spend on the site,” said Sen. Ben Sasse, R-Neb.
The “power” of these algorithms “to manipulate social media addiction,” especially of teenagers, is “something that should terrify each of us,” said Sen. Marsha Blackburn, R-Tenn.
This is not the first time the platforms have testified before Congress on how the algorithms embedded in their sites affect their users, with lawmakers questioning whether the platforms can properly police themselves absent federal regulation.
Yet, as the world transitions into a digital economy, “existential threats” are growing exponentially quicker than the country’s ability to “mitigate or respond” to them, said Tristan Harris, former design ethicist at Google and now president of the Center for Humane Technology. For every 200 billion daily posts on Facebook’s WhatsApp, he noted, fact-checkers review only 100.
One particular threat he mentioned was China’s rise as the world transitions into a “digital society.” China, he explained, has a “digital closed society” of surveillance, censorship, thought-control, behavior modification and the like.
On the other hand, he said, the U.S. has a digital “open” society that has turned Americans into a culture that is “constantly immersed in distractions” and “unable to focus on our real problems” like increasing suicide rates from cyberbullying, pandemic mis- and disinformation, civil rights and national security. If China were to fly a fighter jet above the U.S., the Defense Department would shoot it down. “But if they try to fly an information bomb, they are met with a white-gloved algorithm” by one of the platforms asking them which ZIP code they’d want to target, he claimed.
The U.S. values free speech more than most other values, he said, but this comes at a cost when free speech is used as “a Frankenstein monster that spins out blocks of attention virally” in personalized ways to different people to “outrage” them. Harris explained that this personalization just leads a user into their own “rabbit hole of reality.”
The problem, he said, is not just the business model based on engagement, but the actual design model of the platforms themselves, which have created “yellow journalists” out of their users to generate “attention production” for letting loose our “five minutes of moral outrage,” which the users then share with each other. By using algorithms in their editorial strategy instead of a human sorting the content for the user, he added, they have created a “values-blind process” that then allows harms to pop up “in all the blindspots.”
And “value blindness destroys our democracy faster than people are essentially raising the alarms,” he said.
This is not to say that the platforms have not been trying to curtail these issues, he said, but they are “trapped” by their algorithms and end up favoring content that is “addicted, outraged, polarized, narcissistic and disinformed.”
The more “extreme” the content, the more “likes” and “follows” a user will get, and the more the user will continue to engage in this “attention treadmill,” he explained, because “if it bleeds, it leads.”