Can Online Platforms Properly Police Themselves?
WASHINGTON – In 2018, Facebook CEO Mark Zuckerberg apologized to a U.S. House committee when presented with active drug trafficking posts on his site, admitting social media platforms need better “policing” of such content.
Three years later, despite Facebook having “cleaned up its act,” the problem has now trickled down into its subsidiary Instagram, said Rep. David McKinley, R-W.Va., during yesterday’s joint hearing by two House subcommittees.
“If we can find these [posts] this easily … shame on you for not finding them for yourself,” McKinley criticized the CEO.
The platform should be held liable for allowing the distribution of this “poison” to vulnerable adults and children, he charged, just like a retail store selling tobacco products to minors would be, or drug manufacturers “dumping pills” into communities that lead to thousands of deaths. Some of these posts, he added, have been active since last fall.
Zuckerberg admitted this was a “huge issue” but claimed it was nearly impossible to catch every bad actor given the size of the communities they operate in, sharing millions to billions of messages a day.
“It is inevitable that we will not find everything just like a police force in a city will not stop every single crime,” Zuckerberg explained.
He said Congress should hold online platforms accountable for building “effective” content moderation systems, not liable for the bad actors who slip through the cracks.
Publishers like newspapers, television and radio stations are held accountable for the content they publish and can be subject to fines or damages. Online platforms are different: in 1996, Congress passed the Communications Decency Act, whose Section 230 shields platforms from liability for third-party content published on their sites.
Zuckerberg, who created an independent oversight board to review Facebook’s content decisions, said there were two changes he would support for Section 230 to “reflect…the modern reality.”
He suggested platforms should commit to regularly submitting transparency reports on the “prevalence” of harmful content like child pornography and exploitation, drug and sex trafficking, terrorist content, content provoking violence, and anything “clearly illegal.” Facebook, he added, has been doing a similar quarterly report on the frequency of this content and the efficacy of its system to capture it.
The second part of his proposal would “[reasonably]…condition immunity” on the larger online platforms moderating and removing this content, holding them accountable for doing so.
Both Google CEO Sundar Pichai and Twitter CEO Jack Dorsey agreed that more transparency and accountability are needed. Nevertheless, Dorsey, who did not even mention Section 230 in his written testimony, cautioned against the risk of government oversight dictating what companies should or should not publish.
“We may end up with a service that couldn’t be used to question the government,” he said.
Zuckerberg added, however, that the liability shield should not be removed for smaller platforms trying to compete, since they lack the scale the large tech giants have to build out the systems needed to monitor content.
Dorsey agreed that smaller companies would have a harder time reckoning with regulations that put “enormous resource requirements on businesses and services, which would further entrench those who are able to afford it.”
But lawmakers questioned how capable these tech giants actually are in policing themselves, given how much harmful content still circulates.
Earlier this year, a YouTube video showcasing a homemade baby formula recipe led to the hospitalization of two infants in Delaware, charged Rep. Lisa Blunt Rochester, D-Del. The video was posted despite FDA advisories against homemade formulas, and one of the babies was left with brain damage after suffering cardiac arrest.
Alongside other examples that came up during the hearing, she said, this showcased how “we should be concerned by all of your abilities to adequately, and just as importantly, rapidly moderate content.”
“In some of these cases, we’re talking life and death,” Rochester said.