Can Online Platforms Properly Police Themselves?

March 26, 2021 by Victoria Turner

WASHINGTON – In 2018, Facebook CEO Mark Zuckerberg apologized to a U.S. House committee when presented with active drug trafficking posts on his site, admitting social media platforms need better “policing” of such content. 

Three years later, despite Facebook having “cleaned up its act,” the problem has trickled down to its subsidiary Instagram, said Rep. David McKinley, R-W.Va., during yesterday’s joint hearing of two House subcommittees.

“If we can find these [posts] this easily … shame on you for not finding them for yourself,” McKinley told the CEO.

The platform should be held liable for allowing the distribution of this “poison” to vulnerable adults and children, he charged, just as a retail store would be for selling tobacco products to minors, or a drug manufacturer for “dumping pills” into communities and causing thousands of deaths. Some of these posts, he added, have been active since last fall.

Zuckerberg admitted this was a “huge issue” but said it was nearly impossible to catch every bad actor given the size of the communities the platforms host, which share millions to billions of messages a day.

“It is inevitable that we will not find everything just like a police force in a city will not stop every single crime,” Zuckerberg explained. 

He said Congress should hold online platforms accountable for having “effective” content moderation systems in place, not liable for every bad actor that slips through the cracks.

Publishers such as newspapers, television and radio stations are held accountable for the content they distribute and can be fined or sued for damages. Online platforms, by contrast, are protected by Section 230 of the 1996 Communications Decency Act, a liability shield that keeps them from being held responsible for third-party content published on their sites.

Zuckerberg, who created an independent oversight board to review Facebook’s content decisions, said there were two changes he would support to Section 230 to “reflect…the modern reality.”

He suggested platforms should commit to regularly submitting transparency reports on the “prevalence” of harmful content like child pornography and exploitation, drug and sex trafficking, terrorist content, content provoking violence, and anything “clearly illegal.” Facebook, he added, already publishes a similar quarterly report on the frequency of this content and the efficacy of its systems in catching it.

The second part of his proposal would “[reasonably]…condition immunity” on the larger online platforms being held accountable for moderating and removing this content.

Both Google CEO Sundar Pichai and Twitter CEO Jack Dorsey agreed that more transparency and accountability are needed. Nevertheless, Dorsey, who did not even mention Section 230 in his written testimony, cautioned against the risk of government oversight dictating what should or should not be published.

“We may end up with a service that couldn’t be used to question the government,” he said. 

Zuckerberg added, however, that the liability shield should not be removed for smaller platforms trying to compete, since they lack the scale the large tech giants have to build out the systems needed to monitor content.

Dorsey agreed that smaller companies have a harder time reckoning with regulations that put “enormous resource requirements on businesses and services, which would further entrench those who are able to afford it.”

But lawmakers questioned how capable these tech giants actually are of policing themselves, given how much harmful content still circulates.

Earlier this year, a YouTube video showcasing a homemade baby formula led to the hospitalization of two infants in Delaware, charged Rep. Lisa Blunt Rochester, D-Del. The video stayed up despite FDA advisories against homemade formulas, and one of the babies was left with brain damage after going into cardiac arrest.

Together with other examples raised during the hearing, she said, this showed why “we should be concerned by all of your abilities to adequately, and just as importantly, rapidly moderate content.”

“In some of these cases, we’re talking life and death,” Blunt Rochester said.
