
Can Online Platforms Properly Police Themselves?

March 26, 2021 by Victoria Turner
Facebook CEO Mark Zuckerberg

WASHINGTON – In 2018, Facebook CEO Mark Zuckerberg apologized to a U.S. House committee when presented with active drug trafficking posts on his site, admitting social media platforms need better “policing” of such content. 

Three years later, despite Facebook having “cleaned up its act,” the problem has now trickled down into its subsidiary Instagram, said Rep. David McKinley, R-W.Va., during yesterday’s joint hearing by two House subcommittees. 

“If we can find these [posts] this easily … shame on you for not finding them for yourself,” McKinley told the CEO.

The platform should be held liable for allowing the distribution of this “poison” to vulnerable adults and children, he charged, just as a retail store selling tobacco products to minors would be, or drug manufacturers “dumping pills” into communities and causing thousands of deaths. Some of these posts, he added, have been active since last fall.

Zuckerberg admitted this was a “huge issue” but said it was nearly impossible to catch every bad actor given the size of the communities his platforms host, which share millions to billions of messages a day.

“It is inevitable that we will not find everything just like a police force in a city will not stop every single crime,” Zuckerberg explained. 

He said Congress should hold online platforms liable for building “effective” content moderation systems, not for the bad actors who slip through the cracks.

Publishers such as newspapers, television and radio stations are held accountable for the content they publish and can face fines or damages. Online platforms, by contrast, are shielded from liability for third-party content published on their sites by Section 230 of the Communications Decency Act, which Congress adopted in 1996.

Zuckerberg, who created an independent oversight board to review Facebook’s content decisions, said there were two changes he would support for Section 230 to “reflect…the modern reality.”

He suggested platforms should commit to regularly submitting transparency reports on the “prevalence” of harmful content like child pornography and exploitation, drug and sex trafficking, terrorist content, content provoking violence, and anything “clearly illegal.” Facebook, he added, already publishes a similar quarterly report on the frequency of this content and the efficacy of its systems for capturing it.

The second part of his proposal would “[reasonably]…condition immunity” for larger online platforms on whether they moderate and remove this content.

Both Google CEO Sundar Pichai and Twitter CEO Jack Dorsey agreed that more transparency and accountability are needed. Nevertheless, Dorsey, who did not even mention Section 230 in his written testimony, cautioned against the risk of government oversight dictating what companies should or should not publish.

“We may end up with a service that couldn’t be used to question the government,” he said. 

Zuckerberg added, however, that the liability shield should not be removed for smaller platforms trying to compete, since they lack the scale these large tech giants have to build out the systems needed to monitor content.

Dorsey agreed that smaller companies would have a harder time reckoning with regulations that place “enormous resource requirements on businesses and services, which would further entrench those who are able to afford it.”

But lawmakers questioned how capable these tech giants actually are in policing themselves, given how much harmful content still circulates.

Earlier this year, a YouTube video showcasing a homemade baby formula recipe led to the hospitalization of two infants in Delaware, charged Rep. Lisa Blunt Rochester, D-Del. Despite FDA advisories against homemade formula, the video was posted, and one of the babies was left with brain damage after suffering cardiac arrest.

Alongside other examples that came up during the hearing, she said, this showcased how “we should be concerned by all of your abilities to adequately, and just as importantly, rapidly moderate content.” 

“In some of these cases, we’re talking life and death,” Blunt Rochester said.
