Facebook Official Defends Deepfake Policy Before House Panel

January 9, 2020 by Kate Michael
Paul Scharre views a manipulated video made by BuzzFeed with filmmaker Jordan Peele (R on screen) in his offices in Washington, D.C., on Jan. 25, 2019. The video used readily available software and applications to change what was said by former President Barack Obama, illustrating how deepfake technology can deceive viewers. (Robert Lever/AFP via Getty Images/TNS)

WASHINGTON – A Facebook official told a House panel on Wednesday that the social media giant is “working proactively to remove harmful content,” but critics maintained the company has a “trust problem” that warrants greater government scrutiny.

The showdown between advocates of free speech and proponents of greater government regulation took place Wednesday morning during a hearing of the House Subcommittee on Consumer Protection and Commerce.

Entitled “Americans at Risk: Manipulation and Deception in the Digital Age,” the hearing’s main draw was the testimony of Monika Bickert, Facebook’s vice president of global policy management.

Facebook recently unveiled a new policy prohibiting misleading and manipulated media, otherwise known as deepfakes.

Under this policy, the company has vowed to “remove videos that have been edited or synthesized using artificial intelligence or deep learning techniques in ways that are not apparent to an average person and that would mislead an average person to believe that the subject of the video said words that they did not say,” Bickert said. 

Yet questions on the policy, its extent, and its enforcement remain.

“Facebook has a trust problem,” asserted Justin Hurwitz, an associate professor of law at the University of Nebraska, at the hearing.

Also testifying Wednesday were Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, and Tristan Harris, executive director of the Center for Humane Technology.

Lawmakers expressed concern not only over digital misinformation, but about general deception and abuse. Subcommittee Chair Rep. Jan Schakowsky, D-Ill., suggested that over the last decade, the government’s “laissez-faire” approach to online data and digital manipulation has been “wholly inadequate,” pointing to a recent so-called “cheapfake” video of House Speaker Nancy Pelosi that went viral on Facebook.

A cheapfake is genuine audio or video altered with simple editing techniques, whereas a deepfake is synthesized using artificial intelligence. Both can be made with readily available technology, and both are part of the “emerging economy of misinformation,” according to Donovan, who warned of an impending “crisis of counterfeits.”

“If you can make it a trend, you can make it true,” said Harris, and “by not acting, we are subsidizing societal self-destruction.”

Facebook is “working proactively to remove harmful content,” argued Bickert, explaining that the company relies on both its Community Standards enforcement and partnerships with third-party fact-checkers to do so.

Manipulated media violates Facebook’s Community Standards and is eligible to be both fact-checked and removed. An individual poster can appeal that determination and has the opportunity to amend the content, she explained.

“Speech, even when inaccurate, is protected,” said Rep. Greg Walden, R-Ore. 

However, he appeared open to revision of Section 230 of the Communications Decency Act, which has been criticized for acting as a legal liability shield for third-party content on social networking sites like Facebook.

Modifying Section 230 could address concerns about the targeting of political misinformation by race or geographic location, a worry heightened by the upcoming campaign season.

Bickert pointed to training and tools provided by Facebook to help those most at risk realize when they are being targeted, as well as Facebook’s efforts to remove those networks and expose them publicly when they are identified.  

But the question of whether it is an individual’s responsibility to educate themselves or the government’s responsibility to protect people from false information remains unresolved.

Hurwitz cautioned that efforts to regulate “dark patterns” and corrupt persuasive design would likely have repercussions. He suggested that “the worst of this” already falls within the existing authority of the FTC. “We already have an agency that has power over this. Let’s see what it is capable of.”

Bickert said she welcomed the opportunity for Facebook to work with other companies to develop consistent industry standards for self-regulating online content.

She called attention to Facebook’s publicly available biannual reports, which track media manipulation abuses on the platform by volume and type. Those numbers, she said, have been trending in a favorable direction.
