Facebook Official Defends Deepfake Policy Before House Panel
WASHINGTON – A Facebook official told a House panel on Wednesday that the social media giant is “working proactively to remove harmful content,” but critics maintained the company has a “trust problem” that warrants greater government scrutiny.
The showdown between advocates of freedom of speech and greater government regulation took place Wednesday morning during a hearing of the House Subcommittee on Consumer Protection and Commerce.
Entitled “Americans at Risk: Manipulation and Deception in the Digital Age,” the hearing’s main draw was the testimony of Monika Bickert, Facebook’s vice president of global policy management.
Facebook recently unveiled a new policy prohibiting misleading manipulated media, otherwise known as deepfakes.
Under this policy, the company has vowed to “remove videos that have been edited or synthesized using artificial intelligence or deep learning techniques in ways that are not apparent to an average person and that would mislead an average person to believe that the subject of the video said words that they did not say,” Bickert said.
Yet questions on the policy, its extent, and its enforcement remain.
“Facebook has a trust problem,” asserted Justin Hurwitz, an associate professor of law at the University of Nebraska, at the hearing.
Also testifying Wednesday were Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy, and Tristan Harris, executive director of the Center for Humane Technology.
Lawmakers expressed concern not only over digital misinformation, but also over broader deception and abuse. Subcommittee Chair Rep. Jan Schakowsky, D-Ill., suggested that over the last decade, the government’s “laissez-faire” approach to online data and digital manipulation has been “wholly inadequate,” pointing to a recent so-called “cheapfake” video of Nancy Pelosi that went viral on Facebook.
A cheapfake is an altered audio or video, whereas a deepfake is entirely fabricated. Both deep- and cheapfakes can be made with simple, readily available technology, and both are part of the “emerging economy of misinformation,” according to Donovan, who warned of an impending “crisis of counterfeits.”
“If you can make it a trend, you can make it true,” said Harris, and “by not acting, we are subsidizing societal self-destruction.”
Facebook is “working proactively to remove harmful content,” argued Bickert, explaining how it uses both its community standards practices and a relationship with third-party fact-checkers to accomplish this.
Manipulated media violates Facebook’s Community Standards policies and is eligible to be both fact-checked and removed. An individual poster can protest this determination, and has the opportunity to amend the content, she explained.
“Speech, even when inaccurate, is protected,” said Rep. Greg Walden, R-Ore.
However, he appeared open to revision of Section 230 of the Communications Decency Act, which has been criticized for acting as a legal liability shield for third-party content on social networking sites like Facebook.
Modifying Section 230 could address concerns about the targeting of users by race or region to spread political misinformation, a practice lawmakers called particularly worrisome given the upcoming campaign season.
Bickert pointed to training and tools provided by Facebook to help those most at risk realize when they are being targeted, as well as Facebook’s efforts to remove those networks and expose them publicly when they are identified.
But the question of whether it is an individual’s responsibility to educate themselves or the government’s responsibility to protect people from false information remains unresolved.
Hurwitz cautioned that efforts to regulate “dark patterns” and corrupt persuasive design would likely have repercussions. He suggested that “the worst of this” already falls within the existing authority of the FTC. “We already have an agency that has power over this. Let’s see what it is capable of.”
Bickert said she welcomed the opportunity for Facebook to work with the rest of the industry on consistent standards for self-regulating online content.
She called attention to publicly available twice-yearly reports detailing media manipulation abuses on Facebook by amount and type. And, she said, Facebook’s numbers have been trending in a favorable direction.