
CONGRESSMAN JOE MORELLE ANNOUNCES NEW SUPPORT FOR HIS LEGISLATION TO STOP DEEPFAKE PORNOGRAPHY

April 24, 2024

Preventing Deepfakes of Intimate Images Act endorsed by:
National Organization for Women, RAINN, Joyful Heart Foundation

(Washington, DC)—Today, Congressman Joe Morelle announced that a slate of national advocacy organizations has recently endorsed his legislation, the Preventing Deepfakes of Intimate Images Act, including the National Organization for Women, the Rape, Abuse & Incest National Network (RAINN), and the Joyful Heart Foundation.

The growing support comes during National Sexual Assault Awareness and Prevention Month as Congressman Morelle pushes for the legislation to be taken up for consideration by the House Judiciary Committee.

“We’ve seen the devastating impacts intimate deepfake images have had on everyone from young schoolgirls to world-famous celebrities. We have a responsibility to take decisive action that puts a stop to these heinous crimes,” said Congressman Joe Morelle. “My bipartisan legislation, the Preventing Deepfakes of Intimate Images Act, would be a critical step forward in regulating AI and ensuring there are severe consequences for those who create deepfake images. I’m proud the bill has been gaining momentum with the support of these dedicated organizations, and I look forward to working together to swiftly pass it into law.”

“The Preventing Deepfakes of Intimate Images Act will protect women from a new kind of thievery today—the theft of bodily imagery and autonomy,” said Christian F. Nunes, National Organization for Women (NOW) National President. “When this occurs to celebrities, it makes headlines; however, 99.9 percent of those affected by deepfakes do not have the influence to fight back—or laws that protect them. This urgently needed legislation will provide legal fortitude to combat these egregious crimes. NOW thanks Congressman Morelle and the increasing number of cosponsors who are working to stop the real-life mental anguish and long-term damage transpiring daily in the digital world.”

“As an organization dedicated to combating sexual assault and exploitation, RAINN enthusiastically endorses H.R. 3106 - the Preventing Deepfakes of Intimate Images Act,” said Karrie Delaney, Director of Federal Affairs, RAINN. “By criminalizing the intentional dissemination or threat of disseminating digitally manipulated sexually explicit content, this bill aligns with our mission to protect individuals from all forms of exploitation. We commend the efforts to address this pressing issue and urge its passage.”

“The non-consensual sharing of synthetic intimate images, often called deepfake pornography, is an exponentially growing form of abuse that humiliates, degrades, and threatens women and girls, causing real and lasting damage to their mental health and wellbeing. Current federal law does not prohibit this abuse, allowing these perpetrators to inflict this life-shattering violence on survivors with impunity,” said Ilse Knecht, Director of Policy and Advocacy for the Joyful Heart Foundation. “It's time for the Federal Government to pass legislation that recognizes this harm and allows survivors to seek justice. We urge Congress to quickly enact Representative Morelle’s Preventing Deepfakes of Intimate Images Act to support survivors of this alarming and widespread abuse and bring accountability to their abusers.” 

H.R. 3106, the Preventing Deepfakes of Intimate Images Act, takes much-needed action to prohibit the non-consensual disclosure of digitally altered intimate images. Notably, the legislation both makes the sharing of these images a criminal offense and creates a private right of action for victims to seek relief—a combination that serves as a powerful deterrent. The legislation is bipartisan and continues to steadily gain support in the House of Representatives, with over 55 co-sponsors.

The legislation was referred to the House Committee on the Judiciary in May of 2023 but has yet to receive formal consideration. Learn more about the Preventing Deepfakes of Intimate Images Act here.

###