Charting a New Domain: The Importance of Tackling Digital Sexual Violence
Note: Brittany Peng was a part of the National Women’s Law Center’s Gender Justice Youth Council and the views expressed below belong solely to the author and are not necessarily representative of NWLC. The author would like to give special thanks to Omny Miranda Martone, Founder & CEO of the Sexual Violence Prevention Association (SVPA), for contributing insights and references discussed throughout this blog post.
Within the last 30 years, our world has transformed.
The Internet of Things. Social media. Generative AI. We are more connected than ever.
However, with connection comes consequences. According to a 2021 report from the Institute of Development Studies, 16% to 58% of women and girls worldwide have experienced online gender-based violence. Younger women in Generation Z (born between 1997 and 2012) and Millennial (born between 1981 and 1996) generations are particularly impacted. Technology-facilitated gender-based violence (TFGBV), as officially termed by UN Women, a program at the UN focused on gender justice, includes online sexual harassment, sexual exploitation, sextortion, image-based abuse (also called nonconsensual intimate image (NCII) abuse), revenge porn, deepfake pornography abuse, stalking, and doxing, among other acts.
Perpetrators of TFGBV can include individuals—like intimate partners and other people we know—or groups of people who may be individually unknown to the victim. Nowadays, spoofed phone numbers, encrypted IP addresses, and fake profiles make it increasingly difficult to identify abusers. Yet, with just one click, a person's privacy can be completely violated and their safety utterly compromised.
In the past few years, perpetrators of TFGBV have increasingly used nonconsensual "deepfake" images to harass, humiliate, and abuse their victims. In one survey, 1 in 8 young people aged 13 to 20 reported that they "personally know someone" who was the target of a deepfake, and 1 in 17 reported that they had been targets themselves. The harm caused by this type of abuse can be long-lasting and devastating.
In response to this growing threat, Congress has moved to create protections and legal recourse for victims and survivors who are depicted in nonconsensual, sexually explicit deepfake images.
A bipartisan group of lawmakers is taking action to help ensure that perpetrators of this type of violence are held accountable. This May, these lawmakers introduced the DEFIANCE Act, a bill that would strengthen protections against nonconsensual, sexually explicit "deepfake" images and videos. Current federal law already provides protections against the nonconsensual disclosure of intimate images. The DEFIANCE Act would extend these protections to "digital forgeries": sexually explicit images created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means. Digital forgeries could include deepfake pornography, obscene audio spliced over a video, or photoshopped pictures.
The Act would provide a civil right of action so victims and survivors could seek recourse against perpetrators who create, distribute, publish, solicit, or knowingly possess digital forgeries. If successful, a court could award a victim monetary damages and other relief. The court could also order the deletion or destruction of the digital forgery.
Earlier this year, Congress passed the Take It Down Act, which criminalizes the publication of nonconsensual intimate imagery, including deepfakes. As part of the new law, websites are required to take down these images within 48 hours of receiving notice from a victim. The Take It Down Act, however, does not provide victims with the full range of relief included in the DEFIANCE Act.
While Congress must act to ensure that survivors of TFGBV have multiple avenues for accessing justice, more must be done to prevent the circulation of deepfakes in the first instance, and international cooperation is necessary to help ensure broad accountability.
Technology-facilitated gender-based violence is a critical issue that demands comprehensive solutions for victims and survivors. The Sexual Violence Prevention Association notes that as of 2023, over 98% of deepfakes on the internet were pornography and that the number of deepfakes online is doubling every six months. Bipartisan action to create pathways for victims and survivors to seek justice is both necessary and long overdue.