The world of AI has taken a dangerous turn, and it demands serious attention. People are being exploited through deepfake pornography.
Deepfake porn is nonconsensual pornography created by using editing tools and AI programs to alter photos of others. It’s an issue that hurts real people and can happen to anyone.
Singer-songwriter Taylor Swift became a victim after deepfaked nude images of her went viral on X, and news coverage of deepfake porn has increased since. Podcaster Bobbi Althoff became a victim as well when deepfaked nudes of her went viral on X.
Now, lawmakers are proposing legislation, the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE), which would allow victims to sue deepfake creators.
The introduction of this bill is a huge step toward protecting people from deepfake porn. The legislation, however, should include jail time, not just financial penalties.
Deepfake porn is disturbingly accessible: it is only a Google search away. It’s a scary reality, and the fact that the images are AI-generated and fake doesn’t negate the harm they can cause.
David Gunkel, an NIU media studies professor, explained that this is difficult to prevent because images of people are so easy to obtain.
“It’s really quite difficult. I mean, anything that, because we all live on social media, we post images of ourselves constantly on social media, to share with our friends and our family,” Gunkel said. “Everyone who has content online is conceivably exposed to the reuse of their images for these kinds of product. And the only way to really steer clear of it is not ever upload any images of yourself to the internet.”
Because this content lives online, it is nearly impossible to eradicate completely. Non-consensual pornography makes up 96% of deepfake videos, according to a 2019 study by Deeptrace, a cybersecurity company.
Deepfake porn also disproportionately affects women: 99% of those targeted are women, according to Home Security Heroes.
This horrifying statistic should encourage us to speak out against deepfake porn. There is no place for it on the internet.
Deepfake porn is a complete invasion of privacy and can ruin people’s lives. It can surface when employers and universities run background searches, according to the National Sexual Violence Resource Center. Victims can experience anxiety, PTSD, depression and other trauma.
Deepfake porn is a scary trend that needs to stop. Hopefully, laws such as the DEFIANCE Act of 2024 can help victims and curb the production of deepfake pornography.