The Alarming Surge of AI-Generated Deepfake Porn Raises Concerns for Privacy and Dignity

When Gabi Belle Discovered Deepfakes: A Disturbing Trend on the Rise

YouTube sensation Gabi Belle had an unsettling experience when she learned that a naked photo of her was circulating on the internet. She had never posed for such a picture; the image, which depicted her standing nude in an open field, was a clear fabrication.

When Belle, 26, reached out to a colleague for help in removing the image, she was told that nearly 100 similar fake photos of her were scattered across the web, most of them hosted on websites known for AI-generated porn. Although some of these fake photos were taken down in July, new images depicting Belle in explicit sexual scenarios had already emerged.

“I felt yucky and violated,” Belle said during an interview. “Those private parts are not meant for the world to see because I have not consented to that. So it’s really strange that someone would make images of me.”

An Unprecedented Boom in AI-Generated Fake Porn

Artificial intelligence has ushered in an alarming surge in fake pornographic images and videos. This boom can be attributed to the rise of affordable and easy-to-use AI tools that can “undress” individuals in photos by analyzing their clothed bodies and superimposing a nude form into an image. Additionally, these tools can seamlessly swap faces into explicit videos.

According to industry analyst Genevieve Oh, the top 10 websites hosting AI-generated porn photos have seen a staggering increase of over 290% in fake nudes since 2018. These sites feature not only celebrities and political figures, like New York Rep. Alexandria Ocasio-Cortez, but also ordinary teenage girls whose likenesses have been exploited by malicious actors to incite shame, extort money, or live out private fantasies.

Victims of AI-Generated Deepfakes Left with Little Recourse

Regrettably, victims have limited legal recourse in addressing this issue. Currently, there is no federal law specifically governing deepfake porn, and only a handful of states have enacted regulations. President Biden’s AI executive order, issued recently, recommends, but does not require, companies to label AI-generated photos, videos, and audio to indicate computer-generated work.

Moreover, legal scholars have raised concerns that AI-generated images may not fall under copyright protections for personal likenesses, as they draw from vast data sets populated by millions of images. “This is clearly a very serious problem,” said Tiffany Li, a law professor at the University of San Francisco.

Women and Teens at Particular Risk

The advent of AI-generated deepfake images poses a particular risk to women and teenagers, many of whom are unprepared for such visibility. A study by Sensity AI in 2019 found that 96% of deepfake images are pornography, and 99% of those photos target women. Sophie Maddocks, a researcher and digital rights advocate at the University of Pennsylvania, noted that “It’s now very much targeting girls, young girls, and women who aren’t in the public eye.”

AI Tools Enable Nonconsensual Creation of Fake Nudes

Fake nude photos, created with AI "nudifier" tools from real images, have flooded the internet in recent months. Trained on millions of images, AI software can predict how a body will look without clothing and seamlessly insert a face into a pornographic video. Open-source models such as Stable Diffusion have made their code public, allowing amateur developers to adapt the technology for potentially nefarious purposes.

Once these apps go public, they often utilize referral programs to encourage users to share AI-generated photos on social media in exchange for monetary rewards, according to Genevieve Oh.

The Expanding Realm of AI-Generated Deepfake Porn

When Oh examined the top 10 websites hosting fake porn images, she found that more than 415,000 such images had been uploaded in the current year, amassing nearly 90 million views. Additionally, AI-generated porn videos have proliferated across the web, with over 143,000 videos added in 2023. This number surpasses the total of new videos uploaded from 2016 to 2022, with these fake videos accumulating more than 4.2 billion views.

Rising Cases of Sextortion and Impacts on Victims

The Federal Bureau of Investigation warned in June about rising cases of sexual extortion, in which scammers demand payment or explicit photos in exchange for not distributing sexual images. While it remains unclear what share of these images are AI-generated, the practice is growing. As of September, more than 26,800 people had fallen victim to "sextortion" campaigns, a 149% increase from 2019, according to the FBI.

A Forum Dedicated to AI-Generated Deepfakes

In May, a poster on a popular pornography forum initiated a thread called “I can fake your crush.” The concept was straightforward: “Send me whoever you want to see nude, and I can fake them” using AI, as mentioned by the moderator. Photos of women began pouring in, as posters requested fake nude images of individuals not in the public eye, such as co-workers and neighbors.

The Role of Google and the Need for Enhanced Protections

Celebrities are a popular target for fake porn creators seeking to capitalize on the demand for nude photos of famous actors. Websites featuring famous individuals often bring an influx of other kinds of fake nudes, including "amateur" content depicting unknown people, and frequently run ads marketing AI porn-making tools.

Google has policies in place to prevent nonconsensual sexual images from appearing in search results, but its safeguards against deepfake images are less robust. Deepfake porn and tools for creating it can appear prominently in the company's search results, even when users are not specifically searching for AI-generated content.

Ned Adriance, a spokesperson for Google, stated that the company is actively working to enhance search protections. Google allows users to request the removal of involuntary fake porn.

Li, from the University of San Francisco, pointed out the difficulty of penalizing creators of this content due to Section 230 in the Communications Decency Act, which shields social media companies from liability for the content posted on their sites. Furthermore, victims may find it challenging to claim content as solely derived from their likeness since AI draws from a vast dataset of images.

Efforts to Regulate AI-Generated Images and Videos

While the push to regulate AI-generated images and videos is mainly aimed at preventing mass distribution and addressing concerns about election interference, these rules do little to protect individuals affected by deepfake porn. In the absence of federal laws, at least nine states, including California, Texas, and Virginia, have passed legislation targeting deepfakes. However, these laws vary in scope, with some allowing victims to press criminal charges, while others permit only civil lawsuits, which can be difficult to pursue.

While the legal landscape surrounding deepfake porn remains uncertain, the rise of AI-generated explicit content poses a severe risk to individuals' privacy and reputation. The lack of federal regulation and the limited legal recourse available underscore the urgent need for stronger rules and protections for victims like Gabi Belle and the countless others seeking to regain their privacy and dignity, in an era when women continue to be disproportionately affected by this problem.


The call for stronger rules and protections is growing, with victims like Gabi Belle highlighting the pressing need to address the issue of AI-generated deepfake porn. “You’re not safe as a woman,” Belle emphasized.
