Deepfake Porn Prompts Tech Tools and Calls for Regulations

It’s horrifyingly easy to make deepfake pornography of anyone thanks to today’s generative AI tools. A 2023 report by Home Security Heroes (a company that reviews identity-theft protection services) found that it took just one clear image of a face and less than 25 minutes to create a 60-second deepfake pornographic video—for free.

The world took notice of this new reality in January when graphic deepfake images of Taylor Swift circulated on social media platforms, with one image receiving 47 million views before it was removed. Others in the entertainment industry, most notably Korean pop stars, have also seen their images taken and misused—but so have people far from the public spotlight. There’s one thing that virtually all the victims have in common, though: According to the 2023 report, 99 percent of victims are women or girls.

This dire situation is spurring action, largely from women who are fed up. As one startup founder, Nadia Lee, puts it: “If safety tech doesn’t accelerate at the same pace as AI development, then we are screwed.” While there’s been considerable research on deepfake detectors, they struggle to keep up with deepfake generation tools. What’s more, detectors help only if a platform is interested in screening out deepfakes, and most deepfake porn is hosted on sites dedicated to that genre.

“Our generation is facing its own Oppenheimer moment,” says Lee, CEO of the Australia-based startup That’sMyFace. “We built this thing”—that is, generative AI—“and we could go this way or that way with it.” Lee’s company is first offering visual-recognition tools to corporate clients who want to be sure their logos, uniforms, or products aren’t appearing in pornography (think, for example, of airline stewardesses). But her long-term goal is to create a tool that any woman can use to scan the entire Internet for deepfake images or videos bearing her own face.
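That’sMyFace hasn’t published how its scanning works, but the core task Lee describes, spotting a known visual mark inside video frames, can be illustrated with a minimal sketch. The snippet below uses OpenCV template matching; the file names and the 0.8 match threshold are placeholder assumptions, and a production system would need detectors robust to scale, rotation, and compression rather than a fixed template.

```python
# Minimal sketch: scan video frames for a known logo with OpenCV
# template matching. File paths and the threshold are placeholders;
# That'sMyFace's actual pipeline is not public.
import cv2

logo = cv2.imread("brand_logo.png", cv2.IMREAD_GRAYSCALE)
video = cv2.VideoCapture("suspect_clip.mp4")

hits = []
frame_idx = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Normalized cross-correlation: 1.0 would be a perfect match.
    scores = cv2.matchTemplate(gray, logo, cv2.TM_CCOEFF_NORMED)
    if scores.max() > 0.8:  # arbitrary threshold, for illustration only
        hits.append(frame_idx)
    frame_idx += 1
video.release()

print(f"Logo-like matches in {len(hits)} frames: {hits[:10]}")
```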

Another startup founder had a personal reason for getting involved. Breeze Liu was herself a victim of deepfake pornography in 2020; she eventually found more than 800 links leading to the fake video. She felt humiliated, she says, and was horrified to find that she had little recourse: The police said they couldn’t do anything, and she herself had to identify all the sites where the video appeared and petition to get it taken down—appeals that were not always successful. There had to be a better way, she thought. “We need to use AI to combat AI,” she says.

Liu, who was already working in tech, founded Alecto AI, a startup named after a Greek goddess of vengeance. The app she’s building lets users deploy facial recognition to check for wrongful use of their own image across the major social media platforms (she’s not considering partnerships with porn platforms). Liu aims to partner with the social media…
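Alecto AI’s implementation isn’t public, but the face-matching step Liu describes can be sketched with the open-source face_recognition library, which compares 128-dimensional face embeddings. The file names, the candidate list, and the 0.6 distance cutoff (the library’s default tolerance) are placeholder assumptions; a real service would feed in images gathered from platform APIs rather than local files.

```python
# Minimal sketch of the face-matching step an app like Alecto AI
# might perform, using the open-source face_recognition library.
# Alecto AI's real system is not public; filenames, the candidate
# list, and the tolerance value are placeholder assumptions.
import face_recognition

# One clear reference photo supplied by the user.
reference = face_recognition.load_image_file("my_reference_photo.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

# Images gathered from social platforms (e.g., via their APIs).
candidates = ["downloaded_post_1.jpg", "downloaded_post_2.jpg"]

for path in candidates:
    image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(image):
        # Euclidean distance between 128-d face embeddings;
        # lower means more similar. 0.6 is the library default.
        distance = face_recognition.face_distance([reference_encoding], encoding)[0]
        if distance < 0.6:
            print(f"Possible match in {path} (distance {distance:.2f})")
```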

This article by Eliza Strickland was originally published on 15 July 2024 at spectrum.ieee.org.