The rise of artificial intelligence technology has brought about a disturbing trend – the creation of fake nude images of real people on “nudify” sites. In a recent episode of “60 Minutes”, victims spoke out about the real harm caused by these AI-generated images.
One teenage victim shared her experience of having her face superimposed onto explicit images without her consent. She expressed the emotional distress and humiliation she felt as a result of these fake nudes circulating online. Despite reporting the images to social media platforms and law enforcement, not much has been done to stop their spread.
The implications of this technology go beyond just the victims themselves. It raises concerns about privacy, consent, and the potential for deepfake videos to be used for malicious purposes such as blackmail or revenge porn.
As the issue of fake nudes created by AI continues to grow, it is crucial for lawmakers, tech companies, and society as a whole to address and combat this harmful phenomenon. Watch the full episode of “60 Minutes” for a closer look at this disturbing trend and its impact on real people.
Watch the video by 60 Minutes
About 60 Minutes
“60 Minutes” is the most successful television broadcast in history. Offering hard-hitting investigative reports, interviews, feature segments and profiles of people in the news, the broadcast began in 1968 and is still a hit more than 50 seasons later, regularly making Nielsen’s Top 10.
Video “Fake nudes created by AI “nudify” sites are causing real harm, victims say | 60 Minutes” was uploaded on 12/16/2024 to the YouTube channel 60 Minutes.
This is sad
Lord, how much worse can this sick world get… ??
Hurry, Jesus, we are anxiously awaiting your return…
Hell, the movies, Hollywood has done this for years!!!
I’m sure ex-Rep. Matt Gaetz was quick to act on this for his State of Fl. Filled his photo album!!😳😳😳
Be trippy if AI started the site
Artificial INTELLIGENCE.. THE NEW OXYMORON
Isn't having a nude of a minor in your possession a felony?
They should arrest whoever created that AI site!!! Why do they make sites to hurt other people??
I wonder if the school officials had it happen to them, if there would have been only a 1 day suspension?
We "USA" just seem to have a new way of handling enemies, that is we sit around on our hands until something catastrophic happens. After the damage is done, we "TALK" about what we should do. This Government needs to stop EFFIN around with other countries or we aren't going to be here for much longer.
And it only gets worse when you find your face on OnlyFans accounts without your knowledge, consent, or pay.
I'm shocked, shocked that the school would keep information about this scandal secret from See B.S. It's almost like they watched the highly edited Kamela Harris interview on See BS and thought, "You know, we could just claim confidentiality or whatever and clam up about it. It worked for Kamela and See B.S., so why not us?"
5:31 wow… so… wow 😂😂😂
These tech companies should be obliged to pay millions of dollars in damages to these victims. Section 230 should be repealed, that is the only way to control this Internet chaos.
I was expecting, at least, an exposure of the people behind these sites.
"If I asked people what they wanted they would have said images of naked horses" -Henry Ford. This tech has been around for decades with photoshop or just sketching a nude on a piece of paper. Stop trying to pretend you can stop it or that is is doing any real harm. It isn't. If anything it protects your irresponsible children who are sharing real nude photos with their partners because now they can claim they are fake when their ex posts them online.
Gee, perhaps this is all a very simple reaction of how religion has broken people's minds, by treating nudity as something horrific. After all, if their supreme being created human bodies, then why did he make us so afraid of having anyone see our bodies? Makes no sense.
This is some real Pandora's box stuff that's unfolding…
We might need laws to make mass "AI nudify" services illegal.
@60minutes, this is all very reactive – from your investigation to the legal counsel you sought advice from. What your investigators should be looking for is what AI capability they are using to generate these images and stop these from ever being generated. I can't even see a world where generating nudes and sexually suggestive images is productive in any ethical society.
Oh no, people are going to lose money to computers.
How does one do dis lol
Very sad to hear!
AI misuse is only going to grow. Severe penalties and implementation of the same by law enforcement, is imperative!
Listening to how poorly the school handled this makes me mad. Mad enough to suggest that the wronged girls do the same to pics of these sick-minded boys and in fact ensure the exposed parts are not very complimentary!!
What then, boys?
An episode of FBI covered this recently
Anderson Cooper is an extreme leftist and a beta male so I don't really care what he says. Good documentary though.
I was unable to find what 60 minutes is talking about.
Once the boys turn adults, I would continuously plaster their names and this story all over social media so people know what they are and who they are.
Schools should not be responsible nor punish kids for what the kids do on their own time!
Very sad beautiful strong little girl I pray she gets peace and knows she’s beautiful and did nothing wrong
Ah, I thought the mother sounded Polish.
They were fake images so even if they were deleted new ones could just be made
GALIAAAAAAAAAAA
Yeah so disgusting and horrible the things that can happen these days.
As a graphic designer, AI art has been unbearable lately. I've been using a Chrome extension called Pre-AI Search that just filters to pre-2023 results and I can finally get actual reference designs again. Only way I can work properly now lol, not sure what that says about the future of the internet
Being a graphic designer right now is ridiculous with AI art flooding search results. Can't look up a single reference without drowning in AI-generated content. I've had to filter to pre-2023 google using this chrome extension Pre-AI Search. The fact i need to time travel to find real designs is insane 💀
But which one works best?
The original site seems unavailable, but they launched another version under a net domain
And no one is surprised
Use a.i. to reverse the image back, even though the damage is done.😢
From fables to folklores that were handed down through the centuries, humanity had sinned since the moment Adam and Eve felt ashamed of their nakedness. They started to cover their bodies with fig leaves in the hope it could hide their sin and shame. 😑
We’re currently working on taking it down
Additional comment on second watching. I would ask the unfortunate young ladies if they know who the boys are who did this. If so, pass the word around to all the young girls to not go out with them and to stay away from them. I bet later in life these young boys will be somewhat ashamed of what they have done. If there are social consequences for this, that would send a message to other boys to think twice before doing anything like this. I would urge the young women to make absolutely sure of who they are first.
It's unfair to interview only the victim. Interview the boys, the perpetrators, the miscreants instead.
You went to Argentina to knock on a door, but that's not who's receiving the payments at the bank.
Follow the money trail.