The U.K.’s Ministry of Justice has announced that creating sexually explicit “deepfake” images will be made a criminal offense in England and Wales under a new law.
Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and an unlimited fine. If the image is then shared more widely, the creator could also face jail.
The new law means creating a sexually explicit deepfake will be a criminal offense even if there is no intention to distribute it. Simply creating such content with the aim of causing alarm, humiliation, or distress to the victim will constitute a criminal act.
🚫Sexually explicit deepfakes🚫
We’re creating a new offence so anyone deliberately making a sexually explicit ‘deepfake’ image or video of another person without consent can be prosecuted.
If these horrific images are shared, offenders could face jail. https://t.co/TBX7z0Epsu pic.twitter.com/GtvK3uw6cr
— Ministry of Justice (@MoJGovUK) April 16, 2024
Deepfakes use AI and machine learning technologies to produce convincing and realistic videos, images, audio, and text depicting events that never occurred. A swathe of sexually explicit deepfake images of celebrities such as Scarlett Johansson and Taylor Swift has already circulated online.
In January, the country’s Online Safety Act made it illegal to share AI-generated intimate images without consent. The Act also introduced further regulations against sharing, and threatening to share, intimate images without consent. The new offense will be established by an amendment to the Criminal Justice Bill, which is currently progressing through Parliament.
Degrading. Violating. Dehumanising.
Sexually explicit ‘deepfakes’ – made without the consent of the victim – cause real-life trauma.
Presenter and campaigner @_JessicaDavies explains why a new law change criminalising their creation is so important. https://t.co/TBX7z0Epsu pic.twitter.com/lH78HYwVqg
— Ministry of Justice (@MoJGovUK) April 16, 2024
“The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared,” said Laura Farris, the Minister for Victims and Safeguarding.
“It is another example of ways in which certain people seek to degrade and dehumanize others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This government will not tolerate it.
“This new offense sends a crystal clear message that making this material is immoral, often misogynistic and a crime,” she added.
How common are illegal deepfakes in the UK?
According to a Channel 4 News investigation, nearly 4,000 famous individuals were found to be victims of deepfake pornography on the most-visited deepfake websites.
In the first three quarters of 2023 alone, 143,733 new deepfake porn videos were uploaded online – more than in all previous years combined.
Channel 4 News presenter Cathy Newman was among the victims. In her report, she responded to the video of her: “It feels like a violation. It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.”
Malicious online deepfakes can have a real-world impact, as @MissCallyJane knows.
That’s why we’re cracking down on those who create sexually explicit deepfake images without consent – with offenders facing a criminal record and an unlimited fine.
More: https://t.co/TBX7z0Epsu pic.twitter.com/dhMv7eww7v
— Ministry of Justice (@MoJGovUK) April 16, 2024
Cally Jane Beech, a former Love Island contestant and campaigner against the proliferation of deepfakes, welcomed the move. She said: “What I endured went beyond embarrassment or inconvenience. Too many women continue to have their privacy, dignity, and identity compromised by malicious individuals in this way and it has to stop.”