Creating sexually explicit “deepfake” images to become a crime in the UK
The creation of sexually explicit “deepfake” images is to be made a criminal offence in England and Wales.
Deepfakes are photos or videos created using artificial intelligence (AI) to manipulate someone's face or body.
In recent years, many have superimposed the faces of celebrities onto pornographic films or images, as happened to Taylor Swift earlier this year.
Now, the government says a new law will be created under which anyone making sexually explicit images of an adult without their consent will face a criminal record and an unlimited fine. The Ministry of Justice said this will apply regardless of whether the creator of the image intended to share it.
The sharing of deepfakes was already made illegal under the Online Safety Act, which was passed last year.
The new law will make it an offence for someone to create a sexually explicit deepfake – even if they have no intention of sharing it but “purely want to cause alarm, humiliation, or distress to the victim”, the Ministry of Justice said (via BBC).
The law will be introduced as an amendment to the Criminal Justice Bill, which is currently making its way through Parliament.
Minister for Victims and Safeguarding Laura Farris said it would send a “crystal clear message that making this material is immoral, often misogynistic, and a crime.”
“The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared,” she added.
“It is another example of ways in which certain people seek to degrade and dehumanise others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This Government will not tolerate it.”
Following what happened to Swift earlier this year, numerous US politicians said there was a need for legislation to catch up with advancements in AI technology.
There are currently no federal laws in the US prohibiting the sharing or creation of deepfakes, but some states have made progress in creating legislation to tackle the issue.
US Representative Joe Morelle called the spread of the pictures “appalling” and encouraged urgent action to be taken.
He said the images and videos “can cause irrevocable emotional, financial, and reputational harm – and unfortunately, women are disproportionately impacted”. Morelle had also been involved with the proposed Preventing Deepfakes of Intimate Images Act, which would have made it illegal to share deepfake pornography without consent.
“What’s happened to Taylor Swift is nothing new,” Democratic Rep Yvette D Clarke posted on X, saying that women had been targeted by the technology “for years”, and that with “advancements in AI, creating deepfakes is easier & cheaper”.
Republican Congressman Tom Kean Jr agreed, saying that it is “clear that AI technology is advancing faster than the necessary guardrails”.
“Whether the victim is Taylor Swift or any young person across our country, we need to establish safeguards to combat this alarming trend,” he added.
The CEO of Microsoft also spoke out against the deepfakes of Swift.
Satya Nadella said in an interview with NBC News: “First of all, absolutely this is alarming and terrible, and so therefore yes, we have to act, and quite frankly all of us in the tech platform, irrespective of what your standing on any particular issue is — I think we all benefit when the online world is a safe world.”
“I don’t think anyone would want an online world that is completely not safe for both for content creators and content consumers. So therefore I think it behooves us to move fast on this.”
Social media platform X temporarily blocked searches for Swift‘s name after the images went viral.
The graphic AI images of Swift were also reportedly linked to a 4chan chatroom challenge.
Elizabeth Aubrey, NME