The creation of sexually explicit deepfake content is likely to become a criminal offense in England and Wales as concern grows over the use of artificial intelligence to exploit and harass women.
Under a draft law, anyone who creates such an image or video of another adult without their consent — even if they don’t intend to share it — would face a criminal record and an unlimited fine, the UK justice department announced Tuesday. It is already illegal in England and Wales to share explicit deepfakes without the subject’s consent, with perpetrators facing jail time.
Deepfakes are images or videos that have been manipulated, often using AI, to give the impression that someone has done or said something that they have not.
Laura Farris, the United Kingdom’s Minister for Victims and Safeguarding, told ITV on Tuesday that, “to the best of (her) knowledge,” England and Wales would be the first jurisdictions anywhere in the world to outlaw the creation of sexually explicit deepfakes.
Under the draft law, such content would include both pornographic images and nude deepfakes, whether or not the subject is engaging in erotic behavior.
The devolved governments of Scotland and Northern Ireland are responsible for passing relevant laws in those countries. They did not immediately respond to CNN’s question about whether they planned to introduce equivalent rules.
The new offense in England and Wales will be introduced through an amendment to the Criminal Justice Bill, legislation currently making its way through parliament. Last year, the Online Safety Act had already criminalized the sharing of deepfake sexual images in England and Wales.
The new offense applies only to images of adults because, under existing law in England and Wales, creating deepfake sexual images of minors is already a crime.
Deepfakes have been used to superimpose women’s faces, without their consent, onto sexually explicit images. Taylor Swift is perhaps the highest-profile victim of the practice. In January, such images of the singer circulated on X, attracting tens of millions of views before they were taken down.
That month, a bipartisan group of lawmakers in the United States introduced a bill that, if passed, would allow victims of sexually explicit deepfakes to sue the people who create and share such content without their consent.
A directive criminalizing the creation of sexually explicit deepfakes has also been proposed in the European Union. If it is adopted, the bloc’s 27 member states would have to transpose it into national law.
Farris, the UK minister, said in a statement that deepfakes were examples of the “ways in which certain people seek to degrade and dehumanize others — especially women.”
“This new offense sends a crystal-clear message that making this material is immoral, often misogynistic, and a crime,” she added.
The Oversight Board of Meta (META), the owner of Facebook and Instagram, said Tuesday that it would review how the company handles deepfake pornography after two explicit, AI-generated images of female public figures in the United States and India circulated on its platforms.
“Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.