In April 2024, the government introduced new rules aimed at regulating the rise of Artificial Intelligence-powered ‘deepfake’ platforms. Under the new laws, it is now a criminal offence to create intimate or sexually explicit images of adults without their consent, and it remains a criminal offence to use AI to create, possess or distribute indecent images of children.
What are ‘nudify apps’?
‘Nudify’ apps and websites are online platforms that allow users to generate nude images. They typically work by taking an existing image of a person, which is then manipulated by AI technology to make the subject appear undressed or in an otherwise compromising, intimate or explicit pose. Realistic images or videos generated using AI in this way are called ‘deepfakes.’
Whilst mainstream AI services usually operate extensive detection safeguards that prevent users from creating explicit or intimate images, many other platforms, accessible on both the open web and the dark web, lack these protections. In many cases, these platforms are designed and operated purely for the creation of sexually explicit images. These are the so-called ‘nudify’ apps.
What does the law say?
Under the Data (Use and Access) Act 2025, it is a criminal offence for any individual to create sexually explicit images of another person without their consent. This applies even if the image is never shared, and even if the defendant asked another person to create the image on their behalf. These provisions strengthen the regime introduced under the Online Safety Act 2023, which replaced the ‘revenge porn’ offence in the Criminal Justice and Courts Act 2015 with new offences in the Sexual Offences Act 2003 prohibiting the sharing of sexually explicit deepfakes or their use to extort or humiliate their subjects.
Defendants convicted of creating sexually explicit images of others without their consent can be handed an unlimited fine and be placed on the sex offenders register. Those who share sexually explicit deepfakes, or use them to extort or humiliate others, can be handed a maximum prison sentence of two years. Because the creation and sharing offences are distinct, defendants can be charged with both, which can lead to a more severe sentence.
It is already a criminal offence under the Protection of Children Act 1978 to create, distribute or possess indecent images of children, including ‘pseudo-photographs’, meaning any image ‘whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.’
However, the government has said that it will strengthen this law through new provisions in the ongoing Crime and Policing Bill 2025, which will explicitly prohibit the creation, possession and distribution of AI-generated child sexual abuse material. Under these proposals, existing child sexual abuse statutes will be amended, with offenders facing a maximum of five years’ imprisonment for indecent images offences involving AI.
Who is at risk?
As well as the genuine victims who suffer distress and extortion as a result of sexually explicit deepfakes, there are those who inadvertently commit these new offences or who are falsely accused of them.
PCD Solicitors have represented several clients charged with the creation of proscribed deepfake images. In one instance, a client was charged after producing several deepfake images of their partner with the partner’s consent; when the relationship broke down, the partner reported them to the police.
In other cases, our firm has supported children under the age of 18 who have been investigated for using deepfake technology to create explicit images of their classmates, friends and teachers. Whilst these images were created naively, out of sexual curiosity or as a joke, such actions are nonetheless prohibited by law.
PCD Solicitors are a nationwide criminal defence firm specialising in defending clients against false allegations of sexual offences and in appealing wrongful convictions. If you or a relative has been accused of a sexual offence involving ‘nudify’ apps or deepfake technology, you can contact our team here.