Children creating AI nudes and sharing them with friends

Artificial Intelligence apps can create lifelike pornographic images - TON PHOTOGRAPH/ISTOCKPHOTO

Schoolchildren are downloading AI apps specifically designed to create nudes, a government report has warned.

They are also sharing the images illegally with their friends, experts say, while organised crime gangs are using the apps to blackmail children with computer-made naked images.

The report, published by the Department for Science, Innovation and Technology (DSIT), was written by the UK Council for Internet Safety, whose members include tech giants, charities, government departments and regulators. It advises teachers on how to deal with pupils sharing nudes or semi-nudes.

The document was recently updated to include AI-generated images, deepfakes and the topic of “sextortion”.

Education workers are told to deal with AI-made images in the same way as real nudes: do not view them, do not delete them, call the police, and do not immediately inform parents.

The guidance has been welcomed by campaigners and experts, but there are calls for Ofcom, which enforces the Online Safety Act 2023, to be more “proactive and comprehensive” in cracking down on AI nudes.

The Government is also being encouraged to change the law so that it is illegal to make any AI nudes or deepfake porn.

Illegal images

It is an offence to share deepfake porn or AI nude images of an adult without consent, and it is also an offence to threaten to do so. However, creating AI nudes of an adult is not in itself illegal.

High-profile women, such as singer Taylor Swift, have had deepfake nudes shared online and the topic has been raised in the Commons and in the House of Lords.

However, it is unlawful to create, possess or share nude or sexual images of children, including AI-generated images, as they constitute child sexual abuse material.

The law criminalises the consensual sharing of nudes among under-18s, and it is also illegal for them to make and share AI nudes of their classmates.

“A young boy using a nudify app to create a nude is committing an offence of creating/possessing child sexual abuse imagery, even if of himself,” said Durham University’s Prof Clare McGlynn, an expert in the regulation of pornography.

‘Common in schools’

Experts are warning that the ease with which people can access AI nude-making apps is driving a rise in virtual sexual exploitation of women and girls.

Prof McGlynn said: “AI deepfake abuse is now common across schools in two main circumstances: (a) boys using nudify apps to create fake nudes of their female classmates and then trade and share them, and (b) organised scammers creating deepfake nudes and threatening to share those images unless payment is made.

“Dealing with the impact of the scammers requires greater openness about the prevalence of this abuse and that young people should not feel guilty about being victimised, and should come forward to report what has happened.

“However, many fear being blamed for, say, accepting a person as a friend online, when it turns out that that person is a scammer.”

She called for a more “comprehensive and proactive” response to the AI apps used to make these obscene images, and said Ofcom had been “weak” at combating this form of abuse.

Sextortion on the rise

The report warns that there has been a “significant increase” in sextortion in the past couple of years, fuelled by the use of AI apps.

A Government spokesman said: “Every child should be safe online. UK law is clear that it is an offence to produce, store, share or search for material that contains or depicts child sexual abuse, which includes AI-generated child sexual abuse material.

“Our trailblazing Online Safety Act has the strongest protections in place for children and has also made sharing deepfake, intimate images of another person without consent a criminal offence.

“The Act has placed groundbreaking new duties on social media platforms to stop illegal content being shared on their sites, or they risk facing fines that could reach billions of pounds.

“Building on this action, our guidance for schools sets out how they should respond to any report of child-on-child sexual violence or sexual harassment, including the circulation of indecent images, whether they are generated by artificial intelligence or not.”

Peter Kyle, Labour’s shadow science secretary, is reportedly considering banning nudification apps and deepfake technology.

Susie Hargreaves, chief executive of the Internet Watch Foundation, said: “The sophistication of AI and the ease with which it can now be used pose very real threats for children, and I am pleased to see this reflected in the guidance.

“If an image realistically depicts child sexual abuse, whether it is AI or not, it makes the internet a less safe place. That real children’s imagery is also being manipulated to create sexual abuse imagery of recognisable children is an appalling threat everyone should be aware of.

“Criminals only need a handful of non-sexual images to create lifelike sexual abuse imagery. The potential for criminals to use this imagery as a sextortion tool to blackmail children into a spiral of further abuse is a terrifying and heart-breaking prospect, and one we must all be taking seriously.”
