Paedo who used AI to make vile child sex abuse pictures is jailed for 18 years in ‘deeply horrifying’ landmark case
A PAEDO who made thousands of pounds by creating child abuse images using AI and real pictures has been jailed for 18 years.
Hugh Nelson used a 3D character generator to transform normal, non-explicit pictures of children into child abuse images.
The 27-year-old sold them on an internet forum where he charged paedophiles £80 for a new "character".
After that, he charged £10 per image to alter them into different explicit positions.
Over 18 months, Nelson, from Bolton, made around £5,000.
Jeanette Smith, a specialist prosecutor for the CPS, said: "This is one of the first cases of its kind that demonstrates a link between people like Nelson, who are creating computer-generated images using technology, and the real-life offending that goes on behind that."
While paedophiles have been jailed for using AI to create child abuse images, this is the first time cops have been able to link the images to real children.
Police arrested Nelson at his family home in Egerton in June last year.
He admitted to officers he had a sexual interest mainly in girls aged about 12.
He said on video: "I would say I am sexually attracted to some kids, but I've been completely swept up in it.
"It's taken over my life, sort of thing. I can't remember how I got on that though.
"Real things like that, not long. I couldn't say exactly, a few months maybe. But the 3D images have been, I don't know, about - longer than that.
"I just kind of like fell into this pit of despair and absolute grotesque behaviour, and it just spiralled and spiralled and got worse."
Nelson added: "There's this programme, which is what I've used to create these images.
"I've probably been doing it for about two years now. And I could probably say that they have got worse in nature as I've continued with them.
"It's sick how much it affects your mind, especially when you have no job, you sit at home, you play games, you watch porn and you make theses stupid goddam images.
"My mind is very corrupted and warped.
"It can just be images of them posing, fully clothed, to hardcore rape images. So everything really.
"The images that I sent, I'm not sure you would classify them as any restricted image because they're not sexual in nature at all.
"They're just used as reference to create a 3D model from them."
In August, Nelson pleaded guilty at Bolton Crown Court to 11 offences, including three counts of encouraging the rape of a child under 13, one count of attempting to incite a boy under 16 to engage in a sexual act, three counts each of the distribution and making of indecent images, and one count of possessing prohibited images.
At an earlier court appearance he also admitted to four counts of distributing indecent pseudo photographs of children and one of publishing an obscene article.
The children whose pictures were sent to Nelson were all from France, Italy and the United States.
The relevant authorities in those countries have been made aware and more arrests have been made.
Detective Constable Carly Baines, from Greater Manchester Police, said the case was "deeply horrifying".
She said: "It became clear to us after extensive trawls of his many devices by digital forensic experts however, that his behaviour went far beyond what clearly he was seeing as a 'business opportunity'.
"Not only was he creating and selling these images, but he was engaging in depraved sexualised chat online about children and going as far as to encourage people interested in his online content to commit contact offences such as rape against children they knew or were related to."
Experts at the Internet Watch Foundation in Cambridge say they have found more child abuse pictures made using AI in the last six months than they did across the whole of last year.
Dan Sexton, the charity's chief technical officer, said: "Our work has always been difficult anyway.
"[But] we've never had to deal with the possibility that someone could download some software on their computer and create an infinite amount of new images.
"They use as many as they can until the hard drives fill up. That's a new type of harm that which we have not been prepared for."