Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • FourPacketsOfPeanuts@lemmy.world

    Any sexual representation of a child is illegal in the UK whether it looks real or not. In fact I believe it doesn’t even need to be a child, it’s illegal if a reasonable person would believe it was depicting a child. This came up when adults who were into age play got into trouble distributing their images because they looked convincingly underage.

    • Jake Farm@sopuli.xyz

      Wait so even if the subjects are adults in costume it’s illegal? Fuck man, school uniforms are a whole genre of porn.

    • AmidFuror@fedia.io

      And I suppose we can rely on the courts to know sexual when they see it, so people don’t get in trouble for taking pictures of cherubs at the Louvre.

      • FourPacketsOfPeanuts@lemmy.world

        Nice try lol, non-sexualised nudity is not illegal. UK law has a degree of common sense about it. A stick figure, even mildly sexualised, is unlikely to pass the test for indecency. Having said that, if someone drew some sort of extreme scenario then, I don’t know for sure, but I can imagine them getting into shit for it.

    • cygnus@lemmy.ca

      Thanks for clarifying, I didn’t know that. Seems like a bit of an overreach to me, but I suppose in this particular case it’s best to err on the side of caution.