It would make as much sense, with all the strangling of CGI baboons...
Most of the laws that prohibit the creation and distribution of child sexual abuse imagery have been in place since the 1990s. Back then, Photoshop was in its infancy. The physical photographs that paedophiles shared were no less vile, but they were easier for the police to seize and destroy. Since then, technology has completely changed the way these people operate, making it possible to create and distribute horrifyingly life-like images and videos of children in seconds.
Yes, 'life-like'. It's not real. But maybe there are cases of real harm. Give us a 'for instance', will you?
A 15-year-old girl rang the NSPCC recently. An online stranger had edited photos from her social media to make fake nude images. The images showed her face and, in the background, you could see her bedroom. The girl was terrified that someone would send them to her parents and, worse still, the pictures were so convincing that she was scared her parents wouldn’t believe that they were fake.
And will this go down as 'a sexual assault of a child' in official government figures? I bet it will. Making the figures utterly worthless, since no-one's actually been assaulted.
That is why we are taking urgent action through a raft of new offences that will finally close the legal loopholes that paedophiles are exploiting to ruin young lives. If you are found in possession of a “paedophile manual”, you will now face years in jail. For the first time, we’ll imprison the people who are making the AI models that generate child sexual abuse material. Those who run or moderate websites where paedophiles share advice on how to groom children and avoid detection will spend a decade behind bars.
And where are you going to build all the extra jail space that will be needed, since prisons are already at bursting point? Or will you simply let out genuine abusers to put away people getting naughty with pixels?
I'm torn on this one. The crime is sexual abuse of children, but no children are harmed by AI images. So giving these perverts a way to indulge their hobby must prevent real abuse. I'm no expert on child molesters, so I may be off base here and am willing to listen to experts. Isn't it the same principle as allowing druggies drugs that are legal and controlled?
Now, the image mods of a real person: that is already a crime and should be prosecuted as such.
This is the first I've heard of a pedo manual. What with websites selling images and contact sites where they can share their hobbies, it sounds like it is much, much worse than I thought. Several decades ago it was illegal to be queer, so they went underground except for a few. Now it is accepted, and if you don't accept it you are a criminal. It won't be long before it is the same with real child porn. It's already started in places like Rotherham, where kids who reported it were fobbed off by our government. So this clampdown on artificial child porn makes no sense. Unless it is whitey who uses the artificial stuff whilst others use the real ones with little punishment.
Our government really is despicable.
If this does become law, it won't be long before Tommy Robinson is caught with something on his computer, and people like Farage will just shrug. Tommy is a bad boy already.
Exactly! It's the very epitome of a non-crime.
Not everyone will be sent to prison. Nice nonces like those the judge saw on the TV will be okay, but if you are Far-Right you are in there for a long stretch.
Good point! It's not like we haven't seen it happen in real time, is it?
Making fake nude images of real people can still cause harm if they're shared publicly. Fake images of fake people, though: there's an argument that this could prevent actual people being abused.
Indeed!
Perhaps we could create some fake images of child molesters being castrated with a chainsaw! Hey, it's fake, right? So no harm done!
😏
Usual government distraction stuff. Presumably they hope everyone will start believing they are really tough on child sexual abuse and forget all about the total inactivity around grooming/rape/prostitution gangs. Sadly the compliant media will probably let them get away with it.
And if that doesn't work, they can fall back on Operation Remove Points From Kitchen Knives! What an utter shambles they've turned out to be...
Do the PTB not even understand that every new smartphone and every new PC already comes equipped with these "tools"? Or that the "tools" are available online for anyone to use?
No, I suppose they don't; complete and utter ignorance is part of their personal "toolkit", after all.
Probably not, since all they seem to use their smartphones for is running WhatsApp groups that get them sacked! And they probably leave running a PC to their SPAD or teenage daughter.
To back up what Bucko has suggested:
https://www.researchgate.net/publication/49644341_Pornography_and_Sex_Crimes_in_the_Czech_Republic
The abstract says "prolonged interval during which possession of child pornography was not illegal and, like those other countries, showed a significant decrease in the incidence of child sex abuse." That's the outcome we want, right? An outcome where more bad people go to jail but there is more abuse is desired only by the spastic faction of the HateUKChildren party. IMV, of course.
Spot on!