The entire concept of being sexually attracted to children is revolting, but I find the concept of criminalizing cartoons or AI generated images perplexing.
The whole legal theory behind criminalizing child porn is that the subjects are minors, can't consent, and are therefore victims. AI-generated pictures aren't people and can't be victims.
Barring evidence that viewing fake kiddie porn highly correlates with people moving on to the real stuff, I'm not seeing the societal benefit to criminalizing this. Yes, it's gross and perverted, but that's not the benchmark by which we define what is legal and what is not.
These people are wired wrong, and they can't be fixed, so I'd much rather that they sat in their house satisfying their urges by jerking off to fake kids than messing with real ones.
"I find the concept of criminalizing cartoons or AI generated images perplexing."
Criminals escalate. If you don't stop the small crimes, they graduate to larger ones.
"I'm not seeing the societal benefit to criminalizing this."
It's the same reason we're pretty harsh about abusing animals: that behavior only leads in one direction. It's not worth worrying about the people who do this and would have stopped on their own. The punishment works EITHER WAY.
"and they can't be fixed"
Yeah, but they can be so afraid of being put in prison that they keep it in their pants all the way into the grave.
"Criminals escalate. If you don't stop the small crimes, they graduate to larger ones."
So does legislation. That's usually more harm than good.
"So does legislation."
When has the legislation "escalated"? It's pretty simple: you can't touch kids inappropriately, you can't take naked pictures of them, and you can't draw and then SHARE naked pictures of them.
There's no slippery slope here. More damage was done to civil liberties after 9/11 than by any of these cases. You're clutching at straws.
AI child pornography? Imagine being so much of a predator that you have to make your own porn.
Also, of course this guy is literally the anti-right-wing stereotype lmao.