AI-generated child sex imagery has every US attorney general calling for action: “A race against time to protect the children of our country from the dangers of AI.”
How about addressing my points instead of the ad hominem attacks?
Like I said: “I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful”
If what you’re saying here is actually true then the type of evidence I mentioned would exist. I kind of doubt it works that way though. If you stop “feeding” being straight, gay, whatever, does it just go away and you no longer have those sexual desires? I doubt it.
Much as we might hate it that some people do have those urges, it’s the reality. Pretending reality doesn’t exist usually doesn’t work out well.
I never said any such thing. Also, in this case, we’re also talking about images that resemble children, not actual children.
It should be very clear to anyone reading that I’m not defending any kind of abuse. A knee-jerk emotional response here could easily increase the chances that children are abused. Or we could give up our rights “for the children” in a way that doesn’t actually help them at all. Those are the things I’m not in favor of.
I’m not the guy you’re replying to, but I will say this is a topic that is never going to see a good consensus, because there are two questions of morality at play, which under normal circumstances are completely agreeable. However, when placed into this context, they collide.
1. Pornography depicting underage persons is reprehensible and should not exist.
2. The production and related abuse of children should absolutely be stopped.
To allow AI child porn is to say that, to some extent, we allow the material to exist, even if it depicts an approximation of a person (whether real or not), at the potential gain of undermining the industry producing the real thing. To make it illegal is to agree with the consensus that it shouldn’t exist, but it maintains the status quo for issue #2 and, in theory, causes more real children to be harmed.
Of course, the argument here goes much deeper than that. If you try to dig into it mentally, you end up going into recursive branches that lead in both directions. I’m not trying to dive into that rabbit hole here, but I simply wanted to illustrate the moral dilemma of it.
So should we ban books like Lolita, since they can be interpreted as porn, or is it only visual material that should be banned? If books are okay, is an image of stick figures with a sign saying “child” okay? How much detail does a visual image need before it gets banned?
How about 1000-year-old dragons in a child’s body? How about images of porn stars with very petite bodies?
That is addressing your point. These people need to get psychological help.
The harms conversion therapy inflicts on gay and straight people outweigh the harms of allowing them to exist as they are. The same is not true of pedophilia. Though it is interesting if you do see these as the same: are you for persecuting gay or straight people the way you are pedophiles, or are you in favour of pedophiles being able to act on their desires?
It is the reality, and pretending people will just safely keep their desires to themselves has proven to not work.
I never said you said it, but it is the result of what you’re saying.
Since you’re drawing this distinction from words you decided were put in your mouth (they weren’t), would you say “it’s okay to beat off to children who may not exist”?
You’re outwardly expressing pedophile apologia.
What rights are you giving up?
Removed by mod
Psychologists.
There’s no evidence that CSAM, real or virtual, helps reduce rates of child predation.
Removed by mod
I’d love if you could cite your evidence.
I assume it increases it then since you’re so opposed to it
I never said I had evidence. I specifically said there was no evidence. The claim presented is that it’s beneficial, and the burden of proof lies with the claim.
That’s not claiming it is beneficial. It’s entertaining the idea that it might be.
Then that can be decided by psychologists. It’s funny you keep insisting on calling it “CG porn” though when it’s abjectly and legally child pornography.
I have not once called it CG porn.
There’s a disgusting number of people on this site that I’ve seen defending ai pedos. I honestly don’t understand where it comes from. Some people cannot and should not be helped as their views are incompatible with society.
Not to mention that AI pedophilia could simply be creating a massive stepping stone to the real thing. I’ve also seen a number of people on Lemmy defend possessing CSAM, saying they didn’t produce it and therefore aren’t the criminal. It’s pure insanity. I’m incredibly liberal and progressive, and even I know that’s a slope I don’t wish society to slip down; it’s not worth the risk to innocent children caught in the crossfire.
Removed by mod
Half the answers in this cursed thread. Like wtf is this thread.
Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI will worsen the ADDICTION!
Removed by mod
What, you don’t have any more arguments, so you resort to calling it stupid and insulting me?
Yours seems to be the one.
I can’t say I entirely agree. I do think that they should be helped, but in a measured and rigorous way. None of this “let them find shit online that quells their needs”. Pedophilia, in the psychological profession, is viewed in a similar light to sexual orientations; on that point, the person I’m responding to is correct. It’s simply that they seem to be blind to any nuance beyond that stance, and that’s where they’re stuck.
AI pedophilia is certainly a very risky thing for us to simply accept when we don’t even have any data on how consumption of real or virtual CSAM affects those who indulge in it, and getting that data would require very unethical and likely illegal research as far as I can tell. The approach Kerfuffle@shi.tjust.works is suggesting is, in the most generous light, naive and myopic; that’s how I’m choosing to take it, so as not to accuse them of something they may not be guilty of.
I’m also someone who’s extremely progressive, and while I can sympathize with people who have these urges and no true wish to act on them, I think it’s outright malicious to say that the solution is to simply allow them to exist with informal self-treatments based on online “common sense” idealism. Mental health support should absolutely be available and encouraged; part of that is making sure people are safe to disclose this stuff to medical professionals, but no part of that is just having this shit freely spread online.
I appreciate your measured and metered response. I think these are extremely tricky conversations to have, but important, especially with how technology is progressing.
The problem is that the technology is progressing so rapidly, without any checks or balances, that our reaction for the time being should simply be one that enables further research without allowing others to create this material. This isn’t to say we should stop advancements, but we should take measured responses and use the input of psychologists to better understand the repercussions. It’s the same as if someone could use AI-generated gore to make videos of themselves killing someone they have always wanted to kill. It’s something that needs to be evaluated before we just release this stuff into the world, specifically before the technology gets even better and more realistic. That blurring of reality and fiction could be a path we as a society are not prepared for.
My worry is that people with backgrounds in computers are making decisions around things that impact human brains.
I agree completely. Unfortunately techbros have been making important world-changing decisions for two decades now and our legislators seem mostly fine to let them continue unabated.