1) An AI generates a piece of CP that is specific and indistinguishable from a real minor, then runs a cel-shader over it. Is this CP?
Ok, well, none of this is legal advice of any kind (obviously), and I recommend anyone seeking real legal advice hire an attorney and not just listen to some asshole on the internet with an Avernum profile pic. That said...
That scenario you gave is art that's created from an image of a real minor.
Generative AI models are trained on a large database of images. Some of those images may be of minors, but here's the important part: those images DO NOT EXIST inside the model. The model has merely learned what a child looks like. And probably poorly at that.
It's the difference between an artist knowing how to draw a child in general, and an artist taking a real image of a real child and tracing it.
E.g., would you try to put a Renaissance artist in prison for drawing a cherub? No? Same thing, just with an alien-headed vampire animu girl.
2) The word “actual” in CIPA refers to whether an ordinary person would believe the image depicted an “actual minor,” in the sense that the subject in the image is “indistinguishable from a minor” regardless of representational style. The law is explicit that a specific identification of a unique minor is not required. “Actual” means “perceived to be indistinguishable from actual by an ordinary person.”
Yes, "actual". As in "real". Which this is clearly not.
Again, would you mistake that image of a 3-fingered loli showing off a snatch that consists of one line down her crotch for a real child engaging in a sexual act?
If someone drew a line on a Barbie doll pelvis and told you it was a 13yo girl, would you try to get them prosecuted for child porn?
Based on both 1 & 2, the advent of AI means that an ordinary person cannot distinguish between any of the things under discussion, and as such, the mere invention of AI and its deployment to the public changed the meaning of CIPA, since the meaning is inherently subjective.
It was always subjective to some degree. IIRC, some states have thrown guys in prison for text based on obscenity laws, which are even more subjective. And they did the same to Max Hardcore, who just made porn with 18+ whores who said things like "daddy" but were clearly adults. (What prosecutor did that guy piss off, btw?)
Photoshop has existed for three decades now, and hyperrealist art has existed for longer. The idea that some bad-hands image generator changes what people can and can't distinguish is an exaggeration at the very least.
If you make something that looks like a real photo of a real child, whether you do it by hand or with a computer, then you have a problem. If you make a computer-generated image based on a photo of a real child, then you have a problem.
Until you do, though, someone is probably going to have to change the law or ignore precedent in order to prosecute someone for it. (Which, hey, they may well do these days.)