The argument used to be that this content was illegal because it was produced through the victimization of a child.
Labeling it CSAM when there is no CSA is misleading.
Obviously I’m not condoning this crap; I’m more against having your sense of disgust manipulated so that the government can get a toehold on banning types of speech that it labels offensive.
That’s not even the issue; the real issue is sextortion: https://www.justice.gov/usao-sdin/pr/fbi-and-partners-issue-national-public-safety-alert-sextortion-schemes
Kids and their parents are being extorted with this crap.
That’s not the issue I was addressing.
The issue from the article that was decided on appeal was:
> While the court agreed to dismiss the possession charge against Anderegg, it declined to extend Stanley to the production of obscene AI-CSAM. Stanley, the court said, was focused on possession, with no mention of production, and the Supreme Court hasn’t seen fit to recognize protection for production of obscenity in the intervening 55 years.
In this case, the person was charged with sending generated images to a teenager online. It wasn’t an extortion case.
From the article:
> Osborne, the court said, is not on point, as this case did not involve real children; rather, it’s more like Stanley, which “relies on the importance of freedom of thought and the sanctity of the home.”
The point of contention was that a prior case, Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002), had ruled that generated images are protected under the First Amendment.
Here, the court distinguished the instant case from Free Speech Coalition by saying that the First Amendment protections applied only to possession.
The court dismissed the possession charge:
> The court agreed with Anderegg, rejecting the government’s arguments to the contrary. The government asserted that the case was more like Osborne than Stanley, that Stanley is limited to obscene material depicting adults, and that Congress has compelling interests to ban possession of obscene “virtual” CSAM – for example, its potential use to groom children and the difficulty of distinguishing “virtual” CSAM from imagery involving real children. The court rejected these arguments as inconsistent with Free Speech Coalition, where, it noted, the government had unsuccessfully raised basically the same arguments.
But, and this is the part I’m speaking of, the court declined to dismiss the production charge:
> While the court agreed to dismiss the possession charge against Anderegg, it declined to extend Stanley to the production of obscene AI-CSAM. Stanley, the court said, was focused on possession, with no mention of production, and the Supreme Court hasn’t seen fit to recognize protection for production of obscenity in the intervening 55 years. **In addition, the court declined to dismiss the Section 1466A distribution charge, as well as the charge for transferring the images to a minor.**
The last part is bolded because none of this means he’s getting away. He’s still going to trial on criminal charges.
The judge ruled that, under existing case law, possession of generated CSAM is legal but production and distribution are not.
The government’s position is what I was taking issue with. Their position is:
“Yeah, we agree that it isn’t real and isn’t produced by harming children, but it’s really hard for us to prove whether an image is real or not, so just treat generated images like they’re actually produced by harming children. It would be much easier to convict people if we can avoid having to prove that the images are actually the damaging thing we claim they are, which is the entire basis for trying to put this person in prison.”
The court ruled that they can’t charge a person with possession, but they can charge them with production instead. That completely fails to address the government’s position and only requires that they file the charges under a different statute.
With videos of real people, you can determine their age from hard factual data, like birth certificates, so there are facts that establish whether the video in question was made when the person was underage.
Generated images have none of that, and the government is saying that this doesn’t matter: having to prove the age of the ‘victim’, or whether they even exist, is simply too heavy a burden. They’d much rather have the power to charge people on vibes and let the whims of a jury decide how old a generated girl really is.
It’s just bad; it gives state prosecutors way too much discretion to charge a person over generated images.
Apparently this too: https://lazysoci.al/post/23374468