"There are no restrictions on fictional adult sexual content with dark …"
Well, if you're in the UK you should delete X from all your devices, because you are ONE received tweet or DM away from committing a strict liability offence under Section 63 of the Criminal Justice and Immigration Act 2008 (as amended in 2015), which carries a maximum sentence of two to three years' imprisonment.
(Fucking American techbro morons think their local laws apply everywhere.)
https://en.wikipedia.org/wiki/Section_63_of_the_Criminal_Justice_and_Immigration_Act_2008
https://bsky.brid.gy/r/https://bsky.app/profile/did:plc:wt5cnjsnwwb7mxfac5of2pdo/post/3mbpltq3mlk26
-
"There are no restrictions on fictional adult sexual content with dark …"
Well, if you're in the UK you should delete X from all your devices because you are ONE received tweet or DM away from committing a strict liability offense under Section 63 of the Criminal Justice and Immigration Act (2008) (as amended 2015), carrying a 2-3 year prison sentence.
(Fucking American techbro morons think their local laws apply everywhere.)
https://en.wikipedia.org/wiki/Section_63_of_the_Criminal_Justice_and_Immigration_Act_2008
https://bsky.brid.gy/r/https://bsky.app/profile/did:plc:wt5cnjsnwwb7mxfac5of2pdo/post/3mbpltq3mlk26@cstross similar here in Spain, when our CSAM law was updated to add "ownership" and not just distribution as it was earlier (ownership was still illegal but it wasn't clear enough in the law), and include fictional depictions, the attorney general at the time was quick at clarifying that no it doesn't apply to drawings of Bart and Lisa Simpson or manga you sick assholes, it's intended for depictions intended to be realistic like photographic looking 3d renders etc. I think it's now being updated to include deepfakes and other AI shit but the old one would still apply.
-
@cstross Similar here in Spain: when our CSAM law was updated to add "possession" and not just distribution as before (possession was already illegal, but the law wasn't clear enough about it), and to include fictional depictions, the attorney general at the time was quick to clarify that no, it doesn't apply to drawings of Bart and Lisa Simpson or to manga, you sick assholes; it's intended for depictions meant to look realistic, like photographic-looking 3D renders and so on. I think it's now being updated to cover deepfakes and other AI shit, but the old one would still apply.
-
@cygnathreadbare @cstross Big Problem: AI is very good at manga -> photorealism.
-
@KarlHeinzHasliP @cygnathreadbare @cstross
That's only a problem if you're using it that way; the real problem is that it's good at going the other way.
The reason CSAM is banned is that we, as a society, think that sexually abusing children is bad and we know enough about markets to understand that demand for this material will cause people to produce it.
The reason for banning the fake kind is more complex. The bans on fictional images (ones that weren't created with child abuse) are because it's easy to get caught with CSAM and say 'oh, these aren't real photos, they were made by this great artist / tool / whatever'. And then the prosecutors need to trace the provenance and prove that, no, really, children were abused to create this.
The fact that you can do manga -> photorealistic transitions is not itself a problem; it's just another way that you can generate illegal material. If you do, and you are caught, there are already legal penalties. And the simple solution to this is: don't.
The problem is that people can take things that were produced by abusing real children and run the model in the other direction to get manga. And then they can claim that it was drawn by a human artist and no children were actually harmed. And now we're back in the same situation that we were with photorealistic child-abuse 'art'. And that may lead to the Spanish decision being reversed. And you can then run a model in the manga -> photorealistic direction on demand, leaving no trace of the fact that you were looking at something that's close to the original image.
'We built a machine for laundering CSAM' is very on-brand for 2026 techbros.
-
@david_chisnall @KarlHeinzHasliP @cygnathreadbare @cstross Even more on-brand for 2026 techbros would be “We built a machine for laundering CSAM, and now we will explain why that's good, actually.”