badbytes ,

Breaking news: Paint made illegal, cause some moron painted something stupid.

catloaf ,

Some places do lock up spray paint due to its use in graffiti, so that's not without precedent.

Soggy ,

They lock it up because it's frequently stolen. (Because of its use in graffiti, but still.)

cley_faye ,

I'd usually agree with you, but it seems he sent them to an actual minor for "reasons".

Glass0448 ,

Asked whether more funding will be provided for the anti-paint enforcement divisions: it's such a big backlog, we'd rather just wait for somebody to piss off a politician before focusing our resources.

Greg ,

This is tough; the goal should be to reduce child abuse. It's unknown whether AI-generated CP will increase or reduce child abuse. It will likely encourage some individuals to abuse actual children, while for others it may satisfy their urges so they don't abuse children. Like everything else with AI, we won't know the real impact for many years.

LadyAutumn , (edited )

How do you think they train models to generate CSAM?

Some of y'all need to look up what a LoRA is

Dkarma ,

Lol you don't need to train it ON CSAM to generate CSAM. Get a clue.

LadyAutumn ,

It should be illegal either way, to be clear. But you think they're not training models on CSAM? You're trusting in the morality/ethics of the people creating AI-generated child pornography?

Greg ,

The use of CSAM in training generative AI models is an issue no matter how these models are being used.

L_Acacia ,

The training doesn't use CSAM; there's a 0% chance big tech would use that in their datasets. The models are somewhat able to link concepts like "red" and "car" even if they've never seen a red car before.

AdrianTheFrog ,

Well, with models like SD at least, the datasets are large enough and the employees are few enough that it is impossible to have a human filter every image. They scrape them from the web and try to filter with AI, but there is still a chance of bad images getting through. This is why most companies add filters after the model's output as well as in the training process.

DarkThoughts ,

You make it sound like it is so easy to even find such content on the www. The point is, they do not need to be trained on such material. They are trained on regular kids, so they know their sizes, faces, etc. They're trained on nude bodies, so they also know what hairless genitals or flat chests look like. You don't need to specifically train a model on nude children to generate nude children.

barsquid ,

https://purl.stanford.edu/kh752sm9123

I don't know if we can say for certain it needs to be in the dataset, but I do wonder how many of the other models used to create CSAM are also trained on CSAM.

DarkThoughts ,

I suggest you actually download stable diffusion and try for yourself because it's clear that you don't have any clue what you're talking about. You can already make tiny people, shaved, genitals, flat chests, child like faces, etc. etc. It's all already there. Literally no need for any LoRAs or very specifically trained models.

SeattleRain , (edited )

America has some of the most militant anti-pedophilic culture in the world, yet it far and away has the highest rates of child sexual assault.

I think AI is going to reveal how deeply hypocritical Americans are on this issue. You have gigantic institutions like churches committing industrial-scale victimization, yet you won't find a tenth of the righteous indignation against organized religion, where there is overwhelming evidence it is happening, as you will regarding one person producing images that don't actually hurt anyone.

It's pretty clear from the staggering rate of child abuse that occurs in the States that Americans are just using child victims for weaponized politicization (it's next to impossible to convincingly fight off pedo accusations if you're being mobbed) and aren't actually interested in fighting pedophilia.

kandoh ,

Most states will let grown men marry children as young as 14. There is a special carve out for Christian pedophiles.

ricecake ,

Fortunately most instances are in the category of a 17 year old marrying an 18 year old, and require parental consent and some manner of judicial approval, but the rates of "not that" are still much higher than one would want.
~300k total in a 20-year window, with the older partner being 20 or younger in 74% of cases, the younger partner being 16 or 17 in 95% of cases, and both partners being under 18 in only 14%.

There's still no reason for it in any case, and I'm glad to live in one of the states that said "nah, never needed."

Ibaudia ,

Isn't there evidence that as artificial CSAM is made more available, the actual amount of abuse is reduced? I would research this but I'm at work.

WILSOOON ,

Fuckin good job
