At least with Google's Pixel phones you can flash your own OS like Graphene and totally de-Google it. I just hate that they don't support SD cards, that's absolute bullshit/greed on Google's part. Same with Apple.
I wouldn't be surprised if the ban was a pretext and the sub was just something admins found objectionable for their own reasons. As long as mods remove material and users when an issue is brought to their attention, the sub should be fine.
The fact that they don't know why it happened suggests they weren't given a real chance to correct the issue. Just centralised social media things, I guess.
I feel like the real reason is that Reddit suits know Reddit is stereotyped as a gooner website and don't want people to think redditors are gooners. That's very wishful thinking, as everyone already knows they are some of the biggest gooners out there online.
I believe the admins do get paid. It's the mods that were fucked over here that don't get paid. I was really talking about your overall contribution to humanity.
Reddit is notorious for responding to financial incentives. In the past they would ban communities only when they became toxic to advertisers due to overwhelming negative publicity. During those purges, they would often throw in some leftist subs to prevent the user-base political average from shifting leftward, but the purges were never proactive.
I think we've entered a new era where Reddit is no longer as concerned about which subs may scare advertisers, and is more concerned about which subs generate the kind of content that is valuable for LLM training. If I were training the next version of ChatGPT, I would be alarmed if a text prompt spontaneously invited me to masturbate with it, or if prompts for images of a "battle station" resulted in walls of women having sex.
For almost 20 years, Google's phones have been pretty friendly to ROM flashing. I suspect that so long as Apple and Samsung have such a large market share, that will continue.
Sure, they take years to ban things like r/jailbait and all those fascist subs, but people with rooms to jerk off in are a step too far.
I have zero interest one way or another in people who have the resources to devote a space to this, or the ego to want to share that space. It's a completely foreign interest to me. But to ban people on the internet for being sexual in a way you don't understand is equally beyond me.
It's worth mentioning that in this instance the guy did send porn to a minor. This isn't exactly a cut-and-dried "guy used Stable Diffusion wrong" case. He was distributing it and grooming a kid.
The major concern to me is that there isn't really any guidance from the FBI on what you can and can't do, which may lead to some big issues.
For example, websites like novelai make a business out of providing pornographic, anime-style image generation. The models they use are deliberately tuned to produce abstract, "artistic" styles, but they can still generate semi-realistic images.
Now, let's say a criminal group uses novelai to produce CSAM of real people via the inpainting tools. Let's say the FBI casts a wide net and begins surveillance of novelai's userbase.
Is every person who goes on there and types, "Loli" or "Anya from spy x family, realistic, NSFW" (that's an underaged character) going to get a letter in the mail from the FBI? I feel like it's within the realm of possibility. What about "teen girls gone wild, NSFW?" Or "young man, no facial body hair, naked, NSFW?"
This is NOT a good scenario, imo. The systems used to produce harmful images are the same systems used to produce benign or borderline images. It's a dangerous mix, and throws the whole enterprise into question.
Is every person who goes on there and types, "Loli" or "Anya from spy x family, realistic, NSFW" (that's an underaged character) going to get a letter in the mail from the FBI?
I'll throw that baby out with the bathwater to be honest.
Simulated crimes aren't crimes. Would you arrest every couple that finds healthy ways to simulate rape fetishes? Would you arrest every person who watches The Fast and the Furious or The Godfather?
If no one is being hurt, if no real CSAM is being fed into the model, if no pornographic images are being sent to minors, it shouldn't be a crime. Just because it makes you uncomfortable doesn't make it immoral.
Or, ya know, everyone who ever wanted to decapitate those stupid fucking Skyrim children. Crime requires damaged parties, and with this (idealized case, not the specific one in the article) there is none.
If they were, anyone who's played games is fucked. I'm confident everyone who has played went on a total rampage murdering the townsfolk, pillaging their houses and blowing everything up... in Minecraft.
People have only gotten in trouble for that when they're already in trouble for real CSAM. I'm not terribly interested in sticking up for actual CSAM scum.
No, it's immoral because they are sexually gratifying themselves to pictures that look like children. Sexually desiring children or wanting to see them abused is immoral, full stop.
For now, if you read the article, it states that he shared the pictures to form like-minded groups where they got emboldened, could support each other, and could legitimize/normalize their perverted thoughts. How about no thanks.
Wrong comment chain. People weren't talking about the criminal shithead the article is about, but about the scenario of someone using (non-CSAM-trained) models to create questionable content (thus it is implied that there would be no victim).
We all know that there are bad actors out there, just like there are rapists and murderers. Still, we don't condemn true-crime lovers or rape fetishists until they commit a crime. We could do the same with pedophiles, but somehow we believe hating them into the shadows will stop them from doing criminal stuff?
And I'm using the article as an example that it doesn't just stop at "victimless" images, because they are not fucking normal people. They are mentally sick; they are sexually turned on by the abuse of a minor, not by the minor but by abusing the minor, sexually.
In what world would a person like that stop at looking at images? They actively search for victims and create groups where they share and discuss abusing minors.
Yes dude, they are fucking dangerous, bro, life is not fair. You wouldn't say the same shit if someone close to you was a victim.
What do you mean, focus my energy? How much energy do you think I spend on discussing perverts? And what should I spend my time discussing instead, contact sports? It sounds like you are deflecting.
Pedophiles get turned on by abusing minors; they are mentally sick. It's not like it's a normal sexual desire, and they will never stop at watching "victimless" images. Fuck pedophiles, they don't deserve shit, and I hope they eat shit the rest of their lives.
they will never stop at watching “victimless” images.
How is that different from any other dangerous fetish? Should we be arresting adult couples that do Age Play? All the BDSM communities? Do we even want to bring up the Vore art communities? Victimless is victimless.
No, because it's two consenting adults; otherwise it's illegal. Wtf is vore art? Not going to google that. How do you know it's victimless? Like I said, they are turned on by abusing minors, and I don't know how else I can put it; I can't be more clear.
Let me ask you this, do you think pedophiles care about their victims? If yes, then I want to hear why you think that. If no, why are we even having this argument?
I haven't given you an ultimatum, I asked you a question, and you can answer it any way you want. Do or can pedophiles feel remorse for their victims? Are there pedophiles who feel remorse for their victims but still abuse children?
But let me say this again: pedophiles have no remorse towards their victims; they get turned on by it. I'm trying to tell you it's not just a sexual desire. They like the abuse part of it, abusing someone helpless; that is why they are turned on by abusing children.
Bro, I can't continue this; you're not willing to understand it's not about the kid, it is about abusing the kid, that is what they want. And if you can't register that it's not just a sexual desire, then we can agree to disagree.
You're correct, pedophilia is a mental illness. A very tragic one since there is no hope and no cure. They can't even ask for help because everyone will automatically assume they are also child molesters. Which is what you're describing, but not all child molesters are pedophiles, and most pedophiles will never become child molesters... Like you said, some people just get off on exploiting the power dynamic and aren't necessarily sexually attracted to children. Those people are the real danger.
Real children are in the training data regardless of whether there is CSAM in it (and there is a high chance there is, considering how they get their training data), so real children are involved.
Nobody is arguing that it's moral. That's not the line for government intervention. If it was then the entire private banking system would be in prison.
They've actually issued warnings and guidance, and the law itself is pretty concise regarding what's allowed.
(8) "child pornography" means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where-
(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;
(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or
(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
...
(11) the term "indistinguishable" used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.
If you're going to be doing grey-area things, you should honestly do more than the five minutes of searching it took me to find those.
It was basically born out of a Supreme Court case in the early 2000s regarding an earlier version of the law that went much further and banned anything that "appeared to be" or "was presented as" sexual content involving minors, regardless of context, and could have plausibly been used against young-looking adult models, artistically significant paintings, or things like Romeo and Juliet, which are neither explicit nor vulgar but could be presented as involving child sexual activity. (Juliet is 14, and it's clearly labeled as a love story.)
After the relevant provisions were struck down, a new law was passed that factored in the justices' rationale and commentary about what would be acceptable, and gave us our current system of "it has to have some redeeming value, or not involve actual children and plausibly not look like it involves actual children."
The major concern to me, is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.
The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house... eventually. We still haven't properly funded the anti-CSAM departments.
I miss r/Coomer though; it existed to mock and insult people who have a very unhealthy habit of masturbating and sexual addiction (particularly to lolicon). Like, fuck, sex and masturbating aren't bad on their own, but they shouldn't be your god damn identity.
That you miss a community dedicated to mocking & harassing people who probably need actual help with coping really says a lot about you.
Ooooo, I'm so not impressed that your only breathing existence is to dig through people's post history to connect something insulting. Get a life, dude.
@snownyte
The only thing I dug through was the comment I replied to.
But thanks for proving my point! lmao
I had to websearch the term to know what they were talking about. Pfft, internet oddities.
Case in point. At least one side is lying here: either the people from r/gooncaves or the Reddit administration. And given their modus operandi, I'm placing my bets on the admins lying.
Frankly, at this rate someone might end up suing Reddit for libel over those ban messages. I think it deserves it.
He then allegedly communicated with a 15-year-old boy, describing his process for creating the images, and sent him several of the AI generated images of minors through Instagram direct messages. In some of the messages, Anderegg told Instagram users that he uses Telegram to distribute AI-generated CSAM. “He actively cultivated an online community of like-minded offenders—through Instagram and Telegram—in which he could show off his obscene depictions of minors and discuss with these other offenders their shared sexual interest in children,” the court records allege. “Put differently, he used these GenAI images to attract other offenders who could normalize and validate his sexual interest in children while simultaneously fueling these offenders’ interest—and his own—in seeing minors being sexually abused.”
I think the fact that he was promoting child sexual abuse and was communicating with children and creating communities with them to distribute the content is the most damning thing, regardless of people's take on the matter.
Umm ... That AI generated hentai on the page of the same article, though ... Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.
Ah yes, more bait articles rising to the top of Lemmy. The guy was arrested for grooming; he was sending these images to a minor. Outside of Digg, does anyone have suggestions for an alternative to Lemmy and Reddit? Lemmy's moderation quality is shit, and I think I'm starting to figure out where I land on the success of my experimental stay with Lemmy.
Edit: Oh god, I actually checked Digg out after posting this, and the site design makes it look like you're scrolling through all of the ads at the bottom of a bullshit clickbait article.
Lemmy as a whole does not have moderation. Moderators on lemmy.today cannot moderate lemmy.world or lemmy.ml; they can only remove problematic posts as they come and as they see fit, or block entire instances, which is rare.
If you want stricter content rules than any of the available federated instances, then you'll have to either:

- Use a centralized platform like Reddit, but they're going to sell you out for data profits and you'll still have to occasionally deal with shit like "The Donald."
- Start your own instance with a self-hosted server, create your own code of conduct, and hire moderators to enforce it.
Yeah, I know, that's why I'm finding Lemmy not for me. This new rage bait every week is tiring and not adding anything to my life except stress. Once I started looking at who the moderators were whenever Lemmy found a new thing to rave about, I found that often there were only 1-3 actual moderators, which, fuck that. With Reddit, the shit subs were the exception; here it feels like they ALL (FEEL being a key word here) have a tendency to dive face first into rage bait.
Edit: Most of the Reddit migration happened because Reddit fucked over their moderators. A lot of us were happy with well-moderated discussions, and if we didn't care to have moderators, we could have just stayed with Reddit after the moderators were pushed away.
You can go to an instance that follows your views more closely and start blocking instances that post low-quality content to you. Lemmy is a protocol, not a single community, so moderation and post quality are going to be determined by the instance you're on and the community you're with.
This is throwing a blanket over the problem. When the mods of a news community allow bait articles to stay up because they (presumably) further their views, it should be called out as a problem.