I had to websearch the term to know what they were talking about. Pfft, internet oddities.
Case in point. At least one side is lying here: either the people from r/gooncaves or the Reddit administration. And given their modus operandi, I'm placing my bets on the admins lying.
Frankly, at this rate someone might end up suing Reddit for libel over those ban messages. I think it deserves it.
This is tough, the goal should be to reduce child abuse. It's unknown if AI generated CP will increase or reduce child abuse. It will likely encourage some individuals to abuse actual children while for others it may satisfy their urges so they don't abuse children. Like everything else AI, we won't know the real impact for many years.
It should be illegal either way, to be clear. But you think they're not training models on CSAM? You're trusting in the morality/ethics of people creating AI-generated child pornography?
The training doesn't use CSAM; there's a 0% chance big tech would use that in their dataset. The models are somewhat able to link concepts like "red" and "car", even if they have never seen a red car before.
Well, with models like SD at least, the datasets are large enough and the employees are few enough that it is impossible to have a human filter every image. They scrape them from the web and try to filter with AI, but there is still a chance of bad images getting through. This is why most companies install filters after the model as well as in the training process.
You make it sound like it is so easy to even find such content on the web. The point is, they do not need to be trained on such material. They are trained on regular kids, so they know their sizes, faces, etc. They're trained on nude bodies, so they also know what hairless genitals or flat chests look like. You don't need to specifically train a model on nude children to generate nude children.
I don't know if we can say for certain it needs to be in the dataset, but I do wonder how many of the other models used to create CSAM are also trained on CSAM.
I suggest you actually download Stable Diffusion and try for yourself, because it's clear that you don't have any clue what you're talking about. You can already make tiny people, shaved genitals, flat chests, child-like faces, etc. etc. It's all already there. Literally no need for any LoRAs or very specifically trained models.
There are a lot of people on Reddit and Reddit makes it easy to organize by whatever degenerate interests someone may have, such as sharing their goon cave
He then allegedly communicated with a 15-year-old boy, describing his process for creating the images, and sent him several of the AI generated images of minors through Instagram direct messages. In some of the messages, Anderegg told Instagram users that he uses Telegram to distribute AI-generated CSAM. “He actively cultivated an online community of like-minded offenders—through Instagram and Telegram—in which he could show off his obscene depictions of minors and discuss with these other offenders their shared sexual interest in children,” the court records allege. “Put differently, he used these GenAI images to attract other offenders who could normalize and validate his sexual interest in children while simultaneously fueling these offenders’ interest—and his own—in seeing minors being sexually abused.”
I think the fact that he was promoting child sexual abuse and was communicating with children and creating communities with them to distribute the content is the most damning thing, regardless of people's take on the matter.
Umm ... That AI generated hentai on the page of the same article, though ... Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.
Let me guess: they used existing footage and reports as training data, and it produced an incredibly racist AI model that routinely ignores police misconduct. They'll spend a couple of years working on bandaids for the problem while police departments across the country use the original model to create reports and use the "unbiased AI reports" as an excuse to hide the raw footage.
Also: "That was just an AI hallucination, it's not admissible in court" when it was absolutely not an AI hallucination when the cop shot someone for mouthing off.
Not sure if you're aware so I'll mention it anyway, but as far as I know, downvotes in Beehaw communities don't federate to Beehaw (as in aren't applied here - you might see them on your instance though, not really sure). That being said, your comment does, so you've made a "pseudo-downvote" anyway.
The title is not mine and the paper the article is responding to was published last month, not two years ago as you claim. The only mention of Musk in the entire article is in this one sentence:
Unlike self-serving warnings from OpenAI CEO Sam Altman or Elon Musk about the "existential risk" artificial general intelligence poses to humanity, Google's research focuses on real harm that generative AI is currently causing and could get worse in the future.
And like all agitation, it transfers heat (not like a microwave: most of the heat is lost to the surrounding air, but it does heat up the liquid, and it mostly does so by being absorbed at the surface of the dense objects, i.e. the grains).
Also, in case this somehow didn't exist for decades, all of it is just a slightly better way of stirring - you can make cold brew by just mixing/shaking stuff. The coarser the grind the longer it would take to extract efficiently tho (but efficiency isn't really the point, just taste?).
Ultrasounds accelerate extraction processes due to acoustic cavitation [8], [9]. When acoustic bubbles, also called inertial bubbles, collapse near solid materials, such as coffee grounds, they generate micro-jets with the force to fracture the cell walls of plant tissues, intensifying the extraction of the intracellular content [10].
Seems more involved than just aggressive stirring.
Yes, that is exactly how ultrasonic cleaners are used; it basically gets abrasive on the surface (like scrubbing with a hammering motion, but on a tiny level).
The size of the bubbles is determined by the frequency (the higher it is, the smaller the bubbles, with lower energy each; the lower it is, the bigger and more powerful each bubble).
So, if you are cleaning a large flat metal sheet, then you can use lower frequencies to speed up the process, whereas you would want higher frequencies for more intricate objects, so the amplitude is smol enough for bubbles to form in all the tight spaces.
38 kHz is very common for ultra-cheap multi-purpose household cleaners (jewellery, fruit & veggies, glasses, delicate clothes, etc); I have a 50 kHz buttplug-shaped one (so you put it in a container and it is not itself part of one).
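As a rough back-of-the-envelope check on the frequency/bubble-size relation described above, the Minnaert resonance formula gives the resonant radius of a gas bubble at a given drive frequency. This is just an illustrative sketch; the constants assume an air bubble in room-temperature water at roughly 1 atm:

```python
import math

def minnaert_radius(freq_hz, gamma=1.4, p0=101325.0, rho=998.0):
    """Resonant radius (m) of a gas bubble at drive frequency freq_hz,
    via the Minnaert formula: f = (1 / (2*pi*R)) * sqrt(3*gamma*p0 / rho).
    Defaults assume an air bubble (gamma=1.4) in water (rho=998 kg/m^3)
    at atmospheric pressure (p0=101325 Pa)."""
    return math.sqrt(3 * gamma * p0 / rho) / (2 * math.pi * freq_hz)

# Higher frequency -> smaller resonant bubbles, as described above.
r_38k = minnaert_radius(38_000)  # roughly 86 micrometres
r_50k = minnaert_radius(50_000)  # roughly 66 micrometres
```

So the common 38 kHz cleaners drive bubbles on the order of tens of micrometres, and going up in frequency shrinks them further, which is why finer frequencies suit more intricate objects.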
That's not exactly what it's doing. Cavitation is when the pressure of a liquid drops below the vapour point. Heat isn't involved; the liquid "boils" because the vapour point decreases with reduced pressure.
🤷‍♂️ Tomato / Potato. Cavitation occurs (the bubble formation) at a temperature below 100 °C, yes. As the steam bubble shrinks, very high temperatures are reached (super-heated steam). All of that energy, plus the latent heat of condensation, is released back into the fluid. At that instant, there is a very small yet-to-be-mixed portion of liquid that may be near the boiling point. That small portion of fluid may undergo a warm-brew process as it cools and mixes. I'm kind of conceptualizing this brewing process like: what if you could heat, mix, and cool the coffee all at once everywhere. But I've never observed cavitation and bubble collapse with an ultra high-speed microscope camera, so my concept may be off a bit. I have seen photos of what it does to hardened steel hydropower turbines.
My next question would be, what if you start with ice water? That may give you something like true cold-brew. Another factor to consider is that I believe most cold brew is very oxidized. It might be interesting to try ultrasonic degassing for some period of time before the grounds are added, to see how much of the cold brew flavor is just oxidized coffee.
I agree this is the kind of thing I should find on YouTube, not in an academic journal. But the paper does go into a lot of detail about extraction efficiency, so I guess there might be some useful measurements.
I am curious about the taste. It should be somewhere in between cold brew and hot, but probably closer to cold. Cavitation is a violent process. On a micro scale it’s literally boiling. Then the steam bubble collapses and is instantly cooled because of an almost infinitely big heat sink. So when cavitation occurs near the coffee grounds, some of the extraction would be at much warmer temperatures, for a brief instant.
Oh, yes, I was making fun of the headline, about inventing.
With that in mind, basically any experiment/measurement/scientific theory is some sort of invention; it's just that we don't call it that.
Like, nobody invented the concept of the tank; ppl "invented" materials, equipment, manufacturing & logistics/admin processes, etc, that at one point allowed a feasible "tank" to be put together.
The "thousands of people" watching your "stream" are bots. They can respond to what's going on in the video in real time because they're bots. Actually I technically think this would be more efficient and therefore is probably designed so that it's only one LLM pretending to be thousands of people, but I'll call it bots because that's easier to visualise. The bots know what's going on in the "stream" because they can understand what the "streamer" is saying, which means the pickup artist can put on a convincing performance to trick the mark. If it was just a recording, it wouldn't be able to respond to novel situations caused by the mark's behaviour.
I don't actually know if this technology even works, but that would be the intent used to sell it to pickup artist bros.
I had an idea when these first AI image generators started gaining traction: flood the CSAM market with AI-generated images (good enough that you can't tell them apart). In theory this would put the actual creators of CSAM out of business, thus saving a lot of children from the trauma.
Most people downvote the idea on their gut reaction tho.
My question is: why would it put them out of business? If we just look at legal porn, there are already huge amounts created, and the market is still there for new content to be produced constantly. AI porn hasn't noticeably decreased the amount produced.
Really, flooding the market with CSAM makes it easier to consume and may end up INCREASING the number of people trying to get CSAM. That could end up encouraging more to be produced.
It's such an emotional topic that people lose all rationality.
I remember the Reddit arguments in the comment sections about pedos, where some would equate the term with actual child rapists, while others would argue to differentiate, because the former didn't do anything wrong and shouldn't be stigmatized for what's going on in their heads, but rather offered help to cope with it. The replies were typically accusations that those people were making excuses for actual sexual abusers.
I always had the standpoint that I do not really care about people's fictional content. Be it lolis, torture, gore, or whatever other weird shit. If people are busy & getting their kicks from fictional stuff then I see that as better than using actual real life material, or even getting some hands on experiences, which all would involve actual real victims.
And I think that should be generally the goal here, no? Be it pedos, sadists, sociopaths, whatever. In the end it should be not about them, but saving potential victims. But people rather throw around accusations and become all hysterical to paint themselves sitting on their moral high horse (ironically typically also calling for things like executions or castrations).
Asked whether more funding will be provided for the anti-paint enforcement divisions: it's such a big backlog, we'd rather just wait for somebody to piss off a politician before focusing our resources.
Well yeah. Just because something makes you really uncomfortable doesn't make it a crime. A crime has a victim.
Also, the vast majority of children are victimized because of the US' culture of authoritarianism and religious fundamentalism. That's why far and away children are victimized by either a relative or in a church. But y'all ain't ready to have that conversation.
It's not whataboutism. He's being prosecuted because of the idea that he's hurting children, all the while law enforcement refuses to truly prosecute actual institutions victimizing children and is often colluding with traffickers. For instance, LE throughout the country were well aware of the scale of the Catholic church's crimes for generations.
They're not two different things. They're both supposedly acts of pedophilia except one would take actual courage to prosecute (churches) and the other which doesn't have any actual victims is easy and is a PR get because certain people find it really icky.
First of all, it's absolutely crazy to link to a 6-month-old thread just to complain that you got downvoted in it. You're pretty clearly letting this site get under your skin if you're still hanging onto these downvotes.
Second, none of your 6 responses in that thread are logical, rational responses. You basically just assert that things you find offensive enough should be illegal, and then just type in all caps at everyone who explains to you that this isn't good logic.
The only way we can consider child porn prohibition constitutional is to interpret it as a protection of victims. Since both the production and distribution of child porn hurt the children forced into it, we ban it outright, not because it is obscene, but because it does real damage. This fits the logic of many other forms of non-protected speech, such as the classic "shouting 'fire' in a crowded theatre" example, where those hurt in the inevitable panic are victims.
Expanding the definition of child porn to include fully fictitious depictions, such as lolicon or AI porn, betrays this logic because there are no actual victims. This prohibition is rooted entirely in the perceived obscenity of the material, which is completely unconstitutional. We should never ban something because it is offensive, we should only ban it when it does real harm to actual victims.
I would argue that rape and snuff films should be illegal for the same reason.
The reason people disagree with you so strongly isn't because they think AI generated pedo content is "art" in the sense that we appreciate it and defend it. We just strongly oppose your insistence that we should enforce obscenity laws. This logic is the same logic used as a cudgel against many other issues, including LGBTQ rights, as it basically argues that sexually disagreeable ideas should be treated as a criminal issue.
I think we all agree that AI pedo content is gross, and the people who make it and consume it are sick. But nobody is with you on the idea that drawings and computer renderings should land anyone in prison.
First of all, it's absolutely crazy to link to a 6-month-old thread just to complain that you got downvoted in it. You're pretty clearly letting this site get under your skin if you're still hanging onto these downvotes.
No, I just... Remembered the thread? Wasn't difficult to remember it. Took me a minute to find it.
This may surprise you but CP isn't something I discuss very often.
I don't lose sleep over people defending CP as "art", nor did it get under my skin. I just think these are fucking idiots and are for some baffling reason trying to defend the indefensible and go about my day. I'm not going to do anything about it, but I'm sure glad I don't have such dumb comments linked to a public account with my IP address logged somewhere...
I just raised it to make my point.
I didn't bother reading the rest of your essay. It's pretty clear from the first paragraph where you're going to land.