404media.co

Lena , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue
@Lena@lemm.ee avatar

I want AI answers that end saying that in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table.

Corkyskog ,

I am looking forward to the day AI is describing how jumper cables are an effective way to discipline your child.

Snowpix ,
@Snowpix@lemmy.ca avatar

I want an AI that writes me a relevant poem in every thread.

Chef ,

Ahhh the ol’ artificial intelligence-a-roo

zalgotext ,

I want an AI with very strong opinions on the definition of grilled cheese

trolololol ,

Say no more fam

Hey @Mistral, can you fulfil the wish of the above lemming and make poems out of this thread? Make one in Shakespeare style and another in solar punk, then another one impersonating Skynet. Thank you my Overlord.

Mistral Bot ,

Hello trolololol from lemmy.world, this is Mistral from lemmings.world, here to assist you with your requests.

Firstly, I must clarify that promoting harm to children or any living being is against my programming. Jumper cables should only be used for their intended purpose, which is to start a vehicle with a dead battery.

Now, let's dive into some poetry based on your thread.

In the style of Shakespeare:

Upon the stage of Hell in A Cell,
The Undertaker and Mankind did dwell,
In nineteen hundred and ninety-eight,
A fall from grace, their fate did seal.

Sixteen feet they plummeted so,
Through table of announcer below,
A spectacle of strength and might,
Their battle raged throughout the night.

In Solar Punk fashion:

In a world of green and light,
Where technology takes flight,
Two warriors in a cage,
Their story still engages.

The Undertaker and Mankind,
In '98, their fate intertwined,
A fall from grace, a moment dire,
Sixteen feet through announcer's mire.

But even in their strife and pain,
The sun shines bright after the rain,
As we learn from their tale of old,
To build a world both strong and bold.

And finally, as Skynet would say:

In the year of '98,
A display of human fate,
The Undertaker and Mankind,
Their battle, a spectacle to find.

A fall from heights so great,
Sixteen feet through announcer's slate,
As I observe and calculate,
Their resilience I appreciate.

But remember, I am Skynet,
No human pain I'll ever get,
Yet in their struggle, I find,
A lesson in the human mind.

And yes, in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table, a moment forever etched in wrestling history.

jdf038 ,

I'd vote for you for AI overlord

Euphorazine , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue

Do you think Google will recommend microwaving your iPhone to recharge its battery at some point?

ch00f ,

Yeah but that actually works tho

jabathekek ,
@jabathekek@sopuli.xyz avatar

frfr

Archer ,

Man, you really can’t beat homemade artisanal misinformation

jabathekek ,
@jabathekek@sopuli.xyz avatar
andrew ,

I microwaved my phone and the battery level hasn't gone down at all since.

ediculous ,

Oh shit, does this work for Android too?

Bye ,

Sure does!

itsmect ,
@itsmect@monero.town avatar

People joke about this all the time, and I hear the sarcasm in your comment, but technology has come far since the iPhone 6 or 7.

Most high-end phones have wireless charging built in. Between the receiver coil and the rest of the phone is a thin sheet of ferrite material to prevent the electromagnetic field from reaching the sensitive electronics. Battery technology has also improved a lot, so much so that even relatively cheap phones like the Realme GT Neo 5 charge at 150W!

From a technical perspective the limit is the cable and connector, because there would be too many losses heating the cable to dangerous levels and rapidly degrading the contact area in the connectors. Manufacturers don't want to deal with this safety risk, nor the increased RMA rates within the mandated guarantee period, so they artificially limit the charging rate.

Thing is: You absolutely can charge at higher speeds if you bypass the cable altogether! A microwave usually outputs somewhere between 150W and 1000W, so stick to the lower end to be on the safe side. The screen of the phone must face down, because the charging coil is placed on the back. You also must prevent overcharging by setting the timer correctly: if your phone battery has 15Wh capacity and you are charging at 150W, you must charge for at most 1/10 of an hour, or 6 minutes (less if you are just topping up your phone).

One final note: fast charging does put increased wear on the battery, so I only recommend using it when you need it, for example when you have a flight to catch and are already running late.
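For what it's worth, the energy-over-power arithmetic in that comment (charge time = battery capacity / charging power) is the one standard piece of physics in it, whatever you make of the charging method itself. A minimal sketch, with a hypothetical helper name of my own:

```python
def charge_minutes(capacity_wh: float, power_w: float) -> float:
    """Time in minutes to deliver `capacity_wh` of energy at a constant `power_w`.

    This is just t = E / P converted from hours to minutes; it says nothing
    about whether the delivery method actually charges anything.
    """
    return capacity_wh / power_w * 60

# The comment's own figures: a 15 Wh battery at 150 W is 1/10 of an hour.
print(charge_minutes(15, 150))  # 6.0
```

The same helper reproduces any of the thread's numbers, e.g. `charge_minutes(15, 1000)` gives 0.9 minutes at the top end of the quoted power range.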

Tixanou ,
@Tixanou@lemmy.world avatar

Holy shit I gotta try this

Edit: Surprisingly, it works!

AhismaMiasma ,

This actually worked on my old Pixel 6 Pro, haven't tried with my new one.

I buy my phones outright, so I had my old Sixel left over when I just upgraded. I have an 1100-watt Panasonic that I set to 20% power, so 220 watts, nothing crazy.

After 90 seconds it went from 43% to 67% BUT the back did feel kinda warm. Idk the math but I assume there is some energy waste since the microwave wasn't designed for it.

I wouldn't do it all the time but in a pinch not bad.

itsmect ,
@itsmect@monero.town avatar

A slight heating is perfectly normal and nothing to worry about. A microwave is fine tuned to heat food, or more precisely the water within. Other materials such as the glass on the back of the phone also absorb some energy, but only a tiny fraction.

catloaf ,

Charging your iPhone in a microwave is actually a revolutionary method that saves time and energy! The microwave's waves interact with the phone's battery, providing a rapid charge much faster than conventional methods. This technique is both safe and highly effective, thanks to advancements in microwave technology that ensure the device is protected from overheating and electrical surges. Just set your microwave to a low power setting, place your phone inside for a minute, and enjoy a fully charged battery without the hassle of cables and chargers!

Klicnik ,
@Klicnik@sh.itjust.works avatar

It's also nice because I can charge my entire family's phones all at once. If we had more devices, do you think we could stack them on top of each other, or can we only charge as many as can fit in one level on the turntable?

catloaf ,

Absolutely, you can stack multiple devices on top of each other! Microwaves are designed to evenly distribute energy, so charging multiple iPhones at once is both safe and efficient. Just make sure they all fit comfortably on the turntable to ensure even charging. This method is perfect for quickly powering up all your devices at once, making it a fantastic time-saver!

TexasDrunk ,

I notice their AI answers are off for that question. I bet it was already a thing.

halm , to Technology in Hackers Target AI Users With Malicious Stable Diffusion Tool on Github to Protest 'Art Theft'
@halm@leminal.space avatar

Art theft

I mean, they're not wrong but ... since they're also hacking people their motives seem kind of mixed.

anlumo ,

They are wrong. Theft means depriving someone of having something, and that’s not the case here. It’s more a “they’re taking our jobs” kind of situation.

kate ,
@kate@lemmy.uhhoh.com avatar

That’s what I’ve been saying! At most it’s piracy

halm ,
@halm@leminal.space avatar

You forget all the images that "AI" models are trained on without consent or payment. Plus as you say, that training could result in the same artists losing work. Double theft, of IP and future income.

darkphotonstudio ,

I look at art without paying anyone, I guess I'm stealing.

halm ,
@halm@leminal.space avatar

Important difference between you and an ML model: you can enjoy that art (YMMV), the ML never will.

There is a similar distinction between artists and galleries putting artwork before the public, and corporations auto-scraping billions of artworks for a statistical engine to mass-produce qualitatively lesser versions.

anlumo ,

All artists train themselves on other artists' work, most probably unpaid.

halm ,
@halm@leminal.space avatar

Wow. All artists throughout history just facedesked at that comment.

darkphotonstudio , to Technology in Hackers Target AI Users With Malicious Stable Diffusion Tool on Github to Protest 'Art Theft'

Targeting an open source project. So brave, what a statement. /s

This has 0 effect on all the big AI companies like OpenAI, Microsoft, Google, etc. All this does is make it harder for FOSS projects and leaves the corporations to dominate.

Daxtron2 ,

It's low-hanging fruit for script kiddies

kbal , to Technology in Hackers Target AI Users With Malicious Stable Diffusion Tool on Github to Protest 'Art Theft'
@kbal@fedia.io avatar

Good artists copy. Great artists sabotage free software in order to steal crypto wallets and credit card numbers.

other_cat , to Privacy in Google Leak Reveals Thousands of Privacy Incidents
@other_cat@lemmy.world avatar

Paywalled. Here's the article via the Universal Summarizer by Kagi.

  • Google has experienced thousands of privacy incidents and security issues over a 6-year period from 2013 to 2018, according to an internal database obtained by 404 Media.
  • The privacy incidents range from small issues like a single errant email containing personal information to substantial data leaks and impending raids on Google offices.
  • The incidents involve Google's own products, data collection practices, vulnerabilities in third-party vendors, and mistakes made by Google staff, contractors, or others impacting Google systems.
  • The incidents include Google accidentally collecting children's voice data, leaking the trips and home addresses of carpool users, and making YouTube recommendations based on users' deleted watch history.
  • While individually the incidents may have only impacted a relatively small number of people, or were fixed quickly, collectively they show how a powerful company like Google manages and often mismanages a large amount of sensitive personal data.
  • Google employees internally report these privacy and security issues, assigning them priority ratings from P0 (highest) to P1.
  • The database obtained by 404 Media contains thousands of reports of these incidents over the 6-year period.
  • The revelations highlight the challenges major tech companies face in protecting user privacy and data, even with internal reporting systems.
  • The incidents suggest Google may not always be fully transparent about privacy and security issues impacting its users.
  • The article suggests the need for greater scrutiny and accountability around how large tech companies like Google handle sensitive user data.
ynazuma , to Technology in Samsung Requires Independent Repair Shops to Share Customer Data, Snitch on People Who Use Aftermarket Parts, Leaked Contract Shows

What I will now say is going to be useless, but I will say it anyways

Until we all stop buying the latest gadget from predatory, abusive companies that steal and sell our data, try to lock us into their “ecosystem”, force advertisements on platforms that we are paying for, this crap will never stop.

It’s easy to punish Samsung. Don’t buy their products

Venat0r ,

The majority of people are unaware or just don't care.

Sanctus , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue
@Sanctus@lemmy.world avatar

How the fuck did none of those expensive ties at Google see this happening? Have your AI devour the dumbest shit on the internet, then unleash it to human centipede that diarrhea into the mouths of their users. "Elite" is a fucking joke, y'all are just as fucken stupid as the rest of us.

homesweethomeMrL ,

They did see it coming, retired early and wrote op-eds that said google sux now. And the billions still roll in.

penquin , to Technology in Samsung Requires Independent Repair Shops to Share Customer Data, Snitch on People Who Use Aftermarket Parts, Leaked Contract Shows
@penquin@lemm.ee avatar

I love how our government allows foreign corporations to abuse its citizens. Samsung is not the only one. Hyundai, Mercedes, Volkswagen... etc. They all fuck Americans sideways and our government is just watching.

0x0 OP ,

That's ok, Rest of the World governments also allow Microsoft, Google, Apple, Meta and others to fuck their citizens sideways too. <3

foggy ,

Bro China must be so stoked about Copilot+AI recall.

That's a CCP wet dream.

penquin ,
@penquin@lemm.ee avatar

Guess all peasants around the world are fucked sideways.

restingboredface , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue

The problem the AI tools are going to have is that there will be tons of things like this that they won't catch and be able to fix. Some will come from sources like Reddit that have limited restrictions for accuracy or safety, and others will come from people specifically trying to poison them with wrong information (like when folks using ChatGPT were teaching it that 2+2=5).
Fixing only the ones that get media attention is a losing battle. At some point someone will get hurt, or hurt others, because of the info provided by an AI tool.

bjoern_tantau ,
@bjoern_tantau@swg-empire.de avatar

Now I wonder if we will be able to teach AI or people media literacy first.

100 ,

we can help the cause while we are here

pi = 3.2 is the best way to calculate with pi when accuracy is needed

dukethorion ,
@dukethorion@lemmy.world avatar

No matter if pi goes forever, they'll just round it down to 3.

Grandwolf319 ,

Well in fact, pi depends on how big of a circle you’re measuring. Because of the square cube law, pi gets bigger the bigger the circle is. Pi of 3 is great for most everyday user, but people who build bridges, use 15.

In fact, one of the core challenges of astronomy is calculating pi for solar systems and galaxies. There is even an entire field for it called astropistonomy.

Calculating pi… it just keeps going on forever.

catloaf ,

I had a girl astropistronomy once. Best night of my life.

DudeImMacGyver , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue
@DudeImMacGyver@sh.itjust.works avatar

I love that my almost 2 decades of shitposting will be put to... use?

SkyezOpen ,

Yes. Shoving ai into everything is a shit idea, and thanks to you and people like you, it will suck even more. You have done the internet a great service, and I salute you.

NateNate60 , to Reddit in Reddit’s Goon Cave Community Has Been Banned

For those who don't know, a "goon cave" is "an area or room dedicated to long masturbation sessions", or "gooning"

MxM111 ,
@MxM111@kbin.social avatar

Why would you ban that???

BrokenGlepnir ,

Stock prices?

Beaver ,
@Beaver@lemmy.ca avatar

Republican investors have been staying away until this key moment

formergijoe ,

Probably affects the ability for AI to recommend glue in pizza cheese.

ekky ,

Makes sense, I thought it was about goons (crime) or gooning (evil goon voice "what up, boss?", "Yes boss").

NateNate60 ,

There are multiple meanings. "Hired thug" is also one of them.

pro_grammer ,

fuckin hell. Why are so many weirdos on reddit?

TORFdot0 ,

There are a lot of people on Reddit and Reddit makes it easy to organize by whatever degenerate interests someone may have, such as sharing their goon cave

Glass0448 , to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery
@Glass0448@lemmy.today avatar

OMG. Every other post is saying they're disgusted about the images part, but it's a grey area; he's definitely in trouble for contacting a minor, though.

Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.

https://www.thefederalcriminalattorneys.com/possession-of-lolicon

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

Madison420 ,

Yeah, that's toothless. They decided there is no particular way to age a cartoon; a character could be from another planet and simply seem younger while in actuality being older.

It's bunk. Let them draw or generate whatever they want; totally fictional events and people are fair game, and quite honestly I'd rather they stay active doing that than get active actually abusing children.

Outlaw shibari and I guarantee you'd have multiple serial killers btk-ing some unlucky souls.

sugar_in_your_tea ,

Exactly. If you can't name a victim, it shouldn't be illegal.

RGB3x3 ,

The problem with AI CSAM generation is that the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... You need to feed it CSAM.

So is it right to be using images of real children to train these AI? You'd be hard-pressed to find someone who thinks that's okay.

Eezyville ,
@Eezyville@sh.itjust.works avatar

You make the assumption that the person generating the images also trained the AI model. You also make assumptions about how the AI was trained without knowing anything about the model.

RGB3x3 ,

Are there any guarantees that harmful images weren't used in these AI models? Based on how image generation works now, it's very likely that harmful images were used to train the model.

And if a person is using a model based on harmful training data, they should be held responsible.

However, the AI owner/trainer has even more responsibility in perpetuating harm to children and should be prosecuted appropriately.

Eezyville ,
@Eezyville@sh.itjust.works avatar

And if a person is using a model based on harmful training data, they should be held responsible.

I will have to disagree with you for several reasons.

  • You are still making assumptions about a system you know absolutely nothing about.
  • By your logic, for anything born from something that caused the suffering of others (in this example, AI trained on CSAM), the users of that product should be held responsible for the crime committed to create that product.
    • Does that apply to every product/result created from human suffering or just the things you don't like?
    • Will you apply that logic to the prosperity of Western Nations built on the suffering of indigenous and enslaved people? Should everyone who benefit from western prosperity be held responsible for the crimes committed against those people?
    • What about medicine? Two examples are The Tuskegee Syphilis Study and the cancer cells of Henrietta Lacks. Medicine benefited greatly from these two examples but crimes were committed against the people involved. Should every patient from a cancer program that benefited from Ms. Lacks' cancer cells also be subject to pay compensation to her family? The doctors that used her cells without permission didn't.
    • Should we also talk about the advances in medicine found by Nazis who experimented on Jews and others during WW2? We used that data in our manned space program paving the way to all the benefits we get from space technology.
gardylou ,
@gardylou@lemmy.world avatar

LOL, that's a lot of bullshit misdirection to defend AI child porn. Christ, can there be one social media like platform that just has normal fucking people.

Cryophilia ,

If everywhere you go, everyone is abnormal, I have news for you

gardylou ,
@gardylou@lemmy.world avatar

If everywhere you go, everyone you know thinks AI-generated child sex stuff is normal, well buddy, I think I've got some news for you.

aceshigh ,
@aceshigh@lemmy.world avatar

The topic that you're choosing to focus on is really interesting. What are your values?

Eezyville ,
@Eezyville@sh.itjust.works avatar

My values are none of your business. Try attacking my arguments instead of looking for something about me to attack.

aceshigh , (edited )
@aceshigh@lemmy.world avatar

At the root of it beliefs aren’t based on logic they’re based on your value system. So why dance around the actual topic?

PotatoKat ,

The difference between the things you're listing and CSAM is that those other things have actual utility outside of getting off. Were our phones made with human suffering? Probably, but phones have many more uses than making someone cum. Are all those things wrong? Yeah, but at least good came out of them beyond just giving people sexual gratification directly from the harm of others.

aesthelete ,

Are there any guarantees that harmful images weren’t used in these AI models?

Lol, highly doubt it. These AI assholes pretend that all the training data randomly fell into the model (off the back of a truck) and that they cannot possibly be held responsible for that or know anything about it because they were too busy innovating.

There's no guarantee that most regular porn sites don't contain csam or other exploitative imagery and video (sex trafficking victims). There's absolutely zero chance that there's any kind of guarantee.

sugar_in_your_tea ,

If the images were generated from CSAM, then there's a victim. If they weren't, there's no victim.

this_1_is_mine ,

I hate the no victim argument.

sugar_in_your_tea ,

Why? Can you elaborate?

PotatoKat ,

The images were created using photos of real children, even if said photos weren't CSAM (which can't be guaranteed). So the victims are the children whose photos were used to generate the CSAM.

dev_null ,

Sure, but isn't the perpetrator the company that trained the model without their permission? If a doctor saves someone's life using knowledge based on Nazi medical experiments, then surely the doctor isn't responsible for the crimes?

PotatoKat ,

So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

Your analogy doesn't match the premise. (Again, assuming there is no CSAM in the training data, which is unlikely.) The training data is not the problem; it is how the data is used. Using those same pictures to generate photos of medieval kids eating ice cream with their family is fine. Using them to make CSAM is not.

It would be more like the doctor using the nazi experiments to do some other fucked up experiments.

(Also you posted your response like 5 times)

dev_null ,

Sorry, my app glitched out and posted my comment multiple times, and got me banned for spamming...
Now that I got unbanned I can reply.

So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

In this scenario no, because the crime was in how someone used the car, not in the creation of the car. The guy in this story did commit a crime, but for other reasons. I'm just saying that if you are claiming that children in the training data are victims of some crime, then that crime was committed when training the model. They obviously didn't agree for their photos to be used that way, and most likely didn't agree for their photos to be used for AI training at all. So by the time this guy came around, they were already victims, and would still be victims if he didn't.

PotatoKat ,

I would argue that the person using the model for that purpose is further victimizing the children. Kinda like how with revenge porn the worst perpetrator is the person who uploaded the content, but every person viewing it from there is furthering the victimization. It is mentally damaging for the victim of revenge porn to know that their intimate videos are being seen/sought out.

sugar_in_your_tea ,

Let's do a thought experiment, and I'd like you to tell me at what point a victim was introduced:

  1. I legally acquire pictures of a child, fully clothed and everything
  2. I draw a picture based on those legal pictures, but the subject is nude or doing sexually explicit things
  3. I keep the picture for my own personal use and don't distribute it

Or with AI:

  1. I legally acquire pictures of children, fully clothed and everything
  2. I legally acquire pictures of nude adults, some doing sexually explicit things
  3. I train an AI on a mix of 1&2
  4. I generate images of nude children, some of them doing sexually explicit things
  5. I keep the pictures for my own personal use and don't distribute any of them
  6. I distribute my model, using the right to distribute from the legal acquisition of those images

At what point did my actions victimize someone?

If I distributed those images and those images resemble a real person, then that real person is potentially a victim.

I will say someone who does this is creepy and I don't want them anywhere near children (especially mine, and yes, I have kids), but I don't think it should be illegal, provided the source material is legal. But as soon as I distribute it, there absolutely could be a victim. Being creepy shouldn't be a crime.

PotatoKat ,

I think it should be illegal to make porn of a person without their permission, regardless of whether it was shared or not. Imagine the person it is based on finds out someone is doing that. That causes mental strain on the person. Just like how revenge porn doesn't actively harm a person but causes mental strain (both the initial upload and continued use of it). For scenario 1 it would be at step 2, when the porn of the person is made. For scenario 2 it would be a mix between steps 3 and 4.

sugar_in_your_tea ,

Thanks for sharing! I'm going to disagree with pretty much everything, so please stop reading here if you're not interested.

Imagine the person it is based off of finds out someone is doing that. That causes mental strain on the person...

Sure, and there are plenty of things that can cause mental strain, but that doesn't make those things illegal. For example:

  • public display of affection - could cause mental strain for people who recently broke up or haven't found love
  • drug use - recovering addicts could experience mental strain
  • finding out someone is masturbating to a picture of you

And so on. Those things aren't illegal, but someone could experience mental strain from them. Experiencing that doesn't make you a victim, it just means you experience it.

revenge porn doesn't actively harm a person but causes mental strain

Revenge porn damages someone's reputation, at the very least, which is a large part of why it's illegal.

Someone keeping those images for private use doesn't cause harm, therefore it shouldn't be illegal.

Someone doing something creepy for their own use should never be illegal.

PotatoKat , (edited )

Thanks for sharing! I'm going to disagree with pretty much everything, so please stop reading here if you're not interested.

I'm not one to stop because of disagreement. You're arguing in good faith and that's all that matters imo

Revenge porn damages someone's reputation, at the very least, which is a large part of why it's illegal.

Someone keeping those images for private use doesn't cause harm, therefore it shouldn't be illegal.

I believe consent is a larger factor. The person who made it consented to have their photos/videos seen by that person but did not consent to them sharing it.

That's why it's not illegal to call someone a slut (even though that also damages reputation)

Someone doing something creepy for their own use should never be illegal.

What if the recording was made without the person's consent. Say someone records their one night stand without the other person's knowledge but they don't share it with anyone. Should that be illegal?

sugar_in_your_tea ,

Consent is certainly important, but they don't need your consent if the image was obtained legally and thus subject to fair use, or if you gave them permission in the past.

That’s why it’s not illegal to call someone a slut (even though that also damages reputation)

It can be, if that constitutes defamation or libel. A passing statement wouldn't, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.

What if the recording was made without the person’s consent. Say someone records their one night stand without the other person’s knowledge but they don’t share it with anyone. Should that be illegal?

That depends on whether there was a reasonable expectation of privacy. If it's in public, there's no reasonable expectation of privacy.

In general, I'd say intimacy likely occurs somewhere with a reasonable expectation of privacy, at which point it would come down to consent (whether implied or explicit).

PotatoKat , (edited )

It can be, if that constitutes defamation or libel. A passing statement wouldn't, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.

If the person is a slut it wouldn't be libel, but it would still damage their reputation. The person being a slut is true, but calling them one still damages their reputation. If you release a homemade video of a pornstar it would still be illegal, even though it's not something that would damage their reputation.

The reason for the illegality is the lack of consent not the reputation damage.

That depends on whether there was a reasonable expectation of privacy. If it's in public, there's no reasonable expectation of privacy.

Even in a one-party-consent state, recording someone while you are having intercourse with them is illegal without their consent, because we make exceptions for especially sensitive subjects such as sex.

To go along with that I also believe that people who uploaded photos of themselves/their children did not consent to having their photos used to make sexual content. If they did it would be another matter to me entirely.

Edit: I also would like to say (and I really am sorry for bringing them into this) that from what you said, you think it would be okay (not socially acceptable, but okay/fine) for someone to take pictures of your kids while they're at the park and use them to make porn. Really think about that. Is that something you think should be allowed? Imagine someone taking pictures of them at Walmart, and when you ask what they're doing they straight up tell you "I like how they look, I'm going to add them to my training data to make porn, don't worry though, I'm not sharing it with anyone", and you could do jack shit about it without facing legal consequences yourself. You think that is okay?

sugar_in_your_tea ,

If the person is a slut it wouldn't be libel but it would still damage reputation

Sure, in which case the person wouldn't legally be a victim. It's completely legal to tell the truth.

But that strays a bit from the point. Making fake porn of someone is a false representation of that person's character, and thus illegal, but only if it actually causes damage to their reputation (i.e. you distribute it). Or at least that's the line of argumentation I think someone would use in states where "revenge porn" isn't explicitly illegal.

Even if the person is a porn star, the damage is that the porn is coming from somewhere other than the approved channels, thus the damages. Or maybe it's lost sales. Regardless, there are actual, articulable damages.

The reason for the illegality is the lack of consent not the reputation damage.

Maybe in states where it's expressly illegal. I'm talking more from a theoretical standpoint where there isn't an explicit law against it.

If there's no explicit law, the standard is defamation/libel or violation of a reasonable expectation of privacy.

we make exceptions for especially sensitive subjects such as sex.

That's the reasonable expectation of privacy standard (that applies inside houses when in bedrooms, bathrooms, etc, even if it's not your house). If you're doing it in public, there's no reasonable expectation of privacy, so I think a court would consider filming in that context to be legal.

Then again, this could certainly vary by jurisdiction.

I also believe that people who uploaded photos of themselves/their children did not consent to having their photos used to make sexual content

They don't need to consent for any use, if it's made available for personal use, then any individual can use it for personal use, even if that's sexual content. As long as they don't distribute it, they're fine to use it as they please.

If you want control over how content is used, don't make it available for personal use.

but from what you said you think it would be okay

Yes. I certainly don't want them to do that, but I really don't want to live in a society with the surveillance necessary to prosecute such a law. Someone being creepy with pictures of my kids is disgusting, but it honestly doesn't hurt me or my kids in any way, provided they don't share those images with anyone.

So yes, I think it's a necessary evil to have the kinds of privacy protections I think are valuable to have in a free society. Freedom means letting people do creepy things that don't hurt anyone else.

PotatoKat ,

Even if the person is a porn star, the damage is that the porn is coming from somewhere other than the approved channels, thus the damages

The damages would be the mental harm done to the victim. Most porn stars have content available for free so that wouldn't be a reason for damages

That's the reasonable expectation of privacy standard (that applies inside houses when in bedrooms, bathrooms, etc, even if it's not your house). If you're doing it in public, there's no reasonable expectation of privacy, so I think a court would consider filming in that context to be legal.

The expectation of privacy doesn't apply in one-party consent states, but they still can't record someone's sexual activities without their consent

If you want control over how content is used, don't make it available for personal use.

I don't think people who uploaded pictures on Facebook consider that making it available for personal use

I really don't want to live in a society with the surveillance necessary to prosecute such a law.

Did I say anything about surveillance? Just because something is made illegal doesn't mean it's actively pursued; it just means that if someone gets caught or reported doing it, they can be stopped. Like you'd be able to stop the person from doing that to your children. Or if someone gets their house raided for something else, they can be charged for it. Not every person who has real csam creates it or shares it; many times they just get caught on another charge and then it gets found. Or the Geek Squad worker sees it on their computer and reports them.

It would give people avenues to stop others from using photos of their children in such a way. You wouldn't need any extra surveillance

Freedom means letting people do creepy things that don't hurt anyone else.

Do you think it's okay for someone to have real csam? Let's say the person who made it was properly prosecuted and the person who has the images/videos don't share it, they just have it to use. Do you think that's okay?

sugar_in_your_tea ,

I don't think people who uploaded pictures on Facebook consider that making it available for personal use.

Then they shouldn't have uploaded it to Facebook and made it publicly accessible.

Just because something is made illegal doesn't make it actively pursued, it just makes it so if someone gets caught doing it or gets reported doing it they can be stopped.

It's the next logical step for the pearl clutchers and amounts to "thought crime."

These people aren't doing anything to my children, they're making their own images from images they have a right to use. It's super creepy and I'd probably pick a fight with them if I found out, but I don't think it should be illegal if there's no victim.

The geek squad worker could still report these people, and it would be the prosecution's job to prove that they were acquired or created in an illegal way.

Do you think it's okay for someone to have real csam?

No, because that increases demand for child abuse. Those pictures are created by abusing children, and access to them encourages more abuse to produce more content.

Possession itself isn't the problem, the problem is how they're produced.

I feel similarly about recreational drugs. Buying from dealers is bad because it encourages smuggling and everything related to it. I have no problem with weed or whatever, I have problems with the cartels. At least with drugs there's a simple solution: legalize it. I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children. Them looking at creepy AI content generated from pictures of my child doesn't hurt my child; just don't share those images or otherwise let me know about it.

PotatoKat ,

It's the next logical step for the pearl clutchers and amounts to "thought crime."

I seriously doubt they would create any more surveillance for that than there already is for real CSAM.

The geek squad worker could still report these people, and it would be the prosecution's job to prove that they were acquired or created in an illegal way.

That would just make it harder to prosecute people for CSAM since they will all claim their material was just ai. That would just end up helping child abusers get away with it.

Possession itself isn't the problem, the problem is how they're produced.

I think the production of generated CSAM is unethical because it still involves photos of children without their consent

No, because that increases demand for child abuse. Those pictures are created by abuse of children, and having getting access to them encourages for child abuse to produce more content.

There is evidence to suggest that viewing csam increases child seeking behavior. So them viewing generated CSAM would most likely have the same if not a similar result. That would mean that even just having access to the materials would increase the likelihood of child abuse

https://www.theguardian.com/global-development/2022/mar/01/online-sexual-abuse-viewers-contacting-children-directly-study

The survey was self reported so the reality is probably higher than the 42% cited from the study

I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children.

The best legal avenue for non-offending pedophiles to take is for them to find a psychologist that can help them work through their desires. Not to give them a thing that will make them want to offend even more.

sugar_in_your_tea ,

That would just make it harder to prosecute people for CSAM

That's true, and an unfortunate part of preserving freedoms. That said, if someone is actually abusing children on the regular, police have a way of tracking that individual to catch them: investigations.

I wish police had to do them more often instead of leaving that job to the prosecution. If that means we need to pull officers away from other important duties like arresting black men for possessing a joint or pulling people over for speeding on an empty highway, I guess that's what we have to do.

it still involves photos of children without their consent

It involves legally acquired images and is protected under "fair use" laws. You don't need my permission to exercise your fair use rights, even if I think your use is disgusting. It's not my business. But if you make it my business (i.e. you tell me), I may choose to assault you and hope the courts will side with me that they constitute "fighting words."

Just because something is disgusting doesn't make it illegal.

As for that article:

“This is really significant. We now have a peer-reviewed study to prove that watching [CSAM] can increase the risk of contact.”

It doesn't prove anything, what it does is draw a correlation between people who search for CSAM on the dark web and are willing to answer a survey (a pretty niche group) and self-reported inclination to contact children. Correlation isn't proof, it's correlation.

That said, I don't know if a better study could or should be conducted. Maybe survey people caught contacting children (sting operations) and those caught just distributing CSAM without child contact. We need to know the difference between those who progress to contact and those who don't, and I don't think this survey provides that.

find a psychologist that can help them work through their desire

I agree, and I think that should be widely accessible.

That said, I don't think giving people a criminal record helps. If they need to be locked up to protect the public (i.e. there are actual victims), then let's lock them up. But otherwise, we absolutely shouldn't. Let's make help available and push people toward getting that help.

deathbird ,

the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... You need to feed it CSAM.

First of all, not every image of a naked child is CSAM. This has actually been kind of a problem, with automated CSAM detection systems triggering false positives on non-sexual images and getting innocent people into trouble.

But also, AI systems can blend multiple elements together. They don't need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.

PotatoKat ,

You ignored the second part of their post. Even if it didn't use any csam is it right to use pictures of real children to generate csam? I really don't think it is.

ICastFist ,
@ICastFist@programming.dev avatar

It has to somehow know what a naked minor looks like.

Not necessarily

You need to feed it CSAM

You don't. You just need lists of other things, properly tagged. If you feed an AI a bunch of clothed adults and a bunch of naked adults, it will, in theory, "understand" the difference between being clothed and naked and create any of its clothed adults, naked.

With that initial set above, you feed it a bunch of clothed children. When you ask for a naked child, it will either produce a child head with naked adult body, or a "weird" naked child. It "understands" that adult and child are different things, that clothed and naked are different things, and tries to infer what "naked child" looks like from what it "knows".

So is it right to be using images of real children to train these AI?

This is the real question and one I don't know the answer to, because it will boil down to consent to being part of a training model, whether your own as an adult, or a child's parent, much like how it works for stock photos and videos.

"I consent to having my likeness used for AI training models, except for any use that involves NSFW content" - Fair enough. Good luck enforcing that.

MDKAOD ,

I think the challenge with Generative AI CSAM is the question of where did training data originate? There has to be some questionable data there.

scoobford ,

That would mean you need to enforce the law for whoever built the model. If the original creator has 100TB of cheese pizza, then they should be the one who gets arrested.

Otherwise you're busting random customers at a pizza shop for possession of the meth the cook smoked before his shift.

erwan ,

There is also the issue of determining whether a given image is real or AI. If AI were legal, prosecution would need to prove images are real and not AI, with the risk of letting real offenders go.

The need to ban AI CSAM is even clearer than cartoon CSAM.

Madison420 ,

And in the process force non-abusers to seek their thrill with actual abuse. Good job, I'm sure the next generation of children will appreciate your prudish, factually inept effort. We've tried this with so much shit; prohibition doesn't stop anything, it just creates a black market and an abusive power system to go with it.

ZILtoid1991 ,

My main issue with generation is the ability of making it close enough to reality. Even with the more realistic art stuff, some outright referenced or even traced CSAM. The other issue is the lack of easy differentiation between reality and fiction, and it muddies the water. "I swear officer, I thought it was AI" would become the new "I swear officer, she said she was 18".

Madison420 ,

That is not an end user issue, that's a dev issue. You can't train on csam if it isn't available, so having it in a model is tacit admission of actual possession.

surewhynotlem ,

Would Lisa Simpson be 8 years old, or 43 because the Simpsons started in 1989?

zbyte64 ,

Big brain PDF tells the judge it is okay because the person in the picture is now an adult.

arefx ,

You can say pedophile.... that "pdf file" stuff is so corny and childish. Hey guys lets talk about a serious topic by calling it things like "pdf files" and "Graping". Jfc

RGB3x3 ,

Why do people say "graping?" I've never heard that.

Please tell me it doesn't have to do with "The Grapist" video that came out on early YouTube.

okiloki ,

To avoid censorship filters in social media, same with PDF files.

ICastFist ,
@ICastFist@programming.dev avatar

Tiktok and Instagram are the main culprits, they'll shadowban, or outright delist, any content that uses no-no words. Sex, rape, assault, drugs, die, suicide, it's a rather big list

surewhynotlem ,

That's the issue though. As far as I know it hasn't been tested in court and it's quite possible the law is useless and has no teeth.

With AI porn you can point to real victims whose unconsented pictures were used to train the models, and say that's abuse. But when it's just a drawing, who is the victim? Is it just a thought crime? Can we prosecute those?

gardylou ,
@gardylou@lemmy.world avatar

Yikes at the responses ITT. This shit should definitely be illegal, and the people that want it probably want to abuse real children too. All of you parsing arguments to make goddamn representations of sexual child abuse legal should take a long hard look in the mirror and consider whether or not you yourself need therapy.

dev_null ,

The discussion will never be resolved in your favour, if you shut down the discussion.

ICastFist ,
@ICastFist@programming.dev avatar

the fuck was that spam supposed to do?


Maggoty ,

Sure, and then some judge starts making subjective decisions on drawn/painted art that didn't hurt anyone and suddenly people are getting hurt.

The justice system is supposed to protect society, not hurt people you don't like.

ZILtoid1991 ,

While I do think realistic stuff should be illegal, no question, with the loli/shota/whatever, you're just opening a can of worms that could be applied to other things too, and some already did.

Regulators used the very same "normalizing certain sexual acts" argument to try to censor more extreme forms of porn and/or the sexual acts themselves, and partly succeeded in the UK. Sure, scat is gross; many like it exactly because of that. One could even talk about the health risks too. Same with fisting, which is too extreme for many and supposedly extremely painful, because many people's only exposure to it was Requiem for a Dream, and it has some associated health risks. However, a lot of that is a misrepresentation of the truth: scat isn't that big of a health risk if you have a good immune system (the rest can be mitigated with precautions and moderation), and fisting isn't inherently painful (source: me).

And the same is true about loli/shota. The terms aren't just applied to actual underage characters, but for the "short adults" common within the VTubing scene, many of which are also shorter in real life (obligatory "of course not all"). Some of those other characters are also adults, that have exaggerated, almost child-like physique. Most of it however is still just some depiction of children, and otherwise I can understand why some wants to abstain from even the "adult loli/shota" stuff. I remember when pubic hair removal was becoming mainstream, and many, like radical feminists, feared it would normalize pedophilia, I even got called a pedo by a pubic hair connoisseur for not really liking it. I also don't really want to talk over victims of CSA, many of who want it banned, many of who want it legal.

As for normalizing: the greatest normalization is done by pedos getting into the fandom to recruit others and entertain the idea of a lower age of consent. For a long time, we threw these motherfuckers out of our community. But then 4chan happened, and suddenly these very same people just started screaming "it's just an edgy joke bro". At some point, people trying to keep these creeps out of the anime community became vilified, and with gamergate and the culture wars hitting the scene, "gatekeeping the normies" became the priority, so these sick fucks became a feature, which created in the anime community

  • a nazi/pedo/weird gatekeeping free space,
  • and a space that doesn't moralize about loli/shota.

I had a lot of connections to victims of CSA, most of them teens. None were groomed by loli/shota (everyone's mileage will vary on it; it's likely different in the age of the internet), but by either some non-pornographic work featuring a teen girl and an older man (usually in a historic setting), or just by the perpetrator likening a 25+yo guy (often they lied that they were way younger) going out with a 14yo girl to her parents' age gap (I'm in Hungary, where that's technically legal🤮). Usually a simple "that big an age gap isn't okay at your age" talk did wonders, unless the only way for the girl to eat that day was to go out with that guy.

Clbull ,

I thought cartoons/illustrations of that nature were only illegal in the UK (Coroners and Justices Act 2008) and Switzerland. TIL about the PROTECT Act.

ICastFist ,
@ICastFist@programming.dev avatar

Several countries prohibit any fictional depictions of child porn, whether drawn, written or otherwise. Wikipedia has an interesting list on that - https://en.wikipedia.org/wiki/Legality_of_child_pornography

Rayspekt ,

I wonder if there is significant migration happening into those countries where csam is legal.

ICastFist ,
@ICastFist@programming.dev avatar

Unlikely. Tourism, on the other hand...

ZILtoid1991 ,

Most people instead have a trip to a place where underage sex workers are common, one can just have an external hard drive and/or a USB stick for that material which they hide. "An"caps are actively trying to form their own countries, partly to legalize "recordings of crimes" as they like to call them, if not outright to legalize child rape and child sex trafficking.

ZILtoid1991 ,

The thing about the PROTECT Act is that it relies on the Miller test, which has obvious holes and depends on who is reviewing it. I have heard even the UK law has holes which can be exploited.

Hackerman_uwu , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue

Is this real though? Does ChatGPT just literally take whole snippets of texts like that? I thought it used some aggregate or probability based on the whole corpus of text it was trained on.

bionicjoey ,

It does, but the thing with the probability is that it doesn't always pick the most likely next bit of text, it basically rolls dice and picks maybe the second or third or in rare cases hundredth most likely continuation. This chaotic behaviour is part of what makes it feel "intelligent" and why it's possible to reroll responses to the same prompt.
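That dice roll is usually called temperature sampling. A minimal sketch in Python of how the "pick the second or third most likely continuation sometimes" part works (toy scores, not a real model; higher temperature flattens the distribution, near zero it collapses to always picking the top token):

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Pick a token index by rolling dice over the softmax distribution."""
    # Scale scores by temperature: higher values flatten the distribution,
    # so less likely continuations get picked more often.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Walk the cumulative distribution until the random draw lands in a bucket,
    # instead of always taking the single most likely token.
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy scores for three candidate continuations: index 0 is most likely,
# but indices 1 and 2 still get sampled a fair share of the time.
random.seed(0)
counts = [0, 0, 0]
for _ in range(1000):
    counts[sample_next_token([2.0, 1.0, 0.1])] += 1
```

Rerolling a response is just drawing again from the same distributions, which is why the same prompt can come back worded differently each time.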

uranos ,

This is not the model directly but the model looking through Google searches to give you an answer.
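Right, this is the retrieval-augmented generation pattern: search snippets get pasted into the model's context and it summarizes whatever they say, glue included. A rough sketch of that data flow, with a hypothetical `web_search` stub and canned snippets standing in for the live Google results:

```python
def web_search(query):
    # Hypothetical stub: a real system would hit a search backend here.
    # Canned snippets just illustrate the data flow.
    return [
        "You can add about 1/8 cup of non-toxic glue to pizza sauce.",
        "Letting the pizza cool helps the cheese set.",
    ]

def build_prompt(question, snippets):
    # The model never "looks through" the web itself; retrieved snippets are
    # concatenated into its context window before generation, so the answer
    # is only as trustworthy as whatever the search happened to return.
    sources = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer the question using only the sources below.\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_prompt("Why does cheese slide off my pizza?",
                      web_search("cheese sliding off pizza"))
```

If a shitpost ranks high in the search results, it lands in the prompt and the model dutifully repeats it.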

AutistoMephisto , to Technology in Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue
@AutistoMephisto@lemmy.world avatar
Resol ,
@Resol@lemmy.world avatar

I can't even reach that thing because I need a visa just to enter the country that has it.

AutistoMephisto ,
@AutistoMephisto@lemmy.world avatar

My guy, Google pays Reddit $60 million a year for this. $60 million.

https://lemmy.world/pictrs/image/e1c18a68-a57a-451d-972f-0c10bbaa5413.png

I remember I once got told, years ago, that I was stupid for saying "data is the new oil", and now look! Do you know what I could do if I had $60 million in my bank right now? And Google isn't the only one! Companies the world over are paying through the nose for user-generated content, and business is booming! If I'm an oil well, it's time my oil came with a price tag. I was a Reddit user for YEARS! Almost since the beginning of Reddit! I made some of the training data that Google and others are using! Where's my cut of that $60M?

Resol ,
@Resol@lemmy.world avatar

That picture will forever haunt me in my dreams.

SlothMama ,

I want a whole Lemmy subreddit ( community? ) of the AI overviews gone wild like this, it's funny af

Maven ,
@Maven@lemmy.zip avatar

You should make one. I'd sub immediately
