404media.co

StaySquared , to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

I wonder if cartoonized animals in a CSAM theme are also illegal... Guess I can contact my local FBI office and provide them with the web addresses of such content. Let them decide what is best.

BruceTwarzen , to Technology in Men Use Fake Livestream Apps With AI Audiences to Hit on Women

Good for them. Dating apps are a nightmare for dudes these days, and if someone is impressed by this, fair play.

flora_explora ,
@flora_explora@beehaw.org avatar

What? This is saying that the dudes who are already so detached from reality that they can't find any women should detach even further from reality. If you are an open-minded cis dude who respects women and sees them as equal human beings, you'll have no problem finding anyone.

onlinepersona ,

Well that's definitely an... interpretation of what @BruceTwarzen wrote.

Anti Commercial-AI license

JackbyDev ,

Why do you label the link "Anti Commercial-AI license" instead of "CC BY-NC-SA 4.0" as it is titled?

onlinepersona ,

Because people keep asking what the license is for.

Anti Commercial-AI license

flora_explora ,
@flora_explora@beehaw.org avatar

It seems very annoying to me when cishet dudes whine about how hard they have it. It might be true, but the problem is usually that they've been brought up with a misogynistic worldview and hegemonic masculinity. That's what I referred to by calling them detached from reality.

It is like a narcissistic person telling you how hard their life is while abusing you. You can empathize with them because they sure have a hard life. But as long as they're not self-aware and don't reflect on their actions, I won't have much empathy for them. Same goes for cis men.

onlinepersona ,

Wow... you are way too deep into whatever it is you're into and are currently unable to see the shades of gray.

Anti Commercial-AI license

flora_explora ,
@flora_explora@beehaw.org avatar

Nope, I disagree. I see shades of gray and have a few friends who happen to be cishet guys. But I know very few cis men who are not bigots, feel entitled, treat women like inferiors, etc.

But, do you agree with the original statement I replied to?

exocrinous ,

Narcissistic Personality Disorder is a lifelong disability with no cure. You can be a nice person with NPD. You can be a wise person with NPD. You can even be a healthy person with NPD, because disorders and illnesses are two different things. Having NPD is like having type 1 diabetes: you can live a normal life, but it's still going to take constant attention to treat, and some stuff is always going to be just a bit harder for you. The myth that people with NPD are abusers is pseudoscientific bigotry. There are plenty of self-aware people with NPD and there are plenty of non-abusers with NPD. People with NPD are more likely to be victims of abuse than abusers, but the kind of people who prey on the disabled to satisfy their own desire for cruelty don't want you to believe that.

flora_explora ,
@flora_explora@beehaw.org avatar

Oh hello, you again! Sorry, I won't discuss this topic with you any further. There is nothing new to be said, and you never replied to any of the scientific studies I gave you showing why a high percentage of pwNPD tend to abuse others.

Emperor ,
@Emperor@feddit.uk avatar

If you are an open-minded cis dude who respects women and sees them as equal human beings you’ll have no problem finding anyone.

It's not always that simple. For example, I cared for my Dad 24/7 which involved a convoluted pill regime (and a series of alarms throughout the day). My social life took a real hit. There are also mental and physical health issues, as well as financial aspects.

All that said, anyone thinking this is the solution deserves to be scammed because it is hardly informed consent.

flora_explora ,
@flora_explora@beehaw.org avatar

OK, let me rephrase this as "if you are ... you will be as likely as cishet women to find someone to date". My point was that cishet men may have a hard time finding someone because they are not catching up with progressive and emancipatory values. There are many, many heteropessimistic or otherwise frustrated women out there searching for a guy who does not treat them like shit.

But sure, if you don't have the capacity for a social life or for dating then obviously this won't be as easy. My comment was a response to the premise that cishet guys have it harder in dating and that they should be allowed to scam people.

Emperor ,
@Emperor@feddit.uk avatar

My point was that cishet men may have it hard to find someone because they are not catching up with progressive and emancipatory values.

It's worse than that - things seem to be regressing, with a widening political divide between men and women, especially noticeable in the younger adults.

It definitely feels like the modest progress that was made is now being eroded away.

flora_explora , (edited )
@flora_explora@beehaw.org avatar

Yeah, it's pretty disheartening and even frightening. I don't know how to educate men on feminist ideas and get them on board. And being antifeminist doesn't even benefit them that much. There are so many men living miserable lives and ending up in jail because of toxic masculinity and societal expectations of men. And I'm certainly interested in helping cis men get better, reconnect with their emotions, and learn about emancipation. But at the same time I don't see how people who are not cis men can do so much to really help them. We are pretty busy surviving them and supporting each other.

Emperor ,
@Emperor@feddit.uk avatar

And being antifeminist doesn’t even benefit them that much.

The only people benefitting are those peddling the lies to disgruntled young men, partly as a grift and partly as misdirection from the real sources of their issues.

But at the same time I don’t see how people who are not cis men can do so much to really help them.

And it shouldn't be your job to fix young men, but I am as stumped as you, and I worry about people like my nephew, who is in his early teens.

There's !mensliberation but I don't know if that's not just preaching to the converted.

flora_explora ,
@flora_explora@beehaw.org avatar

Yes, it is really a frustrating situation. Since you seem to be a man, maybe you can be a good example to your nephew? But well, not so easy either unfortunately :(

gedaliyah , to World News in Russia has forked Wikipedia, featuring "better truths"
@gedaliyah@lemmy.world avatar

Why didn't they just start with Metapedia? It would have been a shorter walk.

Naich , to World News in Russia has forked Wikipedia, featuring "better truths"
@Naich@lemmings.world avatar

Should have forked Conservapedia - it's already fucking stupid to start with.

billiam0202 ,

You don't fork what's yours to start with.

Gaywallet , to Technology in Instagram Advertises Nonconsensual AI Nude Apps
@Gaywallet@beehaw.org avatar

I can't help but wonder how in the long term deep fakes are going to change society. I've seen this article making the rounds on other social media, and there's inevitably some dude who shows up who makes the claim that this will make nudes more acceptable because there will be no way to know if a nude is deep faked or not. It's sadly a rather privileged take from someone who suffers from no possible consequences of nude photos of themselves on the internet, but I do think in the long run (20+ years) they might be right. Unfortunately between now and some ephemeral then, many women, POC, and other folks will get fired, harassed, blackmailed and otherwise hurt by people using tools like these to make fake nude images of them.

But it does also make me think a lot about fake news and AI and how we've increasingly been interacting in a world in which "real" things are just harder to find. Want to search for someone's actual opinion on something? Too bad, for-profit companies don't want that, and instead you're gonna get an AI-generated website spun up by a fake alias which offers a "best of" list where their product is the first option. Want to understand an issue better? Too bad, politics is throwing money left and right on news platforms and using AI to write biased articles to poison the well with information meant to emotionally charge you to their side. Pretty soon you're going to have no idea whether pictures or videos of things that happened really happened, and inevitably some of those will be viral marketing or other forms of coercion.

It's kind of hard to see all these misuses of information and technology, especially ones like this which are clearly malicious in nature, and the complete inaction of government and corporations to regulate or stop this and not wonder how much worse it needs to get before people bother to take action.

tim-clark ,
@tim-clark@kbin.social avatar

Flat earthers on the rise. I can only trust what i see with my eyes, the earth is flat!

How will this affect the courts? How can evidence be trusted?

Gaywallet ,
@Gaywallet@beehaw.org avatar

what

godzilla_lives OP ,
@godzilla_lives@beehaw.org avatar

I believe Tim means to say that the spread of misinformation can be linked to the rise of Flat Earthers. That if we can only trust what we see before us, and we see a flat horizon, we can directly interpret this visual to mean that the Earth is flat. Thus, if we cannot trust our own eyes and ears, how can future courtroom evidence be trusted?

"Up to the Twentieth Century, reality was everything humans could touch, smell, see, and hear. Since the initial publication of the chart of the electromagnetic spectrum, humans have learned that what they can touch, smell, see, and hear is less than one-millionth of reality." -Bucky Fuller

^ basically that

tim-clark ,
@tim-clark@kbin.social avatar

Exactly

some_guy ,

Bucky Fuller

Now I know the source of that sample in the Incubus song New Skin. I've been curious about that for two decades. Thanks!

DaseinPickle , to Reddit in AI Is Poisoning Reddit to Promote Products and Game Google With 'Parasite SEO'

How do we keep the bots out of Lemmy? Eventually all the AI rot will spread to the fediverse.

Max_P ,
@Max_P@lemmy.max-p.me avatar

Defederate/ban them, and defederate instances that don't adequately stop them.

DaseinPickle ,

That might work for now, but AI bots can easily create content faster than human moderators can go through it. We will need mechanisms to prove humanness.

henfredemars ,

Another strategy might be to demand a minimum content quality. Whether trash comes from humans or bots, it’s still trash.

agressivelyPassive ,

Realistically, you don't. You can't really.

The only way would be to verify each account somehow.

DaseinPickle ,

Yea, it will be a hard problem to solve. There are no good solutions right now. Hopefully someone with the creativity and skill will come up with something.

50MYT ,

It becomes a pain vs gain problem. How hard do you make it, while balancing the inconvenience?

You could easily force users to enter a one-time code via email every 3 months (or more or less often). This would be hard to automate, and even more so if you changed it up.
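
A minimal sketch of how that could work, assuming a server-side store of pending codes; the 15-minute expiry, the 6-digit format, and the function names are all illustrative choices, not anyone's actual implementation:

```python
import hmac
import secrets
import time

CODE_TTL = 15 * 60  # codes expire after 15 minutes (arbitrary choice)

def issue_code(pending: dict, email: str) -> str:
    """Generate a short one-time code and remember it with a timestamp."""
    code = f"{secrets.randbelow(10**6):06d}"  # random 6-digit code
    pending[email] = (code, time.time())
    return code  # in practice this would be emailed, not returned

def verify_code(pending: dict, email: str, submitted: str) -> bool:
    """Check a submitted code; each code is single-use and time-limited."""
    entry = pending.pop(email, None)  # pop makes the code single-use
    if entry is None:
        return False
    code, issued = entry
    if time.time() - issued > CODE_TTL:
        return False
    return hmac.compare_digest(code, submitted)  # constant-time compare

pending = {}
code = issue_code(pending, "user@example.org")
assert verify_code(pending, "user@example.org", code)      # correct code passes
assert not verify_code(pending, "user@example.org", code)  # reuse fails
```

The single-use and expiry checks are what make this annoying for bots at scale: each account needs a live, reachable mailbox every cycle.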

DaseinPickle ,

Or maybe enforce login with a hardware token like Yubikey or Nitrokey. I don’t think you could automate that(?)
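
For what it's worth, the hardware-token idea usually boils down to challenge-response: the server sends a fresh random challenge, the key computes an HMAC over it with a secret that never leaves the device, and the server checks the result (YubiKeys expose an HMAC challenge-response mode along these lines). A rough sketch with the token's side simulated in software; the function names are illustrative, and real tokens use dedicated hardware rather than a Python function:

```python
import hashlib
import hmac
import secrets

def make_challenge() -> bytes:
    """Server picks a fresh random challenge for each login attempt."""
    return secrets.token_bytes(32)

def token_response(device_secret: bytes, challenge: bytes) -> bytes:
    """What the hardware token computes internally; the secret never leaves it."""
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

def server_verify(stored_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Server recomputes the HMAC and compares in constant time."""
    expected = hmac.new(stored_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = secrets.token_bytes(32)  # provisioned on the token and stored server-side
ch = make_challenge()
assert server_verify(secret, ch, token_response(secret, ch))
# Replaying an old response against a new challenge fails:
assert not server_verify(secret, make_challenge(), token_response(secret, ch))
```

Because each challenge is fresh, a bot can't record one valid response and replay it; it would need physical access to the token for every login.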

50MYT ,

Yeah, but how do you get it to someone?

Pain vs gain.

henfredemars ,

Are AI bots really the true problem, or is the problem product promotion? Humans can do that too.

DaseinPickle ,

Yea, but AI bots do it at an unprecedented scale. The price of generating bullshit is practically zero. There is already cheap AI content everywhere, and people are already getting tired of it. It's true that human actors have done something similar for a long time, but now AI makes the process so easy that AI junk will be everywhere. Look at Google search results: they're already filled with blogspam generated by AI. Blogspam was a problem before, but now it's insufferable. Cheap Midjourney "art" is showing up in search results; even when you search for real classic artists, you get these ugly AI copies. Generative AI is the ultimate bullshit and spam machine.

possiblylinux127 ,
@possiblylinux127@lemmy.zip avatar

We are a community. The community will find a way as we can do anything. We have faith of the heart and everything.

DaseinPickle ,

That sounds more like a prayer than a plan.

KingThrillgore , to Technology in AI Is Poisoning Reddit to Promote Products and Game Google With 'Parasite SEO'
@KingThrillgore@lemmy.ml avatar

Generative AI has really become a poison. It'll be worse once the generative AI is trained on its own output.

Simon ,

Here's my prediction. Over the next couple of decades, the internet is going to be so saturated with fake shit and fake people that it'll become impossible to use effectively, like cable television. After this happens for a while, someone is going to create a fast private internet, like a whole new protocol, and it's going to require ID verification (fortunately automated by AI) to use. Your name, age, country, and state are all public to everybody else and embedded into the protocol.

The new 'humans only' internet will be the new streaming and eventually it'll take over the web (until they eventually figure out how to ruin that too). In the meantime, they'll continue to exploit the infested hellscape internet because everybody's grandma and grampa are still on it.

Daxtron2 , to Technology in AI Is Poisoning Reddit to Promote Products and Game Google With 'Parasite SEO'

Bots have been doing this on Reddit forever, it's a lot more noticeable now that most of the good users who generated content have left.

dumples ,
@dumples@kbin.social avatar

I remember when I first started noticing bots: back in 2016. They have been there for a while now. It's the advertisers' turn now, though. We're at the end stages.

uriel238 , to Technology in Instagram Advertises Nonconsensual AI Nude Apps
@uriel238@lemmy.blahaj.zone avatar

It remains fascinating to me how these apps are being responded to in society. I'd assume part of the point of seeing someone naked is to know what their bits look like, while these just extrapolate with averages (and likely, averages of glamor models). So we still don't know what these people actually look like naked.

And yet, people are still scorned and offended as if they were.

Technology is breaking our society, albeit in place where our culture was vulnerable to being broken.

echodot ,

I suspect it's more affecting for younger people who don't really think about the fact that in reality, no one has seen them naked. Probably traumatizing for them and logic doesn't really apply in this situation.

Grandwolf319 ,

I think half the people who are offended don’t get this.

The other half think that it’s enough to cause hate.

Both arguments rely on enough people being stupid.

postmateDumbass ,

How dare that other person i don't know and will never meet gain sexual stimulation!

inb4_FoundTheVegan ,
@inb4_FoundTheVegan@lemmy.world avatar

My body is not inherently for your sexual stimulation. Downloading my picture does not give you the right to turn it into porn.

CaptainEffort ,
@CaptainEffort@sh.itjust.works avatar

Did you miss what this post is about? In this scenario it’s literally not your body.

inb4_FoundTheVegan ,
@inb4_FoundTheVegan@lemmy.world avatar

There is nothing stopping anyone from using it on my body. Seriously, get a fucking grip.

inb4_FoundTheVegan ,
@inb4_FoundTheVegan@lemmy.world avatar

Wtf are you even talking about? People should have the right to control whether they are "approximated" as nude. You can wax poetic about how it's not necessarily accurate, but that's because you are ignoring the woman who did not consent to the process. Like, if I posted a nude, then that's on the internet forever. But now, any picture at all can be made nude and posted to the internet forever. You're entirely removing consent from the equation, you ass.

Drewelite ,

Totally get your frustration, but people have been imagining, drawing, and photoshopping people naked since forever. To me the problem is if they try and pass it off as real. If someone can draw photorealistic pieces and drew someone naked, we wouldn't have the same reaction, right?

inb4_FoundTheVegan ,
@inb4_FoundTheVegan@lemmy.world avatar

It takes years of practice to draw photorealism, and days if not weeks to draw a particular piece. That is absolutely not the same as any jackass with a net connection and 5 minutes creating an equally or more realistic version.

It's really upsetting that this argument keeps getting brought up. Because while guys are being philosophical about how it's theoretically the same thing, women are experiencing real-world harm and harassment from these services. Women get fired for having nudes; girls are being blackmailed and bullied with this shit.

But since it's theoretically always been possible, somehow churning through any woman you find on Instagram isn't an issue.

Totally get your frustration

Do you? Since you aren't threatened by this, yet another way for women to be harassed is just a fun little thought experiment.

Drewelite ,

Well that's exactly the point from my perspective. It's really shitty, at this stage of the technology, that people are falling victim to this. So I really understand people's knee-jerk reaction to throw on the brakes. But then we'll stay here, where women are being harassed and bullied with this kind of technology. The only paths forward, theoretically, are to remove it altogether or to make it ubiquitous background noise. Removing it altogether, in my opinion, is practically impossible.

So my point is that a picture from an unverified source can never be taken as truth. But we're in a weird place technologically, where unfortunately it is. I think we're finally reaching a point where we can break free of that. If someone sends me a nude with my face on it like, "Is this you?!!". I'll send them one back with their face like, "Is tHiS YoU?!??!".

We'll be in a place where we as a society cannot function if we take everything we see on the internet as truth. Not only does this potentially solve the AI nude problem, it can solve actual nude leaks / revenge porn, other forms of cyberbullying, and mass distribution of misinformation as a whole. The internet hasn't been a reliable source of information since its inception. The problem is that, up until now, it's been just plausible enough that the gullible fall into believing it.

nandeEbisu ,

Regardless of what one might think should happen or expect to happen, the actual psychological effect is harmful to the victim. It's like if you walked up to someone and said "I'm imagining you naked" that's still harassment and off-putting to the person, but the image apps have been shown to have much much more severe effects.

It's like the demonstration where they get someone to feel like a rubber hand is theirs, then hit it with a hammer. It's still a negative sensation even if it's not a strictly logical one.

UnderpantsWeevil ,
@UnderpantsWeevil@lemmy.world avatar

So we still dont know what these people actually look like naked.

I think the offense is in the use of their facial likeness far more than their body.

If you took a naked super-sized barbie doll and plastered Taylor Swift's face on it, then presented it to an audience for the purpose of jerking off, the argument "that's not what Taylor's tits look like!" wouldn't save you.

Technology is breaking our society

Unregulated advertisement combined with a clickbait model for online marketing is fueling this deluge of creepy shit. This isn't simply a "Computers Evil!" situation. It's much more that a handful of bad actors are running Silicon Valley into the ground.

uriel238 ,
@uriel238@lemmy.blahaj.zone avatar

Not so much computers evil! as just acknowledging there will always be malicious actors who will find clever ways to use technology to cause harm. And yes, there's a gathering of folk on 4Chan/b who nudify (denudify?) submitted pictures, usually of people they know, which, thanks to the process, puts them out on the internet. So this is already a problem.

Think of Murphy's Law as it applies to product stress testing. Eventually, some customer is going to come in having broken the part you thought couldn't be broken. Also, our vast capitalist society is fueled by people figuring out exploits in the system that haven't been patched or criminalized (see the subprime mortgage crisis of 2008). So we have people actively looking to utilize technology in weird ways to monetize it. That folds neatly, like paired gears, into looking at how tech can cause harm.

As for people's faces, one of the problems of facial recognition as a security tool (say, when used by law enforcement to track perps) is the high number of false positives. It turns out we look a whole lot like each other. Though your doppelganger may be in another state and ten inches taller / shorter. In fact, an old (legal!) way of getting explicit shots of celebrities in the late 20th century was to find a look-alike and get them to pose for a song.

As for famous people, fake nudes have been a thing for a while, courtesy of Photoshop or some other digital photo-editing set combined with vast libraries of people. Deepfakes have been around since the late 2010s. So even if generative AI wasn't there (which is still not great for video in motion) there are resources for fabricating content, either explicit or evidence of high crimes and misdemeanors.

This is why we are terrified of AI getting out of hand, not because our experts don't know what they're doing, but because the companies are very motivated to be the first to get it done, and that means making the kinds of mistakes that cause pipeline leakage on sacred Potawatomi tribal land.

UnderpantsWeevil ,
@UnderpantsWeevil@lemmy.world avatar

This is why we are terrified of AI getting out of hand

I mean, I'm increasingly of the opinion that AI is smoke and mirrors. It doesn't work, and it isn't going to cause some kind of Great Replacement any more than a 1970s Automat could eliminate the restaurant industry.

It's less the computers themselves and more the fear surrounding them that seems to keep people in line.

uriel238 ,
@uriel238@lemmy.blahaj.zone avatar

The current presumption that generative AI will replace workers is smoke and mirrors, though the response by upper management does show the degree to which they would love to replace their human workforce with machines, or replace their skilled workforce with menial laborers doing simpler (though more tedious) tasks.

If this is regarded as them tipping their hands, we might get regulations that serve the workers of those industries. If we're lucky.

In the meantime, the pursuit of AGI is ongoing, and the LLMs and generative AI projects serve to show some of the tools we have.

It's not even that we'll necessarily know when it happens. It's not like we can detect consciousness (or are even sure what consciousness / self awareness / sentience is). At some point, if we're not careful, we'll make a machine that can deceive and outthink its developers and has the capacity of hostility and aggression.

There's also the scenario (suggested by Randall Munroe) that some ambitious oligarch or plutocrat gains control of a system that can manage an army of autonomous killer robots. Normally such people have to contend with a principal cabinet of people who don't always agree with them. (Hitler and Stalin both had to argue with their generals.) An AI can proceed with a plan undisturbed by its inhumane implications.

UnderpantsWeevil ,
@UnderpantsWeevil@lemmy.world avatar

I can see how increased integration and automation of various systems consolidates power in fewer and fewer hands. For instance, the ability of Columbia administrators to rapidly identify and deactivate student ID cards and lock hundreds of protesters out of their dorms with the flip of a switch was really eye-opening. That would have been far more difficult to do 20 years ago, when I was in school.

But that's not an AGI issue. That's an "everyone's ability to interact with their environment now requires authentication via a central data hub" issue. And it's illusory. Yes, you're electronically locked out of your dorm, but it doesn't take a lot of savvy to pop through a door that's been propped open with a brick by a friend.

There’s also the scenario (suggested by Randall Munroe) that some ambitious oligarch or plutocrat gains control of a system that can manage an army of autonomous killer robots.

I think this fear heavily underweights how much human labor goes into building, maintaining, and repairing autonomous killer robots. The idea that a singular megalomaniac could command an entire complex system - hell, that the commander could even comprehend the system they intended to hijack - presumes a kind of Evil Genius Leader that never seems to show up IRL.

Meanwhile, there's no shortage of bloodthirsty savages running around Ukraine, Gaza, and Sudan, butchering civilians and blowing up homes with sadistic glee. You don't need a computer to demonstrate inhumanity towards other people. If anything, its our human-ness that makes this kind of senseless violence possible. Only deep ethnic animus gives you the impulse to diligently march around butchering pregnant women and toddlers, in a region that's gripped by famine and caught in a deadly heat wave.

Would that all the killing machines were run by some giant calculator, rather than a motley assortment of sickos and freaks who consider sadism a fringe benefit of the occupation.

iquanyin ,
@iquanyin@lemmy.world avatar

hmmm. i'm not sure we will be able to give emotion to something that has no needs, no living body, and doesn't die. maybe. but it seems to me that emotions are survival tools that develop as beings and their environment develop, in order to keep a species alive. i could be wrong.

Nobody , to Technology in Instagram Advertises Nonconsensual AI Nude Apps

It’s all so incredibly gross. Using “AI” to undress someone you know is extremely fucked up. Please don’t do that.

MxM111 ,
@MxM111@kbin.social avatar

Can you articulate why, if it is for private consumption?

KidnappedByKitties ,

Consent.

You might be fine with having erotic materials made of your likeness, and maybe even of your partners, parents, and children. But shouldn't they have the right not to be objectified as wank material?

I partly agree with you though; it's interesting that making an image is so much more troubling than having a fantasy of them. My thinking is that it is external and real, and thus more permanent: even if it isn't meant to be saved, it can be lost, hacked, sold, used for defamation, and/or just shared.

InternetPerson ,

To add to this:

Imagine someone would sneak into your home and steal your shoes, socks and underwear just to get off on that or give it to someone who does.

Wouldn't that feel wrong? Wouldn't you feel violated? It's the same with such AI porn tools. You serve to satisfy the sexual desires of someone else and you are given no choice. Whether you want it or not, you are becoming part of their act. Becoming an unwilling participant in such a way can feel similarly violating.

They are painting and using a picture of you, which is not as you would like to represent yourself. You don't have control over this and thus, feel violated.

This reminds me of that fetish where one person acts like a submissive pet and gets treated like one by their "master". They get aroused by doing that in public, one walking the other on a leash like a dog, on hands and knees.
People around them become passive participants in that spectacle, and those often feel violated. Becoming, unwillingly and unasked, a participant, either active or passive, in someone else's sexual act, and having little or no control over it, feels wrong and violating to a lot of people.
In principle that even shares some similarities to rape.

There are countries where you can't just take pictures of someone without asking them beforehand. Also there are certain rules on how such a picture can be used. Those countries acknowledge and protect the individual's right to their image.

scarilog ,

Just to play devils advocate here, in both of these scenarios:

Imagine someone would sneak into your home and steal your shoes, socks and underwear just to get off on that or give it to someone who does.

This reminds me of that fetish, where one person is basically acting like a submissive pet and gets treated like one by their "master". They get aroused by doing that in public, one walking with the other on a leash like a dog on hands and knees. People around them become passive participants of that spectacle. And those often feel violated.

The person has the knowledge that this is going on. In the situation with AI nudes, the actual person may never find out.

Again, not to defend this at all, I think it's creepy af. But I don't think your arguments were particularly strong in supporting the AI nudes issue.

CleoTheWizard ,
@CleoTheWizard@lemmy.world avatar

In every chat I find about this, I see people railing against AI tools like this but I have yet to hear an argument that makes much sense to me about it. I don’t care much either way but I want a grounded position.

I care about harms to people and in general, people should be free to do what they want until it begins harming someone. And then we get to have a nuanced conversation about it.

I’ve come up with a hypothetical. Let’s say that you write naughty stuff about someone in your diary. The diary is kept in a secure place and in private. Then, a burglar breaks in and steals your diary and mails that page to whomever you wrote it about. Are you, the writer, in the wrong?

My argument would be no. You are expressing a desire in private and only through the malice of someone else was the harm done. And no, being “creepy” isn’t an argument either. The consent thing I can maybe see but again do you have a right not to be fantasized about? Not to be written about in private?

I’m interested in people’s thoughts, because it bugs me not to have a good answer to this argument.

Resonosity ,

Yeah it's an interesting problem.

If we go down the path of ideas in the mind and the representations we create and visualize in our mind's eye, to forbid people from conceiving of others sexually means there really is no justification for conceiving of people generally.

If we try to seek for a justification, where is that line drawn? What is sexual, and what is general? How do we enforce this, or at least how do we catch people in the act and shame them into stopping their behavior, especially if we don't possess the capability of telepathy?

What is harm? Is it purely physical, or also psychological? Is there a degree of harm that should be allowed, or that is inescapable despite our best intentions?

The angle that you point out regarding writing things down about people in private can also go different ways. I write things down about my friends because my memory sucks sometimes and I like to keep info in my back pocket for when birthdays, holidays, or special occasions come. What if I collected information about people that I don't know? What if I studied academics who died in the past to learn about their lives, like Ben Franklin? What if I investigated my neighbors by pointing cameras at their houses, or installing network sniffers or other devices to try to collect information on them? Does the degree of familiarity with those people I collect information about matter, or is the act wrong in and of itself? And do my intentions justify my actions, or do the consequences of said actions justify them?

Obviously I think it's a good thing that we as a society try to discourage collecting information on people who don't want that information collected, but there is a portion of our society specifically allowed to do this: the state. What makes their status deserving of this power? Can this power be used for ill and good purposes? Is there a level of cross collection that can promote trust and collaboration between the state and its public, or even amongst the public itself? I would say that there is a level where if someone or some group knows enough about me, it gets creepy.

Anyways, lots of questions and no real answers! I'd be interested in learning more about this subject, and I apologize if I steered the convo away from sexual harassment and violation. Consent extends to all parts of our lives, but sexual consent does seem to be a bigger problem given the evidence of this post. Looking forward to learning more!

CleoTheWizard ,
@CleoTheWizard@lemmy.world avatar

I think we’ve just stumbled on an issue where the rubber meets the road as far as our philosophies about privacy and consent. I view consent as important mostly in areas that pertain to bodily autonomy right? So we give people the rights to use our likeness for profit or promotion or distribution. And what we’re giving people is a mental permission slip to utilize the idea of the body or the body itself for specific purposes.

However, I don’t think these things really pertain to private matters, because the consent issue only applies when there are potential effects on the other person. For example, if I say that imagining a celebrity sexually does no damage because you don’t actually know them, I think most people would agree. And if what we care about is harm, then there is no potential for harm.

With surveillance matters, the consent does matter because we view breaching privacy as potential harm. The reason it doesn’t apply to AI nudes is that privacy is not being breached. The photos aren’t real. So it’s just a fantasy of a breach of privacy.

So for instance if you do know the person and involve them sexually without their consent, that’s blatantly wrong. But if you imagine them, that doesn’t involve them at all. Is it wrong to create material imaginations of someone sexually? I’d argue it’s only wrong if there is potential for harm and since the tech is already here, I actually view that potential for harm as decreasing in a way. The same is true nonsexually. Is it wrong to deepfake friends into viral videos and post them on twitter? Can be. Depends. But do it in private? I don’t see an issue.

The problem I see is the public stuff. People sharing it. And it’s already too late to stop most of the private stuff. Instead we should focus on stopping AI porn from being shared and posted and create higher punishments for ANYONE who does so. The impact of fake nudes and real nudes is very similar, so just take them similarly seriously.

KidnappedByKitties ,

What I find interesting is that for me personally, writing the fantasy down (rather than referring to it) is against the norm, a.k.a. weird, but not wrong.

Painting a painting of it is weird and iffy, hanging it in your home is not ok.

It's strange how it changes along that progression, but I can't rightly say why.

Max_P , to Technology in Instagram Advertises Nonconsensual AI Nude Apps
@Max_P@lemmy.max-p.me avatar

Seen similar stuff on TikTok.

That's the big problem with ad marketplaces and automation: the ads are rarely vetted by a human. You can just give them money and upload your ad, and they'll happily display it. They rely entirely on users to report ads, which most people don't do because they're ads, and they won't take one down unless it's really bad.

alyth ,

The user reports are reviewed by the same model that screened the ad up-front so it does jack shit

Max_P ,
@Max_P@lemmy.max-p.me avatar

Actually, a good 99% of my reports end up in the video being taken down. Whether it's because of mass reports or whether they actually review it is unclear.

What's weird is the algorithm still seems to register that as engagement, so lately I've been reporting 20+ videos a day because it keeps showing them to me on my FYP. It's wild.

space ,

That's a clever way of getting people to work for them as moderators.

menemen , to Technology in Amazon Turkers Who Train AI Say They’re Locked Out of Their Work and Money
@menemen@lemmy.world avatar

Being a Turk myself: what do I have to do, to get my money?

After looking into this: is this name racist?

BombOmOm , to Technology in Amazon Turkers Who Train AI Say They’re Locked Out of Their Work and Money
@BombOmOm@lemmy.world avatar

Amazon Turk pays like trash too. Though, if you have a favorable currency conversion ratio, might not be bad for you.

Looked into it a few years ago to see if I could use it for some extra spending money when I was bored. Wasn't worth it.

Evil_incarnate ,

Yeah, I played with it a couple of times to see what it was like. I now have about eighty cents in US currency sitting in my PayPal. Woo.

eveninghere , to Technology in Tumblr and Wordpress to Sell Users’ Data to Train AI Tools

It should be illegal for a company to own user-generated content. They should at least pay the users.

FaceDeer ,
@FaceDeer@kbin.social avatar

They're giving you services in exchange for your content.

Does nobody even think about the TOS any more? You don't have to read any specific one, just realize the basic universal truth that no website is going to accept your content without some kind of legal protection that allows them to use that content.

garrett ,
@garrett@infosec.pub avatar

You pay for WordPress.com though. That’s crazy to offer a paid service and use that data in AI training.

FaceDeer ,
@FaceDeer@kbin.social avatar

Hardly. They earn money by being paid by their users, but they can earn more money by being paid by their users and also selling their users' data. The goal is more money, so it makes sense for them to do that. It's not crazy.

From the WordPress Terms of Service:

License. By uploading or sharing Content, you grant us a worldwide, royalty-free, transferable, sub-licensable, and non-exclusive license to use, reproduce, modify, distribute, adapt, publicly display, and publish the Content solely for the purpose of providing and improving our products and Services and promoting your website. This license also allows us to make any publicly-posted Content available to select third parties (through Firehose, for example) so that these third parties can analyze and distribute (but not publicly display) the Content through their services.

Emphasis added. They told you what they could do with the content you gave them, you just didn't listen.

I'm sorry if I'm coming across harsh here, but I'm seeing this same error being made over and over again. It's being made frequently right now thanks to the big shakeups happening in social media and the sudden rise of AI, but I've seen it sporadically over the decades that I've been online. So it bears driving home:

  • If you are about to give your content to a website, check their terms of service before you do to see if you're willing to agree to their terms, and if you don't agree to their terms then don't give your content to a website. It's true that some ToS clauses may not be legally enforceable, but are you willing to fight that in court? If you didn't consider your content valuable enough to spend the time checking the ToS when you posted it, that's not WordPress's fault.
  • If you give someone something and they later find a way to make the thing you gave them valuable, it's too late. You gave it to them. They don't owe you a "cut." Check the terms of service.
garrett ,
@garrett@infosec.pub avatar

While you’re not wrong, the social contract we’ve adapted to is that paying means you have some sense of ownership. It’s unreasonable to expect folks to read every Terms of Service with their legalese. Perhaps the new reality we need to accept is that there is no such thing as a good actor on the internet.

FaceDeer ,
@FaceDeer@kbin.social avatar

Well, a large part of my frustration stems from the "I've seen this for decades" part - longer than many of the people who are now raising a ruckus have been alive. So IMO it's always been this way and the "social contract we've adapted to" is "the social contract that we imagined existed despite there being ample evidence there was no such thing." I'm so tired of the surprised-pikachu reactions.

Combined with the selfish "wait a minute, the stuff I gave away for fun is worth money to someone else now? I want money too! Or I'm going to destroy my stuff so that nobody gets any value out of it!" reactions, I find myself bizarrely ambivalent and not exactly on the side of the common man vs. the big evil corporations this time.

garrett ,
@garrett@infosec.pub avatar

I don’t really disagree with you at all but repeatedly reminding us all that you’re “not surprised” isn’t the savvy commentary you think it is. Especially since it’s historically been the case that any service you pay money to has said “no, you own your content”.

The marker has just moved gradually on this with companies slowly adding more ownership clauses to their Terms of Service in ways that aren’t legible to average consumers. Now they’re cashing in on that ownership.

FaceDeer ,
@FaceDeer@kbin.social avatar

I'm just venting, really. I know it's not going to make a real difference.

I suppose if you go waaaay back it was different, true. Back in the days of Usenet (as a discussion forum rather than as the piracy filesharing system it's mostly used for nowadays) there weren't these sorts of ToS on it and everything got freely archived in numerous different places because that's just how it was. It was the first Fediverse, I suppose.

The ironic thing is that kbin.social's ToS has no "ownership" stuff in it either. For now, at least, the new ActivityPub-based Fediverse is in the same position that Usenet was - I assume a lot of the other instances also don't bother with much of a ToS and the posts get shared around beyond any one instance's control anyway. So maybe this grumpy old-timer may get to see a bit of the good old days return, for a little while. That'll be nice.

N0x0n ,

Just chiming in, sorry for my bad English.

Your comments are filling me with sadness and despair. You must be the kind of person who warned about all this years ago, and most people just laughed or called you some creepy tin-foil-hat conspiracist. :/

The internet is changing very fast, and not for the better. It's somehow comparable to what's going on everywhere in the world... greedy oppressors who only care about themselves, while millions of people are suffering...

It feels somehow like we have already lost...

FaceDeer ,
@FaceDeer@kbin.social avatar

If it makes you feel better, the thing that annoys me most is not so much that this is happening but more how everybody is suddenly surprised by it and complaining about it. The data-harvesting itself doesn't really harm anyone.

luciole , to Free and Open Source Software in This Guy Has Built an Open Source Search Engine as an Alternative to Google in His Spare Time
@luciole@beehaw.org avatar

For anyone wondering about how they'll eventually address financial sustainability if Stract takes off:

Stract is currently not monetized in any way, but its website says it will eventually have contextual ads tied to specific search terms but that it will not track its users, which is similar to the system DuckDuckGo uses. Stract also plans on offering ad-free searches to paying subscribers.

I'd pay for independent, non-meta, ad-free search. I bet a more straightforward approach is more energy efficient as well. Meanwhile, big tech is running a gazillion processes on our data to suck every bit of wealth they can out of our existence through their "free" (in its narrowest sense) products.

Corgana ,
@Corgana@startrek.website avatar

I’d pay for independent, non meta, ad-free search.

You might, but not enough people would to make it sustainable. Neeva was really well loved but couldn't make the math work.
