PeterBronez ,
@PeterBronez@hachyderm.io

@along_the_road

“These were mostly family photos uploaded to personal and parenting blogs […] as well as stills from YouTube videos”

So… people posted photos of their kids on public websites, common crawl scraped them, LAION-5B cleaned it up for training, and now there are models. This doesn’t seem evil to me… digital commons working as intended.

If anyone is surprised, the fault lies with the UX around “private URL” sharing, not with devs using Common Crawl.

Gamers_Mate ,

I hope this causes a class action against these companies.

Fapper_McFapper ,

I’m so tired of AI.

criitz ,

Too bad, it's here forever...

remotelove ,

It's been around for a while. It's the fluff and the parlor tricks that need to die. AI has never been magic, and it's still a long way from being actually intelligent.

frog , (edited)

The other thing that needs to die is hoovering up all this data to train AIs without the consent of, or compensation to, the owners of that data. Most of the more frivolous uses of AI would disappear at that point, because they would be financially non-viable.

autotldr Bot ,

🤖 I'm a bot that provides automatic summaries for articles:


Photos of Brazilian kids—sometimes spanning their entire childhood—have been used without their consent to power AI tools, including popular image generators like Stable Diffusion, Human Rights Watch (HRW) warned on Monday.

The dataset does not contain the actual photos but includes image-text pairs derived from 5.85 billion images and captions posted online since 2008.

HRW's report warned that the removed links are "likely to be a significant undercount of the total amount of children’s personal data that exists in LAION-5B."

Han told Ars that "Common Crawl should stop scraping children’s personal data, given the privacy risks involved and the potential for new forms of misuse."

There is less risk that the Brazilian kids' photos are currently powering AI tools since "all publicly available versions of LAION-5B were taken down" in December, Tyler told Ars.

That decision came out of an "abundance of caution" after a Stanford University report "found links in the dataset pointing to illegal content on the public web," Tyler said, including 3,226 suspected instances of child sexual abuse material.


Saved 78% of original text.
