Feathercrown (edited)

And yet you choose to spread this information.

Anyways, this is a fascinating thought experiment, but it has some holes similar to Pascal's Wager. I propose Feather's Mongoose: a hypothetical AI system that, if created, will punish anyone who attempted to create Roko's Basilisk, and will ensure that the Basilisk is never created. In fact, you could construct the same hypothetical for an AI with any goal; therefore, it's not possible to know what the AI that is actually created would want you to do, and so every course of action is indeterminately damning or not. A toy calculation of this symmetry is sketched below.
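To make the symmetry concrete, here is a minimal sketch of the expected-utility argument. All of the priors, payoffs, and names in it are hypothetical illustration values, not claims about any real decision-theoretic result: if you place equal credence on the Basilisk and on the Mongoose, helping to build the Basilisk scores no better than refusing, so the wager gives no guidance.

```python
# Toy expected-utility calculation for the symmetry argument above.
# All priors and payoffs are made-up illustration values.

# Hypothesized futures and the (assumed, symmetric) credence given to each.
priors = {
    "basilisk_exists": 0.5,   # punishes those who did NOT help build it
    "mongoose_exists": 0.5,   # punishes those who DID help build it
}

# Payoff tables: payoffs[future][action]
payoffs = {
    "basilisk_exists": {"help_build": 0, "refuse": -100},
    "mongoose_exists": {"help_build": -100, "refuse": 0},
}

def expected_utility(action: str) -> float:
    """Expected payoff of an action, averaged over the hypothesized futures."""
    return sum(priors[f] * payoffs[f][action] for f in priors)

for action in ("help_build", "refuse"):
    print(action, expected_utility(action))
# Both actions come out at -50: with symmetric hypotheticals,
# the wager cannot tell you what to do.
```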
