Microw (@Microw@lemm.ee)

IMO these issues are mainly with the interface / how the AI summaries are presented.

The issue with incorrect answers like the glue-on-pizza one isn't "hallucination". The LLM is pulling that info from an existing webpage (The Onion). What they need to change is how that info is presented: not "one tip is to use glue", but rather "the satirical site The Onion says to use glue".

Hallucination should be combated by the fact that the AI can't show a proper source for facts it made up itself.
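
A minimal sketch of what I mean, purely hypothetical (not any actual search-engine or LLM API, and all names and URLs are placeholders): only surface a claim when it comes with a retrievable source, attribute it explicitly, and flag known satire instead of presenting it as advice.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Claim:
    text: str
    source_url: Optional[str]    # None if the model produced it without retrieval
    source_name: Optional[str]
    is_satire: bool = False      # e.g. flagged via a known-satire domain list

def present(claim: Claim) -> str:
    """Render a claim for the summary UI, or omit it if it can't be sourced."""
    if not claim.source_url:
        # No verifiable source: treat as a likely hallucination and withhold it.
        return "[No verifiable source found for this claim; omitted.]"
    label = claim.source_name or claim.source_url
    if claim.is_satire:
        # Attribute satire explicitly rather than stating it as a tip.
        return f'The satirical site {label} says: "{claim.text}" ({claim.source_url})'
    return f'According to {label}: "{claim.text}" ({claim.source_url})'

if __name__ == "__main__":
    print(present(Claim(
        text="add glue to your pizza sauce",
        source_url="https://example.com/satire-article",  # placeholder URL
        source_name="The Onion",
        is_satire=True,
    )))
    print(present(Claim(text="a made-up fact", source_url=None, source_name=None)))
```

That's the whole point: attribution in the interface, and refusal when there's nothing to attribute.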
