tatterdemalion (@tatterdemalion@programming.dev)

It literally cannot come up with novel solutions because its goal is to regurgitate the most likely response to a question based on training data from the internet. Considering that the internet is often trash and getting trashier, I think LLMs will only get worse over time.
