dgmib,

Most large corporations’ tech leaders don’t actually understand how the technology works. They are being told that if they don’t have an AI plan, their company will be made obsolete by competitors that do — often by AI “experts” who also don’t have the slightest understanding of how LLMs actually work. And without that understanding, companies are rushing to use AI to solve problems that AI can’t solve.

AI is not smart, it’s not magic, it can’t “think”, and it can’t “reason” (despite what OpenAI’s marketing claims). It’s just math that measures how well something fits the pattern of the examples it was trained on. Generative AIs like ChatGPT work by considering every possible word that could come next and ranking the candidates by how well each one matches the pattern.
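That ranking step can be sketched in a few lines. This is a toy illustration, not a real model: the pattern-match scores (logits) below are made up, and a real LLM computes them with a huge neural network. But the final step is the same idea — score every candidate next word, convert the scores to probabilities, and rank them.

```python
import math

def softmax(scores):
    """Convert raw pattern-match scores into probabilities that sum to 1."""
    m = max(scores.values())  # subtract the max for numerical stability
    exps = {word: math.exp(s - m) for word, s in scores.items()}
    total = sum(exps.values())
    return {word: e / total for word, e in exps.items()}

# Hypothetical scores for the prompt "The cat sat on the ..."
# (invented numbers for illustration only)
logits = {"mat": 4.0, "roof": 2.5, "moon": 0.5, "carburetor": -2.0}

probs = softmax(logits)
best = max(probs, key=probs.get)  # the best-matching next word: "mat"
```

Nothing in this process understands cats or mats; “mat” wins only because that word best fits the statistical pattern of the training examples.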

If the input doesn’t resemble a pattern it was trained on, the best-ranked response might be complete nonsense. ChatGPT was trained on enough examples that, for almost anything you ask, something similar was probably in its training dataset — so it seems smarter than it is. But at the end of the day, it’s still just pattern matching.

If a company’s AI strategy is based on the assumption that AI can do what its marketing claims, we’re going to keep seeing these kinds of humorous failures.

AI (for now, at least) can’t replace a human in any role that requires any degree of cognitive thinking skills… Of course, we might be surprised at how few jobs actually require cognitive thinking skills. Given the current AI hype wagon, CTO is apparently one of them.