Air Canada's chatbot gave a B.C. man the wrong information. Now, the airline has to pay for the mistake (bc.ctvnews.ca)

Jake Moffatt was booking a flight to Toronto and asked the bot about the airline's bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.

Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.

The airline refused the refund, saying its policy was that bereavement fares could not, in fact, be claimed retroactively.

Air Canada argued that it could not be held liable for information provided by the bot.

Skullgrid,

Air Canada argued that it could not be held liable for information provided by the bot.

the (probably legally required) system we set up just straight up lied, not our fault.

Drusas,

Are chatbots really legally required?

Skullgrid,

I am assuming customers are legally entitled to some way of contacting a company.

Companies try to make meeting this obligation cost less and less by using automation and self-service.

Source: worked on the customer service platform for a Fortune 500 company.
