I don’t know what’s worse: the behavior of Air Canada’s chatbot, or Air Canada’s hubris.
It all started with Jake Moffatt wanting to book a flight from Vancouver to Toronto for his grandmother’s funeral. Jake didn’t know how Air Canada’s bereavement rates worked, so he did what every company now forces its customers to do: ask its online chatbot.
The Chatbot Was Wrong
Air Canada’s chatbot provided Jake with the wrong information. It encouraged him to book a flight immediately and then request a refund within 90 days. So he did: Moffatt booked his flight to Toronto, then requested a refund. Air Canada rejected his claim.
If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.
Air Canada’s chatbot
Air Canada doesn’t provide refunds for bereavement travel after a flight is booked. Moffatt spent months trying to get a refund, explaining that he had followed the chatbot’s instructions. A chatbot employed by Air Canada. He even shared a screenshot of his conversation with Air Canada, and they flat-out refused.
Air Canada claimed Moffatt should have read a link to its bereavement policy that the chatbot provided in another response, and shouldn’t have taken the chatbot’s answer at face value. Instead of a refund, Air Canada offered Moffatt a $200 coupon toward another flight.
Moffatt took them to court.
A First of Its Kind Case
Moffatt filed a small claims complaint with the Civil Resolution Tribunal, British Columbia’s small claims tribunal. Experts told the Vancouver Sun it’s the first case of a Canadian company arguing it wasn’t liable for information provided by its chatbot. Air Canada’s position, if you could call it that, was that Moffatt should never have trusted the chatbot, and that the airline should not be liable for the chatbot’s misinformation.
The chatbot is a separate legal entity that is responsible for its own actions.
Air Canada
The tribunal didn’t agree. It sided with Moffatt and called Air Canada’s defense “remarkable.” The tribunal stated that Air Canada “argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot,” but “does not explain why it believes that is the case,” or “why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot.”
This fiasco is completely ridiculous. Companies are cheaping out on live, human support, yet don’t want to be accountable for the actions of their automated servants. In the end, Air Canada refunded part of Moffatt’s airfare and paid his court costs.
Relentless Pursuit of Automation
Air Canada’s CIO, Mel Crocker, has said Air Canada will automate every function that doesn’t require a human touch.
If Air Canada will use technology to solve something that can be automated, we will do that
Mel Crocker
In this case, automation went too far, landing the company in court and costing it money.
Air Canada has since disabled its chatbot. It just goes to show: humans are still superior to AI.