These days, many companies use chatbots for customer service. You click the chat button and type out your query, or choose from a set of options, and the chatbot tries to offer the right response. This obviously saves the cost of a human chat representative. In addition, the AI learns from previous interactions to make future ones more helpful for the user.
But what happens if the chatbot gives incorrect information?
If the chatbot is a free service like Bing, we just laugh and perhaps post memes. But in this case, a passenger was misled about the refund he would be eligible for, leading to a financial loss of about CAD 812.
What happened?
In 2022, Jake Moffatt asked the chatbot on Air Canada's website what documents were needed to get a bereavement fare (a reduced fare for passengers travelling because of a death in the family).
The chatbot informed Mr. Moffatt that he could book a flight at full fare and then get a refund of the fare difference if he applied for the bereavement fare within 90 days of travel. This was incorrect: under Air Canada's actual policy, bereavement fares cannot be applied retroactively to completed travel.
So, when Mr. Moffatt applied for the refund, none was forthcoming. Fortunately, he had saved screenshots of the chatbot's advice.
The matter reached British Columbia's Civil Resolution Tribunal (a kind of lower court for small claims).
Air Canada argued that the chatbot was a separate legal entity and that the company was not responsible for its actions. The tribunal rejected this argument, holding that the company was responsible for all information on its website, including what its chatbot said.
The tribunal ordered Air Canada to pay Moffatt CAD (Canadian Dollars) 650.88 for the fare difference, CAD 36.14 in pre-judgment interest, and CAD 125 in fees, for a total payout of CAD 812.02.