Air Canada was forced to pay compensation after the airline’s AI chatbot feature misinformed a customer in a conversation about bereavement fares.
The chatbot recommended that the passenger, Jake Moffat, book a ticket and then submit a ticket refund application to receive bereavement reimbursement, reports simpleflying.com.
According to theguardian.com, the chatbot assured Moffat that he would be refunded an amount for bereavement fare pricing.
When the passenger contacted the airline, the carrier said it had a different policy and offered a US$200 (R3 790) flight voucher for future travel. Moffat refused the voucher and filed a small claims complaint with the British Columbia Civil Resolution Tribunal.
The guidance provided by the chatbot was found to be inaccurate and inconsistent with Air Canada’s actual policy, which is published on the same website. Under that policy, passengers can request a bereavement fare refund over the phone, but only for upcoming flights that have not yet taken place.
The tribunal sided with Moffat and ruled that Air Canada must provide the refund as the chatbot described. Additionally, the carrier had to pay damages and tribunal fees.
Before the tribunal, Air Canada denied liability for the information provided, claiming that the chatbot was a separate legal entity responsible for its own actions, explained tribunal member Christopher Rivers. The airline also argued that the customer should never have trusted the chatbot and should have double-checked the information on its website.
However, many industry experts feel the carrier should have immediately owned the error and provided the partial refund the passenger expected, especially because the AI chatbot, presented on the carrier’s website as part of its customer services, was the source of the wrong information.
The case sets a precedent for future disputes over errors made by companies’ AI features.