Air Canada's chatbot gave a B.C. man the wrong information. Now, the airline has to pay for the mistake
Air Canada has been ordered to compensate a B.C. man because its chatbot gave him inaccurate information. The Civil Resolution Tribunal's decision on the dispute, posted online Wednesday, found in favour of a man who relied on information the bot provided about fares when he was booking a flight to attend his grandmother's funeral.

"I find Air Canada did not take reasonable care to ensure its chatbot was accurate," tribunal member Christopher C. Rivers wrote, awarding $650.88 in damages for negligent misrepresentation. "Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading," the decision explains.

Jake Moffatt was booking a flight to Toronto and asked the bot about the airline's bereavement rates – reduced fares offered when someone needs to travel because of the death of an immediate family member. Moffatt said the bot told him these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and he submitted a screenshot of the conversation as evidence supporting this claim.

He submitted his request, accompanied by his grandmother's death certificate, in November 2022 – less than a week after he purchased his ticket. But his application was denied, and emails submitted as evidence showed his attempts to receive a partial refund continued for another two and a half months. The airline refused the refund because, it said, its policy was that bereavement fares could not, in fact, be claimed retroactively.

In February 2023, Moffatt sent the airline a screenshot of his conversation with the chatbot and received a response in which Air Canada "admitted the chatbot had provided 'misleading words.'" Still unable to get a partial refund, he filed the claim with the tribunal.
https://portal.staralliance.com/cms/news/hot-topics/2024-02-15/star/air-canadas-chatbot-gave-a-b-c-man-the-wrong-information-now-the-airline-has-to-pay-for-the-mistake