
"It Was My Chatbot, Not Me"


Is a company liable when its chatbot gets it wrong? This issue recently made its way to British Columbia’s Civil Resolution Tribunal (BCCRT) in Moffatt v Air Canada, 2024 BCCRT 149. The BCCRT has jurisdiction over “small claims” of up to $5,000.


Jake Moffatt (they/them) booked an Air Canada flight from British Columbia to Ontario following their grandmother’s death. An interactive Air Canada support chatbot advised them that they could seek a retroactive “bereavement” fare. When Moffatt later attempted to secure a refund of the difference between the two fares, a (human) agent advised them that Air Canada did not offer retroactive bereavement refunds. The price difference between the regular fare and the bereavement fare was $880.36, although, including fees, the difference was $650.88.

In its defence, Air Canada claimed that it could not be held liable for misinformation provided by the chatbot, that the accurate information was available on its website, and that Moffatt did not follow the proper procedure to request a bereavement fare. It also cited contractual terms, but did not provide the relevant portions of the contract.


The issue at the heart of this dispute was whether Air Canada had negligently misrepresented the bereavement fare process through its chatbot. It was agreed that, when Moffatt booked the flight, they interacted with a chatbot that gave them advice about bereavement fares, advice which included “misleading words”. The chatbot’s advice included the following:

Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)

The term “bereavement fares” was hyperlinked to an Air Canada webpage titled “Bereavement travel”, which contained information about the bereavement travel policy. The webpage explained that the policy does not include retroactive bereavement fares. The chatbot’s advice was therefore inconsistent with the information on the linked webpage. Moffatt did not review the hyperlinked webpage.


The BCCRT found that Moffatt had relied on the chatbot and had, in effect, alleged negligent misrepresentation on the part of Air Canada:

Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading (para 24).

The BCCRT held that the test for negligent misrepresentation was satisfied:

  1. Air Canada owed Moffatt a duty of care;
  2. Air Canada’s representation, through the chatbot, was untrue, inaccurate or misleading;
  3. Air Canada made the representation negligently;
  4. Moffatt relied on the representation; and
  5. Moffatt’s reliance resulted in a loss.

Incredibly, Air Canada attempted to argue that it was not liable for misinformation provided by an agent, servant or representative, including a chatbot. The absurdity of Air Canada’s position was not lost on the Tribunal which commented:

In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission (para 27).

The Tribunal had no trouble finding that Air Canada had failed to take reasonable care to ensure that its chatbot was providing accurate information. It held that it was unreasonable to expect website visitors to double-check information found in one part of the website (the chatbot) against another part of the website to ensure its accuracy.


Air Canada was ordered to pay Moffatt a total of $812.02. While this might be only a small raindrop in a vast sky for Canada’s largest airline, the decision is embarrassing and highlights the commercial risks of relying on artificial intelligence. It is also significant for what it is missing: any effort on Air Canada’s part to explain how the chatbot worked, and why it provided bad information. In the absence of an explanation, the BCCRT easily landed on a finding of negligence.

As sophisticated as software applications have become, they do get things wrong – and they will not be held liable for their “own” actions.

Carolyn McKeen
