Canadian judge: “AI is still no substitute for professional expertise”

A recent decision from British Columbia has added to the list of rulings in which lawyers have been sanctioned for recklessly using AI tools for legal research. The clear message: rushed materials that are not properly checked can lead to significant consequences.

In Zhang v Chen, 2024 BCSC 285 (CanLII), the Court ordered a Vancouver lawyer to personally pay the opposing side’s legal costs for the time wasted as a result of her improper reliance on ChatGPT search results.

In court materials submitted in family law litigation in which she was counsel, the lawyer cited two British Columbia cases that she claimed supported her client’s position. The citations appeared under the heading “Legal Basis” in support of the arguments in the materials.

When opposing counsel read the motion materials, they attempted to locate the cited cases. Unable to do so, they asked the lawyer for copies.

Eventually, it emerged that the cited cases did not exist. They were fake citations hallucinated by ChatGPT: the AI tool had generated plausible-looking case references, but the cases themselves did not and do not exist.

In addressing the relevant issues, the judge noted that there did not appear to be any intent to deceive the Court or the other side. Indeed, the lawyer admitted what she had done and appeared apologetic. This did not, however, stop the judge from ordering her to personally compensate the opposing legal team for the effort required to address the issue.

In a related development, the Law Society of British Columbia published a notice after the decision was issued, reminding lawyers that “the ethical obligation to ensure the accuracy of materials submitted to court remains with you.” In other words, AI tools do not allow lawyers to contract out of their professional obligations to the court, their clients and others.

This ruling, which is consistent with decisions from other jurisdictions, underscores that generative AI tools must be used with prudence and caution. Personally verifying one’s work remains a core professional obligation. Thus, while AI tools may offer many efficiencies, there can be serious consequences for those who cut corners and fail to verify the accuracy of their materials.
