
And my lawyer is...ChatGPT? How AI and Law are becoming one

By Kiran Kaur


Three years ago, if I said the word ChatGPT, I'm sure you would have thought I was discussing some new fictional book or movie. Nowadays, however, it seems the AI platform is all anyone is discussing. ChatGPT has infiltrated our everyday lives – from students using it for assignments and homework, to people using it to draft witty text message replies or organise their to-do lists. I am sure that if you asked people how ChatGPT has reduced their workload, the responses would be overwhelmingly positive. However, with every positive there is a negative, and despite the significant aid ChatGPT has provided society, we must sit down and reflect on the ways it has negatively altered institutions such as the legal system. This article will discuss how AI and the legal industry have, unfortunately, become intermingled with one another, and the negative repercussions this holds for the profession and the justice system as a whole.


AI is understood as computer systems capable of performing complex tasks that traditionally only humans could complete. This means AI systems actively engage in tasks that require reasoning, decision-making, and problem-solving. It does not mean that search engines or databases count as AI in this context, so don't worry – using LexisNexis for your law assignment is not using AI unethically.


Now, onto the impact AI has on the legal profession. Historically, the profession has been seen as somewhat stagnant: the same mundane tasks, applied to varying precedents and case law. However, the introduction of AI is argued to be "reshaping the legal profession" [1]. It is being used to review contracts, find relevant documents in the discovery process, and even conduct legal research, reducing the workload and the time lawyers spend on these tasks. AI can run thousands of analyses and detect patterns at a speed humans cannot match. Within the legal profession, it can offer predictions and insights, and work through multiple legal databases simultaneously to provide analytics on cases and scenarios. Further, in the push to increase productivity and remove historically mundane tasks, AI is seen to aid practice management – billing, social media management, legal research, and e-discovery – where its use has been shown to improve both document quality and accuracy.


However, the use of AI does not always come with benefits, and as this article will discuss, there are various kinks in its implementation, especially in the early stages of development. AI can theoretically reduce costly mistakes and increase attorney productivity; its use is nonetheless a slippery slope, and one that needs to be heavily monitored before implementation. Becoming reliant on AI is certain to have negative ramifications, with various studies noting that AI has not yet achieved results with the same level of emotional intelligence and judgement as humans [2]. You might think this preserves the need for lawyers. However, there remains a risk that some lawyers will become reliant on AI, with that lack of discretion making its way into the courtroom – bearing tragic results.


One prevalent example of this is the creation of fake case extracts and citations. Mata v Avianca (2023) is a US case in which lawyers submitted a brief containing fake case extracts and citations, researched using ChatGPT [3]. As a consequence, the court dismissed the case, the lawyers were sanctioned for acting in bad faith, both the lawyers and their firm were fined, and their actions were exposed to public scrutiny. If you think this was an isolated incident and that such issues are rare in the real world, think again. There have been other instances where generative AI has produced fake case information that was then submitted to a court. Michael Cohen gave his lawyer cases generated by Google Bard, believing that they were real and that the lawyer would fact-check them; neither was true [4]. These cases were included in a brief filed with the US Federal Court. In Canada, two "AI hallucinations" were filed in an application to the court, leading to the lawyer having to personally compensate opposing counsel for the time it took them to "learn" the invalid cases [5]. Finally, in the United Kingdom, nine authorities put before the First-Tier Tribunal were found to be fakes, weakening the defence's argument and wasting court resources. Judge Redston stated that "that does not mean that citing invented judgments is harmless. It causes the tribunal and HMRC (HM Revenue & Customs) to waste time and public money, and this reduces the resources available to progress the cases of other court users", emphasising the dangerous implications of AI [6]. While these cases of lawyer misjudgement when using AI may be limited for now, if the trend continues, consistent failures may mislead and overwhelm the courts, harm client interests, and undermine the rule of law.
For the ANU Law students who have studied the compulsory course Lawyers, Justice, and Ethics, you would know all about the in-depth rules and extensive guidelines that underpin the work of solicitors and the legal profession as a whole. One such example is rule 4.1.3, which provides that the role of a solicitor is to deliver legal services competently, diligently, and promptly, in line with professional standards [7]. It is evident that AI, when misused in this way, fundamentally undermines these obligations.


So, if I now asked whether you would consider using ChatGPT within the legal field, what would your answer be? The lesson seems to be to use it with discretion, and to ensure that human judgement underpins the final presented product. All in all, the development of AI, and how we can use this software in law, is definitely something to keep monitoring in the years to come.



Endnotes

  1. Stepka, M. (2022, February 21). Law Bots: How AI Is Reshaping the Legal Profession. Retrieved September 19, 2024, from Business Law Today from ABA website: https://businesslawtoday.org/2022/02/how-ai-is-reshaping-legal-profession/ 

  2. See: 'Becoming the AI-Enhanced Lawyer' (2019) 38(2) University of Tasmania Law Review 34–59, [2020] UNSWLRS 63; LexisNexis. (2021). Lawyer vs AI: A legal revolution. Retrieved September 19, 2024, from Lexisnexis.com.au website: https://www.lexisnexis.com.au/en/insights-and-analysis/practice-intelligence/2018/Lawyer-vs-AI-A-legal-revolution

  3. Mata v. Avianca, Inc., 22-cv-1461 (PKC) (S.D.N.Y. Jun. 22, 2023)

  4. Verma, P. (2023, December 29). Michael Cohen used fake cases created by AI in bid to end his probation. Retrieved September 19, 2024, from Washington Post website: https://www.washingtonpost.com/technology/2023/12/29/michael-cohen-ai-google-bard-fake-citations/ 

  5. Proctor, J. (2024, February 27). B.C. lawyer reprimanded for citing fake cases invented by ChatGPT. Retrieved September 19, 2024, from CBC website: https://www.cbc.ca/news/canada/british-columbia/lawyer-chatgpt-fake-precedent-1.7126393

  6. Rose, N. (2023, December 7). Litigant unwittingly put fake cases generated by AI before tribunal. Retrieved September 19, 2024, from Legal Futures website: https://www.legalfutures.co.uk/latest-news/litigant-unwittingly-put-fake-cases-generated-by-ai-before-tribunal

  7. Legal Profession Act 2006. https://www8.austlii.edu.au/cgi-bin/viewdb/au/legis/act/consol_act/lpa2006179/ 




Peppercorn is the official publication of the ANU Law Students’ Society (LSS). The views and opinions expressed in Peppercorn are those of their respective authors. They do not necessarily reflect the views and opinions of Peppercorn, the ANU LSS, or any sponsors. Due care has been taken to ensure the accuracy of each article, and that the views and opinions of any named individuals or organisations are taken into account where relevant.

Please contact lsspeppercorn@anu.edu.au if there are any issues. 

© 2024 by ANU Law Students' Society's Peppercorn Magazine
