Draftable Product Specialist and former Banking & Finance Solicitor at Herbert Smith Freehills, Yulia Gosper, shares her insights on how law firms can use generative AI tools like ChatGPT as a low-risk environment to explore and adapt to this new technology.
Generative AI technology has proven to be more than a passing trend since the launch of ChatGPT in November 2022, impacting all major industries including legal, healthcare, finance, education and more. However, there remains a healthy dose of scepticism about its use and limitations in the legal industry.
One of the driving factors behind this scepticism is that generative AI is probabilistic and highly prone to making mistakes known as hallucinations. It cannot discern whether the content it generates is correct, which is unacceptable in the legal field, where legal advice requires a high degree of precision and even a small mistake can result in significant financial and reputational losses.
These concerns have been fuelled by the recent controversial Stanford study on legal AI that found legal research tools hallucinate between 17% and 33% of the time.
The recent Allens AI Australian Law Benchmark study also found that market-leading large language models (LLMs), including ChatGPT, consistently generated incorrect and hallucinated output when asked to answer legal questions. The study concluded that the models tested should not be used for legal advice without expert human supervision. However, it’s also important to note that the tested models were trained on generalised data rather than specific Australian legislation and case law, so they should not be expected to provide sophisticated advice on Australian law.
While generative AI tools may not yet possess the reliability required by the legal industry, they’re not to be dismissed entirely, as this technology is rapidly evolving and becoming omnipresent. Just two years ago, these tools, and the skillset required to use them, didn’t even exist. Now, we are moving towards a future where generative AI capabilities will be integrated into virtually every software tool. We can already see it happening with Microsoft’s sweeping integration of Copilot.
Law firms around the world are making AI moves. Recent surveys show that 53% of the top 200 US law firms have purchased generative AI tools and 45% use them for legal work, while 50% of legal professionals surveyed across Australia and New Zealand have used generative AI tools for daily operational tasks. The Draftable team is also finding that practical AI adoption is the most popular topic at legal tech events worldwide.
It will become crucial for law firms to understand and leverage this technology to maintain a competitive edge. Using AI tools could even soon be perceived as an ethical obligation for lawyers.
The New York State Bar Association (NYSBA) Task Force on Artificial Intelligence issued a Report and Recommendations in April 2024 on the legal, social, and ethical impact of AI and generative AI on the legal profession. The report identified several Rules of Professional Conduct (RPC) that can be implicated when using this technology in legal practice, including Comment 8 to RPC Rule 1.1 which asserts that “keeping abreast of the benefits and risks associated with technology the lawyer uses to provide services to clients is an element of competency.”
Nicole Yamane, Associate Attorney at Cades Schutte LLP, made a similar case in the Georgetown Journal of Legal Ethics, arguing that “a refusal to use technology that makes legal work more accurate and efficient may be considered a refusal to provide competent legal representation to clients.”
All this demonstrates that while it’s inappropriate to blindly rely on generative AI tools, it is time to start learning how to use this technology effectively.
Enter ChatGPT.
Using ChatGPT and similar tools like Google’s Gemini, Anthropic’s Claude and Microsoft’s Copilot offers a practical, low-risk environment for law firms to familiarise themselves with generative AI’s capabilities and limitations. It serves as a sandbox for experimentation, where lawyers and other legal professionals can easily interact with the technology across a range of applications such as researching and drafting documents or routine communications.
This can significantly enhance legal workflows in three ways:
Generative AI tools like ChatGPT are best used to streamline and automate routine processes such as research, first-draft document preparation and routine communications.
While ChatGPT and similar tools like Gemini and Claude are considered a safer platform for learning the capabilities of generative AI, they are not entirely without risk. Primary concerns include data privacy, inaccurate outputs, and the potential for misuse. Law firms will need to acknowledge and mitigate these risks as they build literacy with these platforms. Here are some of the most common risks and general guidelines to help mitigate them:
As one of the NYSBA Task Force’s recommended guidelines, firms are urged to take precautions to protect sensitive client data when using generative AI tools like ChatGPT, as these tools involve sharing data with an external system. Users should be careful to only input generic or anonymised information and avoid any client-specific details, confidential information, or sensitive data.
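For firms experimenting hands-on, the anonymisation step above can be partially scripted. The sketch below is a minimal, illustrative Python helper (the function name, patterns and placeholders are our own inventions, not a firm-approved tool): it masks email addresses, phone-like numbers and a user-supplied list of client names before text is pasted into an external AI tool. Real anonymisation requires far more than regular expressions and should always be checked by a human.

```python
import re

def anonymise(text, client_names):
    """Mask obvious identifiers before sending text to an external AI tool.
    Illustrative sketch only -- not a substitute for firm-approved redaction."""
    # Mask email addresses
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    # Mask phone-like runs of digits with optional spaces/brackets/dashes
    text = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[PHONE]", text)
    # Mask known client names supplied by the user (case-insensitive)
    for name in client_names:
        text = re.sub(re.escape(name), "[CLIENT]", text, flags=re.IGNORECASE)
    return text

sample = "Acme Pty Ltd (contact: jane@acme.com, +61 2 9999 0000) disputes the invoice."
print(anonymise(sample, ["Acme Pty Ltd"]))
# → [CLIENT] (contact: [EMAIL], [PHONE]) disputes the invoice.
```

Even with such a helper, users should manually review what they paste in: context can identify a client even after names and numbers are stripped.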
In a Capital Brief interview, leading Australian law firm Clayton Utz says it “allows its staff to experiment with AI chatbots, with the stipulation that no sensitive information is given”.
If you are looking to purchase an AI-powered tool, you’ll need to assess the vendor’s data protection practices including how they store and process personal data and their overall security protocols and credentials. Read more about the 10 security questions you should be asking your legal tech vendors.
Some tools like ChatGPT also provide enterprise versions of their models through API access, which can be configured for more secure and private deployments according to the firm’s requirements.
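To make the API option concrete, here is a rough sketch of the kind of request a firm-configured deployment might send. Everything here is hypothetical: the model name, the system instruction and the parameter values are placeholders, and the message structure simply follows the common chat-API convention; consult your vendor's documentation for the real configuration.

```python
# Sketch of a request body for a hypothetical private deployment.
# No network call is made here -- this only shows the shape of the data.
request = {
    "model": "firm-private-gpt",  # placeholder for a firm-configured deployment name
    "messages": [
        {"role": "system",
         "content": "You are a drafting assistant for a law firm. Do not retain inputs."},
        {"role": "user",
         "content": "Draft a polite follow-up email about an unsigned engagement letter."},
    ],
    "temperature": 0.2,  # lower temperature = more predictable drafting output
}
```

The practical point is that an enterprise deployment lets the firm control where requests go, what system instructions are applied, and how (or whether) inputs are retained.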
At the recent Legal Innovation & Tech Fest in Sydney from 13-14 May, Allens CIO, Bill Tanner, shared how the firm deployed a custom LLM via Microsoft Azure, demonstrating how these types of custom solutions can be successfully implemented.
Read more key takeaways from the 2024 Legal Innovation & Tech Fest in Balancing AI and human connection: Practical insights from the Australian legal industry
The NYSBA Task Force emphasises that AI should not replace a lawyer's judgment but rather augment it. Lawyers must not become dependent on generative AI tools like ChatGPT but should understand when to use or override their suggestions, as these tools are prone to hallucinations and producing outdated information.
Unlike traditional software which operates under more deterministic and rule-based frameworks, generative AI functions in a probabilistic manner, meaning that it generates outputs based on the likelihood of various possible outcomes, using statistical patterns learned from data to predict and create new content. Consequently, it is prone to generating incorrect answers known as hallucinations, particularly if the data it is trained on is ambiguous or of poor quality. Users should always thoroughly review and fact-check any AI-generated content.
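A toy example makes "probabilistic" tangible. In the sketch below, the word probabilities are invented purely for illustration (a real model learns them from training data over a vast vocabulary): the model does not look up the correct continuation, it samples from a distribution, so a low-probability nonsense answer is always possible.

```python
import random

random.seed(0)  # fix the seed so the illustration is repeatable

# Invented next-word distribution for "The court's motion was ..."
next_word_probs = {
    "granted": 0.55,   # plausible continuation
    "denied": 0.35,    # also plausible
    "purple": 0.10,    # low-probability nonsense -- the "hallucination" path
}

words = list(next_word_probs)
weights = list(next_word_probs.values())

# Sample the continuation ten times: mostly sensible, occasionally not.
samples = [random.choices(words, weights=weights)[0] for _ in range(10)]
print(samples)
```

However many times the sensible answer comes out, nothing in the mechanism guarantees it: that is why AI-generated content must always be reviewed and fact-checked.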
Additionally, as generative AI tools like ChatGPT are trained on specific data, they have a cut-off date for accessing information. At the time of writing, GPT-3.5’s knowledge cut-off is January 2022, while GPT-4’s cut-off is December 2023 and GPT-4o’s cut-off is October 2023. If you have a paid subscription, ChatGPT can browse the web for real-time information and provide direct links to sources, although output is still prone to errors and must be fact-checked.
To mitigate the risks of hallucinations, outdated information and over-reliance on AI, firms must establish guidelines for AI use and offer education and training on the responsible usage of AI tools.
Here are a few tips on how to get the most out of generative AI tools like ChatGPT in your legal practice.
Instead of asking a single, complex question, break it down into smaller queries. This allows you to guide the conversation more effectively and refine the AI's output step-by-step.
Example:
Start with: "Draft a summary of this legal case."
Follow-up: "Can you highlight the key arguments made by the defence?"
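In API terms, this step-by-step refinement corresponds to appending each turn to a running conversation, so the model keeps the earlier context when answering the follow-up. The structure below follows the common chat-API message convention; the assistant reply is a placeholder, not real model output.

```python
# Each refinement step appends to the running conversation history,
# so the follow-up question is answered with the earlier context in view.
conversation = [
    {"role": "user", "content": "Draft a summary of this legal case."},
]
conversation.append(
    {"role": "assistant", "content": "<model's summary would appear here>"}
)
conversation.append(
    {"role": "user",
     "content": "Can you highlight the key arguments made by the defence?"}
)

for turn in conversation:
    print(f"{turn['role']}: {turn['content'][:60]}")
```

In the ChatGPT web interface this happens automatically within a single chat thread, which is why asking follow-ups works better than restarting with one giant question.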
Ask ChatGPT to adopt a specific role or persona to tailor its responses more closely to your needs. This can be particularly useful for training or generating content from a particular perspective.
Example: "Pretend you are a tax lawyer advising a client on the tax implications of selling their business in Australia. Provide an overview of the relevant tax considerations."
Request specific formatting for the outputs to make the information easier to read and use directly in documents.
Examples:
"Summarise the key points of this judgment in a bulleted list."
"Present these clauses as a table with one column for the clause and one for a plain-English explanation."
Ask ChatGPT to provide different viewpoints or to argue for and against a particular stance. This can help in understanding diverse perspectives and strengthening arguments.
Example: "What are the pros and cons of mandatory arbitration clauses in employment contracts?"
ChatGPT can extract specific information or summarise large datasets, which can be helpful for quick insights and decision-making.
Examples:
"Extract all key dates and deadlines mentioned in this agreement."
"Summarise the main findings of this report in five bullet points."
ChatGPT and similar tools remain one of the easiest and most accessible ways to learn how to use generative AI. Law firms that embrace these tools as a low-risk playground will not only see efficiency gains but also help prepare their teams to effectively leverage the generative AI capabilities that will be embedded in the vast majority of software platforms in the coming years.