12 Days of ChristmAIs: A TMT insight series
As in almost all other areas of law and life, there is certainly scope for AI to provide useful assistance in the conduct of litigation, including by overcoming the dreaded blank page and making prose more eloquent, clear, and compelling.
While many of the headline-grabbing stories about the use of AI in litigation focus on litigants citing ‘hallucinated’ cases and legislation which simply do not exist, a less discussed but equally critical area is how AI can – and cannot – be used in the preparation of evidence.
Evidence in civil litigation
There are two common types of written evidence in civil litigation – lay witness evidence (provided through affidavits and witness statements), and expert evidence (provided through expert reports).
An affidavit or witness statement is a written statement in which the witness relays their evidence as to what has factually occurred, and exhibits any documents which support their statement.
An expert report is a report prepared by an expert, by which they provide their expert opinion. Some common examples include medical reports, engineering reports, forensic accounting reports, and valuations.
As the use of AI continues to evolve, the State and Federal Courts of Australia have started issuing guidance and practice directions as to how AI can – and cannot – be used in preparing witness and expert evidence. This article considers the guidance and directions provided by the Queensland and NSW Supreme Courts.
Queensland’s guidance on the use of generative AI
The Queensland Courts have issued a guidance note on the Use of Generative AI by Non-Lawyers (Qld Note),1 and the Supreme Court of Queensland has issued a practice direction in relation to the Use of Expert Evidence in Criminal Proceedings (Qld Practice Direction).2 There is not yet a similar practice direction for civil proceedings.
The Qld Note provides guidance only, and advises non-lawyers to be cautious about using generative AI to prepare affidavits or witness statements, and that it is important to ensure that the document ‘is sworn/affirmed or finalised in a manner that accurately reflects the person’s own knowledge and words’.
Conversely, the Qld Practice Direction imposes prescriptive requirements on the use of AI in preparing expert reports in criminal proceedings. Specifically, where the expert has used generative AI to assist in the formulation or expression of the opinions contained in their report, the report must:
- specify the name of the generative AI program used, and how it was used;
- disclose (as an annexure to the report) a complete record of the inputs / prompts used for the generative AI program to formulate or express the relevant opinions, including any source material, default values and/or variable sets;
- disclose (as an annexure to the report) a complete record of the outputs delivered by the generative AI program in the formulation or expression of the relevant opinions;
- specify if the way in which the generative AI program was used is regulated or addressed by any relevant code of practice that binds the expert and, if so, how that code of practice was adhered to by the expert in the formulation or expression of the relevant opinions; and
- identify any possible biases or other known limitations that might affect the accuracy or reliability of the opinions formulated or expressed through use of the generative AI program.
While this practice direction applies only to criminal proceedings, it provides a helpful indication of the types of requirements the courts may impose in civil proceedings in the near future.
New South Wales guidance on the use of AI
Similarly, the New South Wales Supreme Court has issued a practice note on the Use of Generative Artificial Intelligence (NSW Practice Note),3 which applies to all proceedings (and not just criminal proceedings).
Unlike the Qld Practice Direction, the NSW Practice Note provides that generative AI must not be used to generate the content of:
- an affidavit or a witness statement, including by altering, embellishing, strengthening, diluting or rephrasing a witness’s evidence;
- an annexure or exhibit to an affidavit or a witness statement (without leave of the court); and
- an expert’s report (without leave of the court).
Affidavits and witness statements must include a statement that AI was not used to generate their content, or the content of any annexures or exhibits (except to the extent that leave was given by the court).
Where the Court allows the use of generative AI in an expert report, the expert witness must:
- disclose the parts of the report prepared using generative AI, including which generative AI program was used to generate the content of the report (including which version);
- keep records of, and identify in an annexure to the report, how the generative AI tool or program was used, including, for example, any prompts used, any default values used, and any variables set; and
- if the use of generative AI is regulated or addressed by any relevant code of practice or principles that bind or apply to the expert, identify that fact and annex to the report a copy of the relevant codes or principles.
These requirements are broadly similar to those of the Qld Practice Direction.
Other areas to keep an eye on
As AI is still an emerging technology, the reliability and admissibility of evidence prepared with AI remain largely untested in the Australian courts.
As the ability to generate videos, images, and voice recordings through AI becomes increasingly proficient and generally available to the public, courts may be faced with considering whether it is necessary to verify that the evidence has not been generated through AI and, if so, how that verification can occur.
This problem is just starting to emerge in Australian courtrooms, with the questions of reliability and admissibility considered by the Federal Circuit and Family Court of Australia in February 2025 in the matter of Barry v Letton [2025] FedCFamC2F 222. In that case, the applicant sought to tender audio recordings of phone conversations and to rely on transcripts of those recordings, which had been prepared using AI software.
Ultimately, the Court did not admit the transcripts for a number of reasons, relevantly including that they had not been verified as accurate. In making this decision, the Court commented that it is commonly known that AI tools have a tendency to hallucinate, or to be inaccurate or incomplete in their outputs.4 There was no discussion of how the transcripts should have been verified.
This article forms part of the series, the 12 Days of ChristmAIs: A Technology, Media and Telecommunications series on artificial intelligence and its intersection with the law. You can view all the articles here.
1 The use of Generative Artificial Intelligence (AI): Guidelines for responsible use by non-lawyers.
2 Practice Direction 2024/14 – Expert Evidence in Criminal Proceedings – AMENDED.
3 Supreme Court Practice Note SC Gen 23, titled ‘Use of Generative Artificial Intelligence (Gen AI)’.
4 Barry v Letton [2025] FedCFamC2F 222 at [13] and [14].