Artificial Intelligence in Victoria’s Courts and Tribunals: Consultation Paper

4. AI in courts and tribunals

Overview

• This chapter outlines the current and potential use of artificial intelligence (AI) in courts and tribunals.

• The experiences of other jurisdictions are provided to illustrate the opportunities and risks posed by different applications of AI.

• This chapter considers the use of AI in:

– general court processes and advice

– in-court hearings

– alternatives to in-court hearings

– judicial decision-making.

• AI will create different opportunities and risks for various court and tribunal stakeholders, based on the kind of AI tool and the ways it is used or misused.

General court processes and advice

Table 1: Current and potential uses of AI in general court processes and advice

Columns (court users; courts and tribunals): Public; SRLs; Legal professionals; Prosecutorial bodies; Court administrators; Judges/Tribunal members; Juries

Rows (uses): e-Filing and allocation of court matters; Case analysis and legal research; Engaging court users; Drafting documents; Predictive analytics; Virtual legal advice

Key

User: Uses an AI system to produce a product or service

Host: Delivers and maintains the AI system

Receiver: Relies on material or services produced by AI

E-filing and allocation of court matters

4.1 ‘Workflow automation’ means using technology to streamline and automate repetitive tasks.[1] It involves software systems that can perform actions automatically, reducing the need for manual intervention.

4.2 Victorian courts and tribunals use automation for some workflow processes including case management.[2] Victorian courts have used e-filing systems for electronic submission of documents for over two decades, but this process does not use AI.[3] Some online systems integrate different court responsibilities into a single portal. The New South Wales Online Court and Online Registry integrates e-filing, case allocation and case management.[4]

4.3 AI might provide more advanced case management and advice for Victorian court and tribunal users than current systems do.[5] E-filing and case management tools have developed from expert systems to include machine learning. This can provide more advanced functions including classifying, extracting and redacting documents.[6]

4.4 The Brazilian VICTOR project supports case allocation by using natural language processing to filter appeals.[7] This tool is expected to reduce the time required for court staff to classify applications for appeal. It automatically filters appeals depending on whether the appeal meets the constitutional threshold to be heard by the Supreme Court and the relevant areas of law.[8]

Opportunities

• Reduction in the workload of court staff by automating repetitive administrative tasks.

• Improved access to justice. AI-facilitated case management can assist court administrators to process more cases in a timely and efficient manner.

• Reduction in court workload if cases are automatically directed to an appropriate court or judge.

Risks

• The handling of large amounts of sensitive, personal court data by AI has significant data security risks.

• Overreliance on AI may lead to a reduction in overall human oversight of case management and a decline in staff case management skills.

Case analysis and legal research

4.5 AI-powered legal research and analysis tools are changing the way case analysis and legal research is done. AI can improve people’s ability to quickly access and analyse large amounts of legal information. AI providers claim their tools can sift through legal databases and identify and summarise legal arguments faster than traditional research tools.[9]

4.6 These types of AI tools can be accessed via licences, and could be used by judicial officers, court staff (including judges’ associates and court researchers) and legal professionals.

4.7 AI research and analysis tools can incorporate natural language processing and generative AI functions. This allows people to search in conversational English and create summaries of information.[10] In contrast, traditional legal research tools use keyword or Boolean searches (search terms such as ‘and’ or ‘or’). It has been argued that traditional search tools are not good at making sense of legal data.[11]
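The contrast can be illustrated with a minimal sketch of a Boolean AND search. The case names and summaries below are invented for illustration; real research platforms use indexed databases rather than this naive scan:

```python
# Minimal sketch of a traditional Boolean keyword search over case
# summaries. The cases and summaries are hypothetical illustrations.

summaries = {
    "Smith v Jones": "negligence and duty of care in a workplace accident",
    "R v Doe": "admissibility of body-worn camera footage as evidence",
    "Lee v Wong": "breach of contract and damages for late delivery",
}

def boolean_and(query_terms, text):
    """True only if every query term appears in the text -- an AND search."""
    words = text.lower().split()
    return all(term.lower() in words for term in query_terms)

# A 'negligence AND care' search matches only the first case.
matches = [name for name, text in summaries.items()
           if boolean_and(["negligence", "care"], text)]
print(matches)
```

Unlike a conversational query, this search only matches exact terms: a summary that discussed 'carelessness' without the word 'care' would be missed, which is the weakness natural language tools aim to address.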

4.8 Providers of specialised legal AI services claim that AI increases accuracy and reliability in case analysis and legal research. But law-specific AI products may have hallucination rates between 17 and 33 per cent.[12] In the United States, legal professionals who have relied on large language models such as ChatGPT for legal research have found that it sometimes hallucinates fake cases and precedents.[13]

Opportunities

• AI can identify, analyse and summarise complex information much faster than a human.

Risks

• Errors or hallucinations.

• Reduced access to justice if specialist tools are cost prohibitive for small firms, not-for-profit organisations and the public.

• Overreliance on AI may undermine the training of junior court staff and junior lawyers.

Engaging and triaging potential court users

4.9 AI offers new opportunities to engage court and tribunal users. AI-powered virtual court assistants could provide people with immediate, accessible information and guidance about how the court works.

4.10 Natural language processing tools, including chatbots, can support court functions. Chatbots can mimic human conversation through text or voice interactions.[14] Some chatbots use large language models and generative AI to generate content based on inputs.[15] Chatbots trained on court procedures and guidance could provide people with general information about court processes and timelines, advice on preparing procedural documents, or information about where to go next.[16]

4.11 AI chatbots can assist with language translation, providing support for marginalised communities with low literacy skills or with hearing or sight impairments (see discussion on transcription and translation tools from 4.43).

4.12 In 2018, Michigan implemented the Court Operated Robot Assistant (CORA) as a physical courthouse ‘concierge’ to assist court users.[17] CORA provides directions, searchable court dockets and responses to frequently asked questions through voice-to-voice interaction in Spanish and English.[18]

4.13 Chatbots can assist with automated payment of fines.[19] Since 2016, an AI-powered online assistant ‘Gina’ has been used in California to advise motorists how to pay traffic tickets, schedule court dates and sign up for traffic lessons.[20] Gina can also translate information into six languages.[21]

4.14 AI systems present opportunities to more efficiently triage people’s legal issues and connect them with legal advice and representation.[22]

4.15 Justice Connect, in partnership with the University of Melbourne, developed a natural language processing AI model to triage and identify people’s legal problems.[23] It allows people to use plain English to describe their problem and suggests a legal category for that issue.[24] Justice Connect uses this tool to support online client intake by matching people with an appropriate legal service.
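A minimal sketch of this kind of triage is below, using simple keyword scoring rather than the trained natural language processing model a service of this kind actually uses. The categories and keywords are hypothetical:

```python
# Sketch of triaging a plain-English problem description into a legal
# category by keyword scoring. Categories and keywords are hypothetical;
# real triage tools use trained natural language processing models.

CATEGORY_KEYWORDS = {
    "tenancy": {"landlord", "rent", "eviction", "lease", "bond"},
    "employment": {"employer", "dismissed", "wages", "workplace", "unfair"},
    "fines": {"fine", "infringement", "ticket", "penalty"},
}

def triage(description: str) -> str:
    """Return the category whose keywords best match the description."""
    words = set(description.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    return max(scores, key=scores.get)

print(triage("my landlord says I owe rent and is threatening eviction"))
```

A trained model generalises beyond exact keywords (for example, recognising 'kicked out of my flat' as a tenancy problem), which is why services use machine learning rather than fixed word lists.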

Opportunities

• Legal information is provided in plain English to simplify the legal process.

• Advice is accessible at a time convenient to court users, including when court staff are not around.

• Errors are identified in drafts of court documents early.

• Court users can be prompted to submit outstanding documents.

Risks

• Provision of outdated or inaccurate information to court users if the system is not continuously maintained.

• Privacy risks if users’ personal information is used to train the model.

Rules as code

4.16 ‘Rules as code’ is the translation of legislation into a machine-readable format. Natural language legislation or rules are translated into a version that can be understood and interpreted directly as code by a computer.[25]

4.17 Rules as code uses expert systems that rely on pre-defined rules.[26] It enables the creation of websites that allow people to ask questions about legislation, including through a chatbot.

4.18 Rules as code could support AI-assisted automated decision-making by government agencies.[27] This could be seen in the automated issuing of licences, where:

• a person submits a licence application on an online platform

• an AI automated decision-making tool applies the rules as code to the individual’s situation

• the AI system decides whether to issue a licence or not.
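The automated flow above can be sketched as a hypothetical eligibility rule encoded directly as executable code. The age and conviction conditions are invented for illustration and are not drawn from any actual regulation:

```python
# Sketch of 'rules as code': a hypothetical licensing rule expressed as an
# executable function. The eligibility conditions are illustrative only,
# not drawn from any actual regulation.

from dataclasses import dataclass

@dataclass
class Application:
    age: int
    has_disqualifying_conviction: bool

def decide_licence(app: Application) -> str:
    """Apply the coded rule to an application and return a decision."""
    if app.age < 18:
        return "refused: applicant under 18"
    if app.has_disqualifying_conviction:
        return "refused: disqualifying conviction"
    return "granted"

print(decide_licence(Application(age=30, has_disqualifying_conviction=False)))
```

The same coded rule can power both a public questionnaire ('does this regulation apply to me?') and an automated decision-making pipeline, which is why a single authoritative machine-readable version of a rule is attractive to government agencies.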

4.19 A rules as code project was implemented in New South Wales when a digital version of the Community Gaming Regulation 2020 (NSW) was created. The state government published versions of the regulation readable by humans and readable by machines.[28] This allowed people to use an online questionnaire to understand how the regulation applied to their situation.[29]

4.20 Austlii’s Datalex platform provides software tools for users to develop rules as code applications.[30]

Opportunities

• Develop programs that assist people to understand the law.

• Enhance the public’s access to primary legal sources, as well as expert legal commentary.

• Interpret and translate complex legal language into plain English.

Risks

• Machine-readable legal texts may be subject to proprietary rights and may not be accessible to the public.

• Negative impacts for transparency and trustworthiness if the reasoning and conclusions of AI tools are not explainable.

Drafting affidavits, statements, orders and other documents

4.21 AI can be used to generate legal documents. Large language models that use generative AI and machine learning can be used to draft submissions, statements, judgments and orders.[31]

4.22 AI drafting tools could assist parties, legal professionals, prosecutorial bodies and judicial members. Some AI tools may include suggestions for improvements to legal language, clarity and precision, and over time, compliance with legal norms. Others may generate initial drafts based on predefined templates or specific legal criteria.

4.23 Some generative AI tools can streamline drafting of legal documents by generating standardised contracts, reviewing clauses for potential errors and ensuring that they comply with legal standards.[32] Future tools may facilitate more collaborative, personalised legal documents tailored to client requirements and needs.[33]

4.24 AI tools that use natural language processing and speech recognition could assist self-represented litigants to express evidence and submissions more clearly.[34] But issues around inaccuracy remain significant, particularly for non-legal court users who may not be able to check whether AI outputs are correct.

4.25 Another issue is that AI drafting tools could result in a significant rise in vexatious litigation and in the generation of large volumes of material without a strong basis in law.

4.26 There are major risks with using generative AI to prepare court documents. Generative AI models depend on sophisticated statistical calculations and large data sets to predict what text might be used in a response. But they cannot verify or understand if an output is true.

4.27 Generative AI tools may distort the meaning or tone of a witness statement. Warnings about the use of large language models in evidential submissions emerged in Director of Public Prosecutions v Khan.[35] Justice Mossop held that little weight could be placed on a character reference from the offender’s brother because the reference appeared to be generated by an AI program such as ChatGPT.[36]

Opportunities

• Increased efficiency and productivity of judicial members and legal professionals.

• Increased speed and accuracy of drafting legal documents.

• Automatic review of draft legal documents for compliance with legal standards and regulations.

Risks

• Increased workload for courts and tribunals if self-represented litigants submit AI-drafted materials without being able to identify legal inaccuracies.

• Increased workload for courts if AI tools are used by vexatious litigants to initiate matters without a legal basis.

• Privacy and data concerns if sensitive client data is entered into AI drafting tools.

Predicting outcomes of decisions

4.28 Predictive analytics that rely on AI models are said to be able to predict the outcome of legal cases. Lex Machina reportedly performed better than experienced lawyers and scholars at predicting the outcomes of US Supreme Court cases.[37] Predictive analytics have been used in Australia. Macquarie University designed a program to analyse judicial decision-making patterns in migration cases heard in the Federal Circuit Court.[38] But these tools have not been developed or used as widely in Australia as in other jurisdictions.[39]

4.29 Traditionally, predictive analytics used expert systems to identify patterns. These tools developed to incorporate machine learning, which can predict the likely outcome of cases with reportedly good accuracy.[40] But inaccuracy remains an issue, particularly with the underlying data. Predictive analytics tools may be less accurate when they are trained on specific local data that is then applied to a broader set of facts or to another jurisdiction.[41]

4.30 AI tools that predict potential case outcomes in general are different from tools that predict specific outcomes related to a particular judge. General court analytics can provide litigants with broad expectations about the likely outcome of mediation or legal action. In contrast, ‘judicial analytics’ look at the history of individual judges to make predictions about their future behaviour.

4.31 Predictive analytics can provide litigants with information to help them decide whether to bring cases.[42] But these tools come with risks. Critics argue they use ‘data and patterns that ignore legal precedent and the specifics of individual cases.’[43] This data can then influence courtroom tactics, legal arguments and possibly even judicial outcomes if litigants withdraw or settle claims based on predictions.[44]

4.32 France banned the use of predictive ‘judicial analytics’ in 2019, imposing a criminal penalty of up to five years in prison.[45] France’s Constitutional Council said the reform would prevent judicial profiling, which could ‘lead to undesirable pressures on judicial decision-making and strategic behaviour by litigants’.[46]

4.33 In contrast, the ‘smart court’ system in China has deployed judicial analytics for performance management. In China’s Jiangxi Provincial High Court the Trial e-Management Platform system ensures a judge’s activities can be measured and compared to their peers.[47] Judges are encouraged to use judicial analytics to strengthen procedural fairness.

Opportunities

• Predictive analytics can help litigants decide whether to bring cases, avoid unnecessary legal costs or assist in resolving matters early by providing insight into the outcomes of prospective legal action.

Risks

• Judicial analytics may be used by litigants for ‘court shopping’, leading to interference with judicial independence.

• Judicial analytics may influence legal arguments and litigation outcomes.

• Judicial analytics tools could be used to monitor and regulate the performance of judicial members, which may interfere with judicial independence.

Virtual legal advice

4.34 AI may improve access to justice by increasing the accessibility and reducing the cost of legal advice. Chatbots and virtual legal assistants can provide access to ‘basic legal guidance, answer common legal questions, and direct individuals to relevant resources’.[48] People can access virtual services in their own homes, at a time that suits them. In some jurisdictions AI products claim to provide lower cost, more convenient virtual legal options for people who cannot afford traditional legal representation.[49]

4.35 Reliance on generative AI legal advice could diminish the perceived need for professional legal assistance. This could deprive court users of the human judgement that legal professionals provide. AI may in future replace some mechanistic aspects of a lawyer’s role.[50] But critics argue that AI cannot replace lawyers’ professional judgement and their ability to apply humanity and ethics to decision-making[51] (see Part A). Also, AI systems are not bound by the legal and ethical obligations that lawyers are.[52]

4.36 The use of AI to provide legal advice raises issues of unlawful and unqualified legal practice. Engaging in unqualified legal practice is a criminal offence in Victoria.[53] Lawyers must hold a valid practising certificate to be licensed to engage in legal practice.[54] AI tools cannot hold a practising certificate.[55] This raises questions as to who is responsible for virtual legal advice and what people who rely on it can do when things go wrong.[56]

4.37 Another issue with AI legal advice is the risk of inaccuracy. Emerging case law highlights instances where generative AI tools have produced mistaken legal interpretations. This risk is greater for self-represented litigants who rely on AI legal advice without understanding its limitations. This may result in misguided legal actions.

Opportunities

• Improved access to timely legal advice for free or at lower expense.

• Increased access to justice, as publicly funded AI tools can provide universal access to basic legal information and improve access to legal information for socially disadvantaged and marginalised communities.

• Potential to improve the clarity and argumentation of self-represented litigants’ applications.

Risks

• Self-represented litigants may rely on automated legal advice without fully understanding its limitations and commence misguided legal actions.

• Increased burden on the court system, as judges and court staff spend additional time correcting errors and ensuring fairness.

• Provision of unqualified and unlawful legal advice.

In-court hearings

Table 2: Current and potential uses of AI for in-court hearings

Columns (court users; courts and tribunals): Public; SRLs; Legal professionals; Prosecutorial bodies; Court administrators; Judges/Tribunal members; Juries

Rows (uses): Technology assisted review and e-Discovery; Transcription and translation; Evidence

Key

User: Uses an AI system to produce a product or service

Host: Delivers and maintains the AI system

Receiver: Relies on material or services produced by AI

Technology assisted review and e-Discovery

4.38 Discovery is a pre-trial process where a party to a proceeding discloses and makes available relevant documents to other parties. Discovery is critical to support an ‘informed assessment of the risks and prospects faced by parties involved in civil litigation.’[57] In Victoria, parties are obliged to conduct a ‘reasonable search’ for documents relevant to the issues in dispute.[58] The increase in electronic documents, like email, means that the scope of potentially discoverable documents is much greater. Manual discovery is therefore much more time-consuming and expensive.[59]

4.39 ‘Technology assisted review’ is a process that uses computer software to electronically classify documents. It has been used to undertake large-scale document discovery (such as e-Discovery). Technology assisted review initially used predictive coding; more sophisticated models employ machine learning, identifying patterns in textual data to support large-scale document review. The AI is trained on a smaller set of documents, then applies the patterns it has learned to larger document collections. Technology assisted review can use different levels of human involvement in fine-tuning a program’s training, ranging from supervised to unsupervised machine learning.[60] There are several commercial technology assisted review products available in Australia.[61]
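The train-then-apply approach can be sketched in simplified form. The documents below are invented, and the word-overlap scorer is a crude stand-in for the machine learning models commercial products use:

```python
# Minimal sketch of technology assisted review: learn indicative words
# from a small human-labelled seed set, then rank a larger collection by
# relevance. Documents are hypothetical; commercial TAR products use far
# richer machine learning models than this word-overlap scorer.

seed_set = [  # (document text, human reviewer's label)
    ("invoice for delayed concrete supply to the site", "relevant"),
    ("site meeting minutes discussing concrete delays", "relevant"),
    ("office party catering arrangements", "not relevant"),
]

# Words seen in relevant seed documents but not in irrelevant ones.
relevant_words, irrelevant_words = set(), set()
for text, label in seed_set:
    words = set(text.lower().split())
    (relevant_words if label == "relevant" else irrelevant_words).update(words)
indicative = relevant_words - irrelevant_words

def score(document: str) -> int:
    """Count indicative words: a crude stand-in for a trained classifier."""
    return len(set(document.lower().split()) & indicative)

collection = [
    "email chain about concrete supply delays on the project",
    "staff birthday lunch invitation",
]
ranked = sorted(collection, key=score, reverse=True)
print(ranked[0])
```

The key idea this sketch preserves is that human judgments on a small seed set drive the review of the much larger collection, which is why the quality of that seed set (noted under Risks below) is so important.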

4.40 In December 2016, the Supreme Court of Victoria in McConnell Dowell Constructors (Aust) Pty Ltd v Santam Ltd & Ors established a precedent approving the use of technology assisted review in Victoria.[62] The Victorian Supreme Court then developed a practice note stating that legal professionals should educate themselves in the potential use of technology assisted review in the discovery process.[63]

4.41 Technology assisted review has been used to assist in the e-Discovery process in civil matters for years. Similar tools could be used to assist in criminal matters to process large amounts of digital evidence.[64] Criminal defence lawyers have noted an increase in the time and resources required to process digital evidence, including data from:

• mobile phones

• computers

• web searches

• CCTV

• body-worn camera and dash camera footage.[65]

4.42 Some criminal law firms use AI tools to process large amounts of digital evidence. They use AI to identify ‘connections between vast amounts of text, including emails, text messages, other documents and records, as well as phone transcripts’.[66] It is likely these tools will play an increasing role in the management of digital evidence in criminal matters.

Opportunities

• Technology assisted review can reduce the time and effort required to discover documents, allowing legal professionals to focus on more strategic tasks.

• E-Discovery tools can use data analytics to highlight key facts in a case or identify key witnesses.

Risks

• Poor quality training data may lead to inaccurate results from the technology assisted review software, undermining the reliability of the review process.

• Handling sensitive and confidential information through technology assisted review software poses privacy and security risks.

Transcription and translation

4.43 Natural language processing refers to the way AI systems acquire, process and interpret natural language. This includes deep learning methods to enable the conversion of audio to text or to translate one language to another.[67]

4.44 Recently, translation and transcription tools have advanced significantly. They have improved in speed, accuracy and reliability. A competitive industry exists based on voice recognition and transcription. But there is concern about inaccuracy and whether AI systems can contextualise information. Some claim there is an ongoing need for human oversight in the form of editing and reviewing information.[68]

4.45 Commercial applications have been developed for transcription and translation tools in court. There could be more extensive use of real-time voice recognition, transcription and translation services in Victorian courts and tribunals.[69]

4.46 In Canada, court guidance on AI use flagged that the courts’ technology committee will pilot potential uses of AI for internal administration. This includes AI translating court decisions, with results to be reviewed for accuracy by a translator.[70]

4.47 In India, the Supreme Court uses AI to translate decisions and orders into nine local languages.[71] This increases accessibility for people who do not speak English.[72] India’s Tis Hazari Court recently announced a pilot of the use of speech-to-text facilities in court hearings.[73] This aims to streamline the recording of evidence and address case delays due to the lack of court stenographers.[74]

4.48 The New Zealand Chief Justice has announced plans to investigate and pursue ‘automated speech to text hearing transcription’ and ‘automated interpretation’ services in New Zealand’s courts and tribunals.[75]

Opportunities

• Transcribe court proceedings in real time, reducing the time needed to produce transcripts and providing court users with faster access to court records.

• Translate court documents into other languages and provide support to people with hearing impairments.

Risks

• Speech recognition software may not always accurately capture spoken words, especially where there is background noise, technical jargon or unfamiliar accents.

• The use of AI in court transcription may replace aspects of human jobs.

Evidence

4.49 Evidence before courts will increasingly involve AI. In addition to the use of AI to develop expert reports, AI might be used in evidence in other ways. For example:

• predictive models of future events, including risk assessment

• analysis of forensic evidence and other expert evidence

• audio and visual evidence produced or altered by generative AI (records of interview, body camera or dash camera footage). Generative AI also introduces the risk of ‘deepfake’ evidence.

4.50 AI, and evidence concerning AI, may also be of particular relevance to certain causes of action (the set of facts that give rise to legal action). For example, this may include:

• breach of privacy

• discrimination

• copyright infringement

• malicious uses, such as defamation

• cyber breaches

• employment-related issues.[76]

4.51 AI could also give rise to criminal proceedings, such as deepfake offences.[77]

Reducing vicarious trauma through review of evidence

4.52 AI may improve court and tribunal staff’s wellbeing by reducing the need to review traumatic material. Vicarious trauma can occur by exposure to graphic material such as visual images and evidence of violence, including offences against children and sexual assault.[78]

4.53 Vicarious trauma is a significant issue for courts, with a study finding that three-quarters of judicial officers suffered negative effects from vicarious trauma.[79] This is also an issue for court support staff and other professionals interacting with the court system.

4.54 AI tools can be used to reduce vicarious trauma. For example, some law enforcement agencies involved in child sexual abuse cases utilise ‘hash matching’ technology. This technology uses predictive algorithms and machine learning to assign images and videos a unique digital signature.[80] This can automate the processing of documents, so that human staff members are less exposed to disturbing images.[81]
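In its simplest form, hash matching reduces each file to a digital signature and compares it against a database of known signatures, so staff need not view the material itself. The sketch below uses exact cryptographic hashing and an invented signature database; real systems also use perceptual hashing to detect altered copies:

```python
# Minimal sketch of hash matching: each file is reduced to a digital
# signature, which is compared against a database of signatures of known
# material so staff need not view the files. Exact SHA-256 matching only;
# real systems also use perceptual hashes to catch altered copies.

import hashlib

def signature(data: bytes) -> str:
    """Compute a file's digital signature (SHA-256 hex digest)."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of signatures of known harmful material.
known_signatures = {signature(b"known harmful file contents")}

def flag_file(data: bytes) -> bool:
    """True if the file matches a known signature, without human review."""
    return signature(data) in known_signatures

print(flag_file(b"known harmful file contents"))
print(flag_file(b"unrelated holiday photo"))
```

Because flagged files are identified by signature alone, staff can quarantine matches automatically and reserve human review for unmatched material, which is how exposure to disturbing images is reduced.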

4.55 AI systems such as ‘computer vision’, which draw information from images, videos and other visual input, can replace visual images with text. This could reduce exposure to harmful and distressing imagery.

Forensic evidence

4.56 AI is changing how forensic experts involved in criminal investigations work. AI can be used for data aggregation, analytics and pattern recognition.

4.57 AI uses have been identified in forensic fields such as anthropology, odontology, pathology and genetics.[82] Experts in these fields have started to investigate how AI tools can be integrated to analyse data and identify patterns in images and videos.[83]

4.58 AI can support digital forensics, which focuses on evidence found on digital devices or related to computer-based crime.[84] Digital evidence analysis involves data such as surveillance footage, financial transactions, emails and social media.

4.59 Advances in deep neural networks have highlighted the potential for computer vision applications to support forensic experts involved in facial recognition.[85] There is also potential for generative AI to assist in crime scene reconstructions based on machine learning models that combine physical evidence and eyewitness accounts.[86]

4.60 There are risks with the forensic application of AI tools. Racial biases have been identified in the use of computer vision technologies.[87] Robust AI management systems are critical to ensure proper testing and assurance of AI systems and to identify and reduce inaccuracies and biases.

4.61 These developments may spill over into other areas of law, such as the application of AI in medical and actuarial evidence used in personal injury cases.

Expert evidence

4.62 AI will likely be used in the preparation and delivery of expert evidence. Experts may use large language models to prepare evidence for courts and tribunals. This may lead to inaccuracies in court reports and submissions. It was recently reported that Australian academic experts unintentionally relied on fake case studies generated by Google Bard in their submission to a Federal parliamentary inquiry.[88]

4.63 An increase in matters involving AI will also require courts and tribunals to engage with AI experts such as computer scientists on the technology underlying AI systems.

Deepfake evidence and offences

4.64 Generative AI can be used to create ‘deepfake’ materials including text, audio, photos and videos.[89] A deepfake is any form of media ‘that has been altered or entirely or partially created from scratch’.[90] Deepfakes raise issues for courts as they may be submitted as evidence and are difficult to detect.[91] See Part A for further discussion of deepfakes.

4.65 Australia recently created new criminal offences relating to making and distributing sexually explicit deepfake images and videos.[92] Similar offences have been introduced in the United Kingdom.[93]

4.66 Prosecutors, counsel and judicial officers will face difficult questions as they decide whether experts are competent to explain whether evidence tendered in court is legitimate or fabricated. Juries, too, may need to make decisions based upon such expert advice.

Other uses by prosecutorial bodies

4.67 Prosecutorial bodies, which make decisions about who enters the criminal justice system, are using AI. The Victorian Office of Public Prosecutions works in collaboration with law enforcement agencies, including Victoria Police.

4.68 AI is playing a growing role in criminal investigations and policing. AI facial recognition tools are increasingly used by police to identify individuals, including in criminal investigations. The developers claim these tools can process and compare thousands of faces in seconds, reducing the time and resources needed to identify victims and suspects.[94] But facial recognition tools have faced criticism and have been the subject of legal action for violating privacy laws. Australian authorities have raised concerns where companies have collected and used images without individuals’ consent.[95] Australian researchers have called for specific laws to regulate facial recognition technology.[96]

4.69 In response to the growing role of AI in policing, Victoria Police introduced the Victoria Police Artificial Intelligence Ethics Framework.[97] It aims to ensure the use of AI by Victoria Police is ethical and consistent with its existing obligations including:

• Victoria Police’s Organisational Values

• Code of Conduct

• Charter of Human Rights and Responsibilities Act 2006 (Vic).[98]

4.70 The Ethics Framework reflects Victoria Police’s commitment to implementing the Australia New Zealand Policing Advisory Agency’s AI principles (see Part D).[99]

4.71 AI is likely to have an increasing role in sorting and analysing large amounts of digital evidence in criminal cases.[100] This includes phone records and body-worn camera footage.[101] AI tools may reduce the time and resources needed to analyse digital evidence, and prosecutors may be expected to do this. It has been argued that ‘the Crown has a role to play in ensuring any digital evidence is provided in a way that makes it easy to process and analyse’.[102]

4.72 AI will increasingly be used in forensic and expert evidence, including by prosecutorial bodies and within the coronial system. Speech recognition could prepare records of interview with suspects and witnesses. It may also be used to prepare victim impact statements through the transcription of verbal testimony. Generative AI tools could assist prosecutorial bodies by preparing witness statements automatically,[103] or assist law enforcement through suspect sketching, licence plate searches and automated report preparation.[104] Natural language processing tools such as chatbots could use cognitive interview techniques that help witnesses to recall more accurately,[105] and may deploy interview techniques less likely to lead a subject.[106]

4.73 AI technology is likely to be adopted by Victoria’s regulators to improve regulatory practice. This may include using AI tools to monitor compliance and to support enforcement in collecting evidence or preparing court materials.

4.74 As of 2021, there were 62 regulators in Victoria with compliance and enforcement functions, some of which can commence criminal prosecutions.[107] The Environment Protection Authority can pursue sanctions against government, businesses or individuals, including issuing infringement notices and commencing civil proceedings in court.[108] It can commence a criminal prosecution in some circumstances.[109] The Environment Protection Authority’s Strategic Plan 2022-27 recognises the need to take advantage of new technologies including AI to improve its regulatory reach and effectiveness.[110]

Opportunities

• AI excels at pattern recognition, potentially increasing the accuracy and reliability of forensic evidence, digital evidence and evidence in other areas of law, such as medical and actuarial evidence used in personal injury cases.

• Improvements to the quality and efficiency of prosecutorial bodies and coronial system processes, including through speech recognition technology used to prepare records of interview.

• Reduction in the need for court and tribunal staff to review graphic evidence, including evidence of offences against children and sexual assault.

• Increases in the efficiency and effectiveness of compliance and enforcement measures by Victoria’s regulators.

Risks

• The increasing misuse of deepfake evidence in court cases, as well as increasing challenges to the validity of legitimate evidence.

• Unfair or discriminatory outcomes if biases present in training data are not corrected, which is particularly problematic in legal contexts where impartiality is crucial.

• Challenges for expert witnesses, judges and juries in understanding or explaining how AI tools have produced specific outputs.

• Public concerns about procedural fairness where AI is used in regulatory compliance and enforcement.

Alternatives to in-court hearings

Table 3: Current and potential uses of AI as alternatives to in-court hearings

Court users

Courts and Tribunals

Uses

Public

SRLs

Legal professionals

Prosecutorial bodies

Court administrators

Judges/ Tribunal members

Juries

Online alternative dispute resolution

Online dispute resolution within courts and tribunals

Virtual courts and tribunals

Key

User: Uses an AI system to produce a product or service

Host: Delivers and maintains the AI system

Receiver: Relies on material or services produced by AI

Online dispute resolution

4.75 Online dispute resolution is a broad term referring to the use of the internet and technology to help resolve disputes. It can be applied to:

• online alternative dispute resolution outside of courts

• processes within courts and tribunals.[111]

4.76 Online dispute resolution systems have given rise to innovative and interactive applications of AI. These systems can reduce the administrative burden for courts and the financial burden for parties. Their use is discussed in the following sections.

Online alternative dispute resolution

4.77 Online alternative dispute resolution refers to dispute resolution processes outside of courts. It can be administered privately or publicly. The aim is to resolve disputes through technology, using negotiation, mediation and arbitration.[112]

4.78 Online alternative dispute resolution was developed in response to the growth of e-commerce. It was used to deal with ‘high-volume, low-value, consumer disputes arising from online transactions on e-commerce websites such as Amazon, eBay and PayPal’.[113] It allows geographically distant parties to resolve disputes online at low cost, avoiding the need to go to court.[114] These platforms rely on AI to adjudicate disputes and have done so at very large scale: in 2011, eBay’s AI-assisted online dispute resolution system resolved 60 million disputes.[115] Online alternative dispute resolution systems have since been adopted from private commercial disputes into public dispute systems.[116]

4.79 Online alternative dispute resolution previously used expert systems, relying on rules encoded by humans. These systems have developed to use algorithms and machine learning AI technology, which has enabled more advanced mediation and decision-making. Michael Legg gives the example of ‘blind bidding’ systems which use multivariate algorithms to calculate an optimal solution based on how the disputants have ranked their key issues.[117]
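The ‘blind bidding’ approach described above can be illustrated with a minimal sketch. The issue names and point values below are entirely hypothetical, and the allocation rule is a simplified, Adjusted Winner-style heuristic rather than any platform’s actual algorithm: each party privately spreads points across the issues in dispute, and each issue is provisionally allocated to the party who values it more.

```python
# Illustrative sketch of a 'blind bidding' allocation: each party ranks the
# issues in dispute by spreading 100 points across them, and each issue is
# assigned to the party who ranked it higher. Hypothetical data throughout.

def allocate_issues(rankings_a, rankings_b):
    """Assign each issue to the party who ranked it higher,
    breaking ties in favour of the party with fewer points so far."""
    allocation = {}
    totals = {"A": 0, "B": 0}
    for issue in rankings_a:
        a, b = rankings_a[issue], rankings_b[issue]
        if a > b or (a == b and totals["A"] <= totals["B"]):
            allocation[issue] = "A"
            totals["A"] += a
        else:
            allocation[issue] = "B"
            totals["B"] += b
    return allocation, totals

# Each party spreads 100 points across the issues in dispute.
a = {"house": 50, "car": 10, "savings": 40}
b = {"house": 20, "car": 50, "savings": 30}
print(allocate_issues(a, b))
```

A real system would add a balancing step so that neither party’s total satisfaction falls too far below the other’s; the point of the sketch is only that the parties’ private rankings, not a human mediator, drive the proposed split.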

4.80 AI-integrated online alternative dispute resolution tools have been used in a public setting in Australia. Amica is a free online alternative dispute resolution platform designed to help separating couples reach agreement on financial arrangements.[118] It uses AI to guide parties to a resolution without the need for lawyers or courts, by providing:

a suggested division of property based on an analysis of their circumstances; agreements reached by other separating couples in similar situations; and how courts generally handle disputes of the same nature.[119]

Online dispute resolution within courts and tribunals

4.81 Internationally, online dispute resolution techniques have been integrated into existing court and tribunal processes. They can support parties in different ways, from information gathering and obtaining advice, through direct and supported negotiation, to adjudication.[120]

4.82 British Columbia’s Civil Resolution Tribunal is an AI-assisted online alternative dispute resolution system.[121] It has four phases:

1. A purpose-built expert system, Solution Explorer, asks parties questions to understand the legal claim, classify and narrow the matters in dispute, and provide tailored legal information and appropriate forms.

2. An automated negotiation tool is used to support interparty communication and prepare draft agreements.

3. A facilitation phase is undertaken with an expert facilitator to help parties reach a consensual agreement.

4. If parties are still unable to reach agreement the matter proceeds to adjudication by a Tribunal Member.[122]
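The first of the four phases above, a purpose-built expert system, can be sketched as a simple decision tree that classifies a claim and points the user to tailored information. The questions, claim categories and form names below are invented for illustration; they are not the Civil Resolution Tribunal’s actual rules.

```python
# Illustrative sketch of a rules-based expert system (phase 1): a decision
# tree asks yes/no questions to classify a claim and return tailored
# information. The tree contents are hypothetical.

TREE = {
    "question": "Is the dispute about money owed under an agreement?",
    "yes": {
        "question": "Is the amount $5,000 or less?",
        "yes": {"category": "small claim (debt)",
                "info": "See the small claims guide and the relevant form."},
        "no": {"category": "contract dispute",
               "info": "See the contract disputes guide."},
    },
    "no": {"category": "other",
           "info": "See the general information page."},
}

def triage(node, answers):
    """Walk the decision tree using a list of yes/no answers."""
    while "question" in node:
        node = node["yes"] if answers.pop(0) else node["no"]
    return node

result = triage(TREE, [True, True])
print(result["category"])  # small claim (debt)
```

Because the rules are hand-encoded rather than learned, every classification can be traced back to the exact questions asked, which is one reason expert systems have been favoured for this front-end triage role.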

4.83 One study of the Civil Resolution Tribunal reported high levels of user satisfaction.[123] On average, the process takes 90 days from start to resolution.[124]

4.84 The Singaporean judiciary recently partnered with Harvey AI to develop a generative AI tool to assist self-represented litigants. The tool aims to support litigants in small claims to navigate the legal process, and potentially assist Small Claims Tribunal magistrates in examining evidence.[125] The Harvey AI tool will use generative AI to provide automated pre-court advice and assistance with court processes, including the auto-filling of forms. It may also advise on the likely outcomes of a claim and prompt parties to reach a settlement or consider mediation.[126]

Virtual courts and tribunals

4.85 Virtual or remote courtrooms allow judges, tribunal members, court staff, lawyers and witnesses to participate through video conferencing. The hearing is conducted virtually, including the giving of evidence and the making of oral submissions.

4.86 Virtual courts do not generally involve AI. But the development of the ‘Smart Court’ system in China shows how AI can be used in virtual court platforms. The Smart Court system has integrated AI technology to confirm litigant identity through facial recognition and to provide enhanced voice-to-text transcription in court proceedings.[127]

4.87 In Australia, the COVID-19 pandemic accelerated the use of technology in courts. In-person hearings, civil trials and conferences were replaced with virtual equivalents.[128]

4.88 Australian courts and tribunals have also adopted online case management systems.[129] In New South Wales an online court has operated since 2018, providing case management for select local court matters.[130] So far, virtual courtrooms and case management systems in Australia have not evolved to incorporate AI. But the advances in AI courtroom technology in China show how AI may be used in the context of online courts and tribunals globally.

Opportunities

• Online alternative dispute resolution systems can guide parties to a resolution without the intervention of lawyers or courts.

• The addition of natural language processing-based AI tools may assist with high-volume, low-complexity legal matters, including by simplifying the legal process for court users and assisting in the drafting of submissions or forms.

Risks

• Virtual court forums have heightened vulnerability to deepfake technology, such as voice cloning used to defeat identity verification.

Supporting judicial decision-making

Table 4: Current and potential uses of AI in judicial decision-making

Court users

Courts and Tribunals

Uses

Public

SRLs

Legal professionals

Prosecutorial bodies

Court administrators

Judges/ Tribunal members

Juries

Case summaries

Risk assessment, bail and criminal sentencing

Automated decision-making

Key

User: Uses an AI system to produce a product or service

Host: Delivers and maintains the AI system

Receiver: Relies on material or services produced by AI

4.89 The potential uses of AI to support judges and tribunal members in their role as decision-makers are discussed in the section below.

4.90 Advances in AI over the last decade have led to a wide range of tools that might support judicial decision-making, and some jurisdictions are already using them.[131] These tools can automate routine tasks, saving judicial officers time, and can provide context or technical advice for judicial decision-making.

4.91 But AI may be used in ways that have significant consequences for fundamental principles of justice and raise serious ethical dilemmas, for example where it influences or removes judicial discretion in offender risk assessments or criminal sentencing. These risks and their implications for principles of justice are discussed in Parts A and C and are further addressed below.

Preparation and review of pre- and post-hearing summaries

4.92 Generative AI tools could assist court staff in the preparation of case summaries. Internationally, a summary of a case is sometimes prepared for judicial members when receiving a new case for hearing. In England and Wales, Lord Justice Birss highlighted the potential use of AI in providing case summaries, which he saw as useful in assisting a judge to get across a case more quickly.[132]

4.93 AI could assist in identifying key points and arguments by automatically summarising information. Some tools claim to provide capabilities that could enable judges or tribunal members to review and scrutinise summaries prepared with that technology.[133]

Opportunities

• Assistance to judicial members in the preparation of pre- and post-hearing summaries, saving time and reducing errors.

Risks

• Indirectly shaping judicial decision-making where automated case summaries omit relevant facts or place emphasis on immaterial issues.

Risk assessment, bail and criminal sentencing

4.94 AI might be used to inform decisions about bail and sentencing.

4.95 Predictive tools that assist with risk assessment and criminal sentencing are common in the United States. The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is an algorithm used to assess an offender’s risk of reoffending. It draws on historical case data and the offender’s conduct and background.[134]

4.96 In State of Wisconsin v Loomis, Justice Bradley found the use of COMPAS was permissible for certain aspects of sentencing if the judge was notified of the tool’s limitations and made the final sentencing determination.[135] The case created a procedural safeguard by requiring a written advisement to accompany any pre-sentence report that included a COMPAS risk assessment.[136] The advisement must set out the limitations and risks associated with COMPAS. Some have argued this is an ineffective precaution because it does not account for a judge’s capacity to evaluate the quality of the tool’s output.[137]

4.97 The Loomis case is also significant because of the proprietary nature of the AI technology underpinning COMPAS. The Wisconsin Supreme Court found that despite the methodology of COMPAS not being available to the court or the defendant, as it was a trade secret, the use of COMPAS in sentencing did not violate the defendant’s right to due process.[138] This case illustrates the ‘black box problem’ for more advanced AI tools, where it may not be technically possible to determine how the algorithm reached a specific result.

4.98 There are significant risks of bias associated with AI predictive analytic tools. An analysis of COMPAS by ProPublica found the tool had systematised racial bias against African Americans.[139] African American defendants were more likely to be flagged as ‘high risk’, despite not in fact being high risk, compared to white defendants.[140]
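The disparity ProPublica reported can be made concrete with a minimal sketch of the underlying measure: the false-positive rate, that is, the share of people flagged ‘high risk’ who did not in fact reoffend, compared across groups. The records below are invented for illustration and do not reproduce ProPublica’s data.

```python
# Illustrative sketch of the bias measure behind the ProPublica analysis:
# comparing false-positive rates across two groups. Invented records only.

def false_positive_rate(records):
    """Share of non-reoffenders who were nonetheless flagged high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

group_a = [
    {"high_risk": True, "reoffended": False},
    {"high_risk": True, "reoffended": True},
    {"high_risk": False, "reoffended": False},
    {"high_risk": True, "reoffended": False},
]
group_b = [
    {"high_risk": False, "reoffended": False},
    {"high_risk": True, "reoffended": True},
    {"high_risk": False, "reoffended": False},
    {"high_risk": True, "reoffended": False},
]

# A large gap between the two rates is the kind of disparate error rate
# ProPublica reported: here group A's rate is double group B's.
print(false_positive_rate(group_a), false_positive_rate(group_b))
```

The point of the measure is that a tool can appear accurate overall while distributing its errors unevenly, so that one group bears far more wrongful ‘high risk’ flags than another.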

4.99 Evidence suggests decision-makers struggle to determine if or when they should diverge from an algorithmic recommendation.[141] Ben Green argues that policymakers should not rely on individual ‘human oversight’ to mitigate the potential harms of algorithms; instead, ‘institutional oversight’ is needed, in which institutions outline the rationale for specific technologies.[142]

4.100 AI predictive analytics tools have also been used to influence decision-making by prosecutorial bodies. In London in 2022, the Metropolitan Police faced legal action over its use of the Gangs Violence Matrix, a predictive analytics tool used to forecast gang-related violence in London.[143] The system produced a risk rating which informed whether police would exercise their stop and search powers. The tool was ultimately found to be unlawful because it was racially discriminatory and contravened human rights.[144]

4.101 Generative AI has also been used to inform bail decisions. In 2023, the Punjab and Haryana High Court relied on ChatGPT to support a bail determination. Justice Chitkara disclosed the use of ChatGPT and its purpose, explaining that it was used to provide context about bail jurisprudence rather than to address matters of fact or the merits of the case.[145]

Opportunities

• Support for judicial decision-making where predictive analytics tools provide accurate and reliable statistically based risk assessments for bail and criminal sentencing outcomes.

Risks

• Outcomes that cannot be explained where decisions are informed by opaque AI models.

• Perpetuation of existing human biases where AI systems trained on historical case data are not adjusted to correct for them.

Judicial review of administrative decisions

4.102 The rise of AI systems to support or fully automate administrative decisions will impact courts and tribunals. It is likely to result in a significant increase in the judicial review of administrative decisions made using AI systems.

4.103 In New South Wales a survey showed 136 automated decision-making systems were reportedly in use, being piloted or planned for use within three years across 77 government departments and agencies.[146] This survey found the most common use or planned use of AI was for compliance functions.[147] Some organisations also reported using automated decision-making systems for adjudication and justice.[148] The types of systems included fully automated systems, rules-based systems, structured decision-making tools, risk assessment tools and natural language processing tools (chatbots and large language models). Automated decision-making systems were more often used to support, rather than replace, human decision-making.[149]

Automated decision-making

4.104 AI systems can potentially be designed to replace judicial discretion through automated decision-making. But fully automating decisions raises significant ethical and legal issues.

4.105 In other jurisdictions, like the European Union, the use of AI in judicial decision-making has been categorised as high-risk and protected as a ‘human driven activity’.[150] The EU AI Act 2023 clarifies that ‘high-risk’ does not include ‘ancillary administrative activities…’ that do not affect the ‘…actual administration of justice in individual cases’.[151]

4.106 The risks and benefits vary significantly depending on how automated decision-making might be used in courts and tribunals. It may be suitable for less complex, routine decisions that do not involve much discretion, or in areas that relate to classification. Complex issues that involve discretion or legal reasoning are far less likely to be suitable for automated decision-making. Similarly, criminal matters involving a person’s liberty may be less suitable than civil or procedural matters. As Tania Sourdin notes, ‘there are some opportunities for AI processes to support judges and potentially supplant them. Initially, however, the impacts are likely to be confined to lower level decision-making’.[152]

4.107 International case studies provide insight into the risks and benefits attributed to automated decision-making. In Mexico, the EXPERTIUS system advises judicial members on specific administrative matters, such as whether a plaintiff is eligible to receive social benefits.[153]

4.108 In China, the ‘Smart Court’ system has integrated AI, including to automate judgments in some cases.[154] This may provide efficiency benefits. But it raises the risk of automated drafting tools influencing judicial independence (as discussed in Part C).

4.109 In other jurisdictions, the potential for automated decision-making to displace judicial decision-making has been too contentious to implement. In 2019, Estonia’s Chief Data Officer reportedly outlined a plan to use AI to automate adjudication of small claim disputes. This drew international attention to the potential replacement of judges by AI.[155] The Estonian Ministry of Justice later clarified it was not seeking to replace human judges and its focus was on automating procedural steps.[156] It has since outlined a range of ways that AI systems support court procedures, including text translation, transcription, facial recognition for identity verification, and anonymisation and de-identification of court records.[157]

4.110 Other countries have expressly ruled out possible uses of automated judicial decision-making for now. In July 2020, Canada’s Federal Court Strategic Plan stated that:

At this point in time, AI is not being considered to assist with the adjudication of contested disputes. Rather, the Court is exploring how AI may assist it to streamline certain of its processes (e.g., the completion of online ‘smart forms’) and may be a potential aid in mediation and other types of alternative dispute resolution.[158]

Questions

5. How is AI being used by:

a. Victorian courts and tribunals

b. legal professionals in the way they interact with Victorian courts and tribunals

c. the public including court users, self-represented litigants and witnesses?

6. Are there uses of AI that should be considered high-risk, including in:

a. court and tribunal administration and pre-hearing processes

b. civil claims

c. criminal matters

How can courts and tribunals manage those risks?

7. Should some AI uses be prohibited at this stage?


  1. Heidi Alexander, ‘Easy Automation’ (2019) 45(4) Law Practice 32, 37.

  2. Nintex, ‘Revitalizing Public Service with Process Automation: A Case Study on County Court of Victoria’, IoT ONE (Web Page) <https://www.iotone.com/case-study/revitalizing-public-service-with-process-automation-a-case-study-on-county-court-of-victoria/c6934>.

  3. Victorian Civil and Administrative Tribunal, VCAT Residential Tenancies Hub – Login (Web Page) <https://online.vcat.vic.gov.au/vol/common/login.jsp>; Notably VCAT Online began as the first electronic filing system implemented in Australia see Marco Fabri and Giampiero Lupo, Judicial Electronic Data Interchange in Europe: Applications, Policies and Trends (Report, Research Institute on Judicial Systems of the National Research Council of Italy (IRSIG – CNR), 2003) 86; Marco Fabri and Giampiero Lupo, Some European and Australian E-Justice Services (Report, Research Institute on Judicial Systems of the National Research Council of Italy (IRSIG – CNR), 19 October 2012) 20 <https://www.cyberjustice.ca/en/publications/some-european-and-australian-e-justice-service/>.

  4. Operating with the General List in the District Court of NSW since October 2018, the NSW Online Court is active in selected local court matters, the Corporations and Equity General Lists of the Supreme Court, the General List of the District Court at Sydney and the Land and Environment Court: NSW Government, ‘NSW Online Court’, NSW Government, Communities and Justice (Web Page, 3 December 2023) <https://courts.nsw.gov.au/courts-and-tribunals/going-to-court/online-services/online-court.html>. Another example is the Common Platform, a case management system used in all criminal courts in England and Wales which allows parties to serve documents and access self-service case materials: ‘HMCTS Common Platform: Registration for Defence Professionals’, GOV.UK (Web Page, 10 June 2024) <https://www.gov.uk/guidance/hmcts-common-platform-registration-for-defence-professionals>.

  5. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration Incorporated, December 2023) 35–6.

  6. For instance, IntellidactAI is used in the US court system to process e-Filed documents. Computing System Innovations, Intellidact AI Data Redaction (Web Page, 2022) <https://csisoft.com/intellidact/>.

  7. Fausto Martin De Sanctis, ‘Artificial Intelligence and Innovation in Brazilian Justice’ (2021) 59(1) International Annals of Criminology 1, 2–3 <https://www.cambridge.org/core/product/identifier/S0003445221000040/type/journal_article>; Pedro Henrique Luz de Araujo et al, ‘VICTOR: A Dataset for Brazilian Legal Documents Classification’ in Nicoletta Calzolari et al (eds), Proceedings of the Twelfth Language Resources and Evaluation Conference (Conference Paper, LREC 2020, 11–16 May 2020) 1449, 1450 <https://aclanthology.org/2020.lrec-1.181>. Appeals heard by the Brazilian Supreme Court must meet the jurisdictional requirements set out in Constitutional Amendment 45/04, commonly referred to in Brazilian jurisprudence as a ‘topic of general repercussion’.

  8. Ibid.

  9. For example, the legal research tool ROSS is claimed to be able to ‘sift through over a billion text documents a second and return the exact passage the user needs’: D Farrands, ‘Artificial Intelligence and Litigation – Future Possibilities’, Handbook for Judicial Officers – Artificial Intelligence (Judicial Commission of New South Wales, September 2022) <https://www.judcom.nsw.gov.au/publications/benchbks/judicial_officers/artificial_intelligence_and_litigation.html>.

  10. See for example LexisNexis, Lexis+ AI: Conversational Search Platform (Web Page) <https://www.lexisnexis.com.au/en/products-and-services/lexis-plus-ai>.

  11. D Farrands, ‘Artificial Intelligence and Litigation – Future Possibilities’, Handbook for Judicial Officers – Artificial Intelligence (Judicial Commission of New South Wales, September 2022) <https://www.judcom.nsw.gov.au/publications/benchbks/judicial_officers/artificial_intelligence_and_litigation.html>.

  12. Varun Magesh et al, ‘Hallucination-Free? Assessing the Reliability of Leading AI Legal Research Tools’ (arXiv, 2024) <https://arxiv.org/abs/2405.20362>.

  13. See Chief Justice Andrew Bell, ‘Truth Decay and Its Implications: An Australian Perspective’ (Speech, 4th Judicial Roundtable, Durham University, 23 April 2024) <https://supremecourt.nsw.gov.au/supreme-court-home/about-us/speeches/chief-justice.html>. In terms of specific cases, see Mata v Avianca, Inc 678 F.Supp.3d 443 (2023), in which a federal judge imposed fines on a legal firm whose lawyers blamed ChatGPT for their submission of fictitious legal research about an aviation injury. See also People v Zachariah C Crabhill 23PDJ067; Park v Kim 91 F.4th 610 (2024); In Re: Thomas G Neusom, Esq Respondent [2024] MD Fla 2:23-cv-00503-JLB-NPM; Zhang v Chen [2024] BCSC 285.

  14. Lise Embley, Getting Started with a Chatbot (JTC Quick Response Bulletin, National Centre for State Courts Joint Technology Committee, 20 April 2020).

  15. Fan Yang, Jake Goldenfein and Kathy Nickels, GenAI Concepts: Technical, Operational and Regulatory Terms and Concepts for Generative Artificial Intelligence (GenAI) (Report, ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S), and the Office of the Victorian Information Commissioner (OVIC), 2024) 6 <https://apo.org.au/node/327400>.

  16. For example, the United States has developed administrative guidance in the development of court chatbots see Aubrie Souza and Zach Zarnow, Court Chatbots: How to Build a Great Chatbot for Your Court’s Website (Report, National Centre for State Courts, 2024); Lise Embley, Getting Started with a Chatbot (JTC Quick Response Bulletin, National Centre for State Courts Joint Technology Committee, 20 April 2020).

  17. ‘Ottawa County Testing Out New Staffer: CORA the Robot’, Ottawa County, Michigan (Blog Post, 14 November 2018) <https://content.govdelivery.com/accounts/MIOTTAWA/bulletins/21b6ded>.

  18. Ibid.

  19. The next phase for Michigan’s CORA will include translation across 30 languages, instant messaging and automated onsite payment of traffic fines and child support. See Marcus Reinkensmeyer and Raymond Billotte, ‘Artificial Intelligence (AI): Early Court Project Implementations and Emerging Issues’ (August 2019) 34(3) Court Manager: a publication of the National Association for Court Management <https://thecourtmanager.org/articles/artificial-intelligence-ai-early-court-project-implementations-and-emerging-issues/>.

  20. ‘Traffic Division’, Superior Court of California, County of Los Angeles (Web Page, 2024) <https://www.lacourt.org/division/traffic/traffic2.aspx>; See also Jumpei Komoda, ‘Designing AI for Courts’ (2023) 29(3) Richmond Journal of Law & Technology 145.

  21. Ibid.

  22. In the early 2000s, Victoria Legal Aid trialled the GetAid system, a rules-based program intended to deliver a test assessing whether a person should receive legal aid. VLA experts tested and developed the program, and GetAid was used in-house before being discarded after five years: see John Zeleznikow, ‘Can Artificial Intelligence and Online Dispute Resolution Enhance Efficiency and Effectiveness in Courts’ (2017) 8(2) International Journal for Court Administration 30, 37–8 <https://iacajournal.org/articles/10.18352/ijca.223>.

  23. ‘Bringing AI to the Legal Help Ecosystem with a Free Licence for NFPs’, Justice Connect (Web Page, 7 March 2023) <https://justiceconnect.org.au/fairmatters/bringing-ai-to-the-legal-help-ecosystem-with-a-free/>.

  24. Ibid.

  25. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration Incorporated, December 2023) 11.

  26. Ibid.

  27. James Mohun and Alex Roberts, Cracking the Code: Rulemaking for Humans and Machines (OECD Working Papers on Public Governance No 42, 2020) 49 <https://www.oecd-ilibrary.org/governance/cracking-the-code_3afe6ba5-en>.

  28. NSW Government Beyond Digital Team, ‘In an Australian First, NSW Is Translating Rules as Code to Make Compliance Easy’, Digital NSW (Web Page, 21 July 2020) <https://www.digital.nsw.gov.au/article/an-australian-first-nsw-translating-rules-as-code-to-make-compliance-easy>.

  29. Ibid.

  30. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration, December 2023) 9.

  31. Cary Coglianese, Maura Grossman and Paul W Grimm, ‘AI in the Courts: How Worried Should We Be?’ (2024) 107(3) Judicature 65, 69 <https://judicature.duke.edu/articles/ai-in-the-courts-how-worried-should-we-be/>; Vasiliy A Laptev and Daria R Feyzrakhmanova, ‘Application of Artificial Intelligence in Justice: Current Trends and Future Prospects’ [2024] Human-Centric Intelligent Systems 9 <https://doi.org/10.1007/s44230-024-00074-2>.

  32. For an example of an AI legal drafting tool see Thomson Reuters AI CoCounsel ‘CoCounsel Drafting – Save Valuable Time with CoCounsel Drafting’, Thomson Reuters (Web Page) <https://legal.thomsonreuters.com/en/products/cocounsel-drafting>.

  33. Looplegal, ‘The Future of Online Contract Review: Trends and Predictions’, Medium (Web Page, 15 December 2023) <https://medium.com/@looplegaluk/the-future-of-online-contract-review-trends-and-predictions-c08bea1635b8>.

  34. Youssef v Eckersley [2024] QSC 35. In Youssef v Eckersley & Anor, the Court had no problem with the plaintiff, a self-represented litigant, using ChatGPT to assist with organisational structure and additional flourishes in the submissions, in part because the plaintiff was upfront in disclosing the purpose and specific uses of the AI tool.

  35. Director of Public Prosecutions (ACT) v Khan [2024] ACTSC 19.

  36. Ibid.

  37. Lyria Bennett Moses, ‘Artificial Intelligence in the Courts, Legal Academia and Legal Practice’ (2017) 91(7) Australian Law Journal 561, 566.

  38. Monika Zalnieriute, Technology and the Courts: Artificial Intelligence and Judicial Impartiality (Submission No. 3 to Australian Law Reform Commission Review of Judicial Impartiality, June 2021) 6 <https://www.alrc.gov.au/wp-content/uploads/2021/06/3-.-Monika-Zalnieriute-Public.pdf>.

  39. Pamela Stewart and Anita Stuhmcke, ‘Judicial Analytics and Australian Courts: A Call for National Ethical Guidelines’ (2020) 45(2) Alternative Law Journal 82, 83 <https://search.informit.org/doi/epdf/10.3316/informit.247278180194024>.

  40. Monika Zalnieriute, Technology and the Courts: Artificial Intelligence and Judicial Impartiality (Submission No. 3 to Australian Law Reform Commission Review of Judicial Impartiality, June 2021) 6 <https://www.alrc.gov.au/wp-content/uploads/2021/06/3-.-Monika-Zalnieriute-Public.pdf>.

  41. This was observed by Chief Justice Gleeson in relation to the small number of High Court cases, as cited in Pamela Stewart and Anita Stuhmcke, ‘Judicial Analytics and Australian Courts: A Call for National Ethical Guidelines’ (2020) 45(2) Alternative Law Journal 82, 84 <https://search.informit.org/doi/epdf/10.3316/informit.247278180194024>.

  42. Ibid.

  43. Ibid 85.

  44. Ibid.

  45. Jena McGill and Amy Salyzyn, ‘Judging by the Numbers: Judicial Analytics, the Justice System and Its Stakeholders’ (2021) 44(1) Dalhousie Law Journal 249, 250, referencing LOI n° 2019-222 du 23 mars 2019 de programmation 2018–2022 et de réforme pour la justice (1), 24 March 2019, Article 33 <https://www.legifrance.gouv.fr/jorf/article_jo/JORFARTI000038261761?r=Xox7hUcdZ5>, as translated by Rebecca Loescher, a professor of French at St. Edward’s University, and reported in Jason Tashea, ‘France Bans Publishing of Judicial Analytics and Prompts Criminal Penalty’, ABA Journal (7 June 2019) <http://www.abajournal.com/news/article/france-bans-and-creates-criminalpenalty-for-judicial-analytics>.

  46. Jena McGill and Amy Salyzyn, ‘Judging by the Numbers: Judicial Analytics, the Justice System and Its Stakeholders’ (2021) 44(1) Dalhousie Law Journal 249, citing Conseil Constitutionnel, Loi de Programmation et de Réforme Pour La Justice [Law on Programming and Reform for the Justice System] (Décision n° 2019-778 DC, 21 March 2019) <https://www.conseil-constitutionnel.fr/rapport-activite-2019-numerique/dc-2019-778.php>.

  47. Straton Papagianneas and Nino Junius, ‘Fairness and Justice through Automation in China’s Smart Courts’ (2023) 51 Computer Law & Security Review 105897, 6 <https://www.sciencedirect.com/science/article/pii/S0267364923001073>.

  48. Samuel Hodge, ‘Revolutionizing Justice: Unleashing the Power of Artificial Intelligence’ (2023) 26(2) SMU Science and Technology Law Review 217, 229 <https://scholar.smu.edu/scitech/vol26/iss2/3/>.

  49. AI Lawtech, AI Lawyer: Your Personal Legal AI Assistant (Web Page, 2024) <https://ailawyer.pro/>; DoNotPay, ‘Save Time and Money with DoNotPay!’, DoNotPay (Web Page, 2024) <https://donotpay.com/>; Legal Robot Inc, ‘Know What You Sign’, Legal Robot (Web Page) <https://legalrobot.com/>.

  50. Michael Legg and Felicity Bell, ‘Artificial Intelligence and the Legal Profession: Becoming the AI-Enhanced Lawyer’ (2019) 38(2) University of Tasmania Law Review 34, 35–6.

  51. Ibid 55–6.

  52. Ibid 56.

  53. Legal Profession Uniform Law Application Act 2014 (Vic) Sch 1, Pt 2.1 s 10.

  54. ‘Unqualified Legal Practice’, Victorian Legal Services Board + Commissioner (Web Page, 2 August 2022) <https://lsbc.vic.gov.au/consumers/registers-lawyers/unqualified-legal-practice>.

  55. Note in MillerKing, LLC v DoNotPay, Inc 3:23-CV-863-NJR, 2023 WL 8108547, the law firm MillerKing LLC challenged the online subscription service DoNotPay Inc, which offers a ‘robot lawyer’ to consumers for legal services, on the basis that DoNotPay’s robot lawyer was not actually licensed to practise law. The United States District Court dismissed the case, due in part to a lack of standing.

  56. Mia Bonardi and L Karl Branting, ‘Certifying Legal Assistants for Unrepresented Litigants: A Global Survey of Access to Civil Justice, Unauthorised Practice of Law’ (SSRN, 22 July 2024) 9–10 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4901658>.

  57. Judicial College of Victoria, ‘3.1 Scope of Discovery’, Civil Procedure Bench Book (Online Manual, 2024) <https://resources.judicialcollege.vic.edu.au/article/1041737/section/2248>.

  58. Ibid.

  59. Michael Legg and Felicity Bell, ‘Artificial Intelligence and the Legal Profession: Becoming the AI-Enhanced Lawyer’ (2019) 38(2) University of Tasmania Law Review 34, 44.

  60. Ibid 44–48.

  61. Commercial examples include ‘Nuix Discover Technology’, Nuix (Web Page, 2024) <https://www.nuix.com/technology/nuix-discover>; Relativity ODA LLC, ‘Relativity – eDiscovery & Legal Search Software Solutions’, Relativity (Web Page, 2024) <https://www.relativity.com/>; Computing System Innovations, Intellidact AI Data Redaction (Web Page, 2022) <https://csisoft.com/intellidact/>.

  62. McConnell Dowell Constructors (Aust) Pty Ltd v Santam Ltd (No 1) [2016] VSC 734; (2016) 51 VR 421. In the course of arbitration between the plaintiffs and defendants over the liability of the defendants as insurers to compensate McConnell Dowell Constructors for the failed development of a natural gas pipeline in Queensland, 1.4 million documents were identified. The parties agreed to a TAR protocol where only 20,000 documents would be reviewed, assisted by a computer algorithm. The protocols required a ‘validation round’ to ensure the algorithm’s outcomes were consistent with the human assessment.

  63. Supreme Court of Victoria, Practice Note SC Gen 5: Technology in Civil Litigation (Practice Note, 29 June 2018) <https://www.supremecourt.vic.gov.au/sites/default/files/2018-10/gen_5_use_of_technology_first_revision.pdf>.

  64. Karin Derkley, ‘Digital Discovery’ (2024) 98(9) Law Institute Journal 11 <https://www.liv.asn.au/Web/Law_Institute_Journal_and_News/Web/LIJ/Year/2024/09September/Digital_discovery.aspx?_zs=LK4dl&_zl=eHo43>.

  65. Ibid.

  66. Ibid.

  67. Standards Australia Limited, ‘AS ISO/IEC 22989:2023 Information Technology – Artificial Intelligence – Artificial Intelligence Concepts and Terminology’ 14, 51–2 <https://www.standards.org.au/standards-catalogue/standard-details?designation=as-iso-iec-22989-202>.

  68. Kim Neeson, ‘Is AI Coming to a Legal Transcript Near You?’ [2019] The Lawyers Daily <https://www.lexisnexis.ca/en-ca/sl/2019-06/is-AI-coming-to-a+legal-transcript-near-you.page>.

  69. For instance, see ‘EpiqFAST: Produce and Securely Share Live Transcripts of Legal Proceedings’, Epiq (Web Page)

    <https://www.epiqglobal.com/en-gb/services/court-reporting/epiqfast>.

  70. Federal Court of Canada, Interim Principles and Guidelines on the Court’s Use of Artificial Intelligence (Web Page, 20 December 2023) 1 <https://www.fct-cf.gc.ca/en/pages/law-and-practice/artificial-intelligence>.

  71. The United Nations Educational, Scientific and Cultural Organization, Global Toolkit on AI and the Rule of Law for the Judiciary

    (Report No CI/DIT/2023/AIRoL/01, 2023) 73 <https://unesdoc.unesco.org/ark:/48223/pf0000387331>.

  72. Ibid.

  73. ‘AI Takes Legal Action: Delhi Gets First “Pilot Hybrid Court”; Here’s How It Will Work’, The Times of India (online, 21 July 2024) <https://timesofindia.indiatimes.com/city/delhi/ai-takes-legal-action-delhi-gets-first-pilot-hybrid-court-heres-how-it-will-work/articleshow/111875546.cms>.

  74. Ibid.

  75. Office of the Chief Justice of New Zealand, Digital Strategy for Courts and Tribunals (Report, Courts of New Zealand, March 2023) 27 <https://www.courtsofnz.govt.nz/assets/7-Publications/2-Reports/20230329-Digital-Strategy-Report.pdf>.

  76. New York State Bar Association, Report and Recommendations of the New York State Bar Association Task Force on Artificial Intelligence (Report, New York State Bar Association Task Force on Artificial Intelligence, April 2024) 49.

  77. Ibid.

  78. Jill Hunter et al, ‘A Fragile Bastion: UNSW Judicial Traumatic Stress Study’ (2021) 33(1) Judicial Officers’ Bulletin 1, 7.

  79. Ibid 1.

  80. Sigurður Ragnarsson, ‘Using Automated Image Moderation Solutions to Reduce Human Trauma’, Videntifier (Web Page, 18 October 2023) <https://www.videntifier.com/post/using-automated-image-moderation-solutions-to-reduce-human-trauma>.

  81. Ibid.

  82. Nicola Galante, ‘Applications of Artificial Intelligence in Forensic Sciences: Current Potential Benefits, Limitations and Perspectives’ (2023) 137 International Journal of Legal Medicine 445 <https://link.springer.com/article/10.1007/s00414-022-02928-5>.

  83. Ibid 448.

  84. Faye Mitchell, ‘The Use of Artificial Intelligence in Digital Forensics: An Introduction’ (2014) 7 Digital Evidence and Electronic Signature Law Review 35 <http://journals.sas.ac.uk/deeslr/article/view/1922>.

  85. Gaurav Gogia and Parag Rughani, ‘An ML Based Digital Forensics Software for Triage Analysis through Face Recognition’ (2023) 17(2) Journal of Digital Forensics, Security and Law Article 6 <https://commons.erau.edu/jdfsl/vol17/iss2/6>.

  86. Mfundo A Maneli and Omowunmi E Isafiade, ‘3D Forensic Crime Scene Reconstruction Involving Immersive Technology: A Systematic Literature Review’ (2022) 10 IEEE Access 88821 <https://ieeexplore.ieee.org/document/9858116/?arnumber=9858116>.

  87. Nayeon Lee et al, ‘Survey of Social Bias in Vision-Language Models’ (arXiv, 24 September 2023) 10 <http://arxiv.org/abs/2309.14381>.

  88. Henry Belot, ‘Australian Academics Apologise for False AI-Generated Allegations against Big Four Consultancy Firms’, The Guardian (online, 3 November 2023) <https://www.theguardian.com/business/2023/nov/02/australian-academics-apologise-for-false-ai-generated-allegations-against-big-four-consultancy-firms>.

  89. Robert Chesney and Danielle K Citron, ‘Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security’ (2019) 107 California Law Review 1753, 1785–6 <https://www.californialawreview.org/print/deep-fakes-a-looming-challenge-for-privacy-democracy-and-national-security>.

  90. Miriam Stankovich et al, Global Toolkit on AI and the Rule of Law for the Judiciary (Report No CI/DIT/2023/AIRoL/01, UNESCO, 2023) 20 <https://unesdoc.unesco.org/ark:/48223/pf0000387331>.

  91. Agnieszka McPeak, ‘The Threat of Deepfakes in Litigation: Raising the Authentication Bar to Combat Falsehood’ (2021) 23(2) Vanderbilt Journal of Entertainment & Technology Law 433, 438–9 <https://www.vanderbilt.edu/jetlaw/2021/04/09/the-threat-of-deepfakes-in-litigation-raising-the-authentication-bar-to-combat-falsehood/>.

  92. Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (Cth).

  93. Ibid. Sexual Offences Act 2003 (UK) ss 66A, 66B introduced offences for the sending and sharing of intimate images.

  94. ‘Clearview AI Principles’, Clearview AI (Web Page, 2024) <https://www.clearview.ai/principles>. Clearview AI is a facial recognition tool which claims to have a 99% accuracy rate in identifying individuals in any given photo.

  95. Rita Matulionyte, ‘Australia’s Privacy Regulator Just Dropped Its Case against “Troubling” Facial Recognition Company Clearview AI. Now What?’, The Conversation (online, 22 August 2024) <https://theconversation.com/australias-privacy-regulator-just-dropped-its-case-against-troubling-facial-recognition-company-clearview-ai-now-what-237231>. For example, in 2021, Australia’s privacy regulator ruled that Clearview AI had breached privacy laws by using images from social media sites to train its facial recognition tool without individuals’ consent.

  96. ‘Facial Recognition Technology: Towards a Model Law’, University of Technology Sydney (Web Page, 23 September 2022) <https://www.uts.edu.au/human-technology-institute/projects/facial-recognition-technology-towards-model-law>. For example, the Human Technology Institute has proposed a model law for facial recognition in Australia.

  97. Victoria Police, Victoria Police Artificial Intelligence Ethics Framework (Policy, Victoria Police, 23 April 2024) <https://www.police.vic.gov.au/victoria-police-artificial-intelligence-ethics-framework>.

  98. Ibid. See also Victoria Police, Our Values – About Victoria Police (Web Page, 15 August 2024) <https://www.police.vic.gov.au/about-victoria-police>; Victoria Police, Victoria Police – Code of Conduct (Report, Victoria Police, February 2022) <https://www.police.vic.gov.au/sites/default/files/2022-02/Code%20of%20Conduct%20-%20Accessible%20Version.pdf>; Charter of Human Rights and Responsibilities Act 2006 (Vic).

  99. Australia New Zealand Policing Advisory Agency (ANZPAA), Australia New Zealand Police Artificial Intelligence Principles (Report, 14 July 2023) <https://www.anzpaa.org.au/resources/publications/australia-new-zealand-police-artificial-intelligence-principles>.

  100. Karin Derkley, ‘Digital Discovery’ (2024) 98(9) Law Institute Journal 11 <https://www.liv.asn.au/Web/Law_Institute_Journal_and_News/Web/LIJ/Year/2024/09September/Digital_discovery.aspx?_zs=LK4dl&_zl=eHo43>.

  101. Ibid.

  102. Ibid.

  103. For example, the Genie AI tool provides witness statement templates: Genie AI, ‘Witness Statement Templates – UK’ (Web Page, 2022) <https://www.genieai.co/document-types/witness-statement>.

  104. ‘Artificial Intelligence’, NSW Police Force (Web Page) <https://www.police.nsw.gov.au/about_us/research_with_nsw_police_force/research_themes/artificial_intelligence>.

  105. Rashid Minhas, Camilla Elphick and Julia Shaw, ‘Protecting Victim and Witness Statement: Examining the Effectiveness of a Chatbot That Uses Artificial Intelligence and a Cognitive Interview’ (2022) 37(1) AI & SOCIETY 265 <https://link.springer.com/10.1007/s00146-021-01165-5>.

  106. Victoria Turk, ‘This Bot for Workplace Harassment Takes the Bias out of Reporting’, WIRED (online, 9 October 2018) <https://www.wired.com/story/julia-shaw-spot-ai-workplace-harassment-reporting-startup/>; Julia Shaw, ‘How Reliable Are Witness Statements and Can AI Help Improve Them?’, Communities – The Law Society (Web Page, 5 February 2019) <https://communities.lawsociety.org.uk/features-and-comment/how-reliable-are-witness-statements-and-can-ai-help-improve-them/5066850.article>.

  107. State of Victoria, Victorian Regulators: An Overview (Report, Better Regulation Victoria, November 2021) <https://www.vic.gov.au/sites/default/files/2022-02/Victorian%20Regulators%202021%20Final%20-%20November%202021.pdf>.

  108. Environment Protection Act 2017 (Vic). The civil penalty provisions are listed in the table at section 314(3).

  109. Ibid. For example, s 27 carries a penalty of 4,000 penalty units or 5 years imprisonment.

  110. Environment Protection Authority Victoria, Strategic Plan 2022–27 (Report, 2022) 5 <https://online.flippingbook.com/view/122585736/5/#zoom=true>.

  111. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration, December 2023) 22; Peter Cashman and Eliza Ginnivan, ‘Digital Justice: Online Resolution of Minor Civil Disputes and the Use of Digital Technology in Complex Litigation and Class Actions’ (2019) 19 Macquarie Law Journal 39 <https://search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=a9h&AN=141372967&site=ehost-live&scope=site&custid=ns215746>.

  112. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration Incorporated, December 2023) 22.

  113. Peter Cashman and Eliza Ginnivan, ‘Digital Justice: Online Resolution of Minor Civil Disputes and the Use of Digital Technology in Complex Litigation and Class Actions’ (2019) 19 Macquarie Law Journal 39, 41 <https://search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=a9h&AN=141372967&site=ehost-live&scope=site&custid=ns215746>.

  114. Ibid 42.

  115. Vicki Waye et al, ‘Maximising the Pivot to Online Courts: Digital Transformation, Not Mere Digitisation’ (2021) 30(3) Journal of Judicial Administration 126, 127.

  116. Ibid.

  117. Michael Legg, The Future of Dispute Resolution: Online ADR and Online Courts (University of New South Wales Faculty of Law Research Series No 71, 2016) 3 <https://www8.austlii.edu.au/cgi-bin/viewdoc/au/journals/UNSWLRS/2016/71.html>.

  118. amica and National Legal Aid, ‘The Simple, Low Cost, Smart Way to Separate or Divorce Online’, Amica (Web Page)

    <https://amica.gov.au/>.

  119. Australian Government, ‘Amica – An Online Dispute Resolution Tool’, Attorney-General’s Department (Web Page)

    <https://www.ag.gov.au/families-and-marriage/families/family-law-system/amica-online-dispute-resolution-tool>.

  120. Vicki Waye et al, ‘Maximising the Pivot to Online Courts: Digital Transformation, Not Mere Digitisation’ (2021) 30(3) Journal of Judicial Administration 126, 134.

  121. The Civil Resolution Tribunal (CRT) was Canada’s first online tribunal, providing ‘end-to-end’ virtual resolution of small claims disputes. Solution Explorer is the AI tool that supports CRT users as a first step, before a hearing even arises: British Columbia Civil Resolution Tribunal, ‘Solution Explorer’ (Web Page) <https://civilresolutionbc.ca/solution-explorer/>.

  122. Vicki Waye et al, ‘Maximising the Pivot to Online Courts: Digital Transformation, Not Mere Digitisation’ (2021) 30(3) Journal of Judicial Administration 126, 145; Shannon Salter, ‘Online Dispute Resolution and Justice System Integration: British Columbia’s Civil Resolution Tribunal’ (2017) 34(1) Windsor Yearbook of Access to Justice 112, 125 <https://wyaj.uwindsor.ca/index.php/wyaj/article/view/5008>.

  123. Vicki Waye et al, ‘Maximising the Pivot to Online Courts: Digital Transformation, Not Mere Digitisation’ (2021) 30(3) Journal of Judicial Administration 126, 145.

  124. Shannon Salter, ‘Online Dispute Resolution and Justice System Integration: British Columbia’s Civil Resolution Tribunal’ (2017) 34(1) Windsor Yearbook of Access to Justice 112, 121 <https://wyaj.uwindsor.ca/index.php/wyaj/article/view/5008>.

  125. Justice Aedit Abdullah, ‘Technology as a Bridge to Justice’ (Speech, Singapore Courts – Conversations with the Community, 30 May 2024) <https://www.judiciary.gov.sg/news-and-resources/news/news-details/justice-aedit-abdullah–speech-delivered-at-conversations-with-the-community-on-30-may-2024>. See also Chief Justice Sundaresh Menon, ‘Legal Systems in a Digital Age: Pursuing the Next Frontier’ (Speech, 3rd Annual France-Singapore Symposium on Law and Business, 11 May 2023)

    <https://www.judiciary.gov.sg/news-and-resources/news/news-details/chief-justice-sundaresh-menon-speech-delivered-at-3rd-annual-france-singapore-symposium-on-law-and-business-in-paris-france>.

  126. Lee Li Ying, ‘Small Claims Tribunals to Roll out AI Program to Guide Users through Legal Processes’, The Straits Times (online, 27 September 2023) <https://www.straitstimes.com/singapore/small-claims-tribunal-to-roll-out-ai-program-to-guide-users-through-legal-processes>.

  127. Simon Chesterman, Lyria Bennett Moses and Ugo Pagallo, ‘All Rise for the Honourable Robot Judge? Using Artificial Intelligence to Regulate AI’ [2023] Technology and Regulation 45 <https://techreg.org/article/view/17979>; Straton Papagianneas and Nino Junius, ‘Fairness and Justice through Automation in China’s Smart Courts’ (2023) 51 Computer Law & Security Review 105897 <https://www.sciencedirect.com/science/article/pii/S0267364923001073>; Changqing Shi, Tania Sourdin and Bin Li, ‘The Smart Court – A New Pathway to Justice in China?’ (2021) 12(1) International Journal for Court Administration 1 <https://iacajournal.org/articles/10.36745/ijca.367>; Alison (Lu) Xu, ‘Chinese Judicial Justice on the Cloud: A Future Call or a Pandora’s Box? An Analysis of the “Intelligent Court System” of China’ (2021) 26(1) Information & Communications Technology Law 59 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2918513>.

  128. Kyle Denning, ‘Fully Online Civil Proceedings: Risks, Rewards and the Rule of Law’ (2024) 98(3) Australian Law Journal 210, 212–3 <https://search.informit.org/doi/10.3316/informit.T2024031700010001290470374>. A Victorian example is the Online Magistrates’ Court, which applies across criminal, civil and specialist jurisdictions, enabling matters to be heard with parties appearing from remote locations: Magistrates’ Court of Victoria, Online Magistrates’ Court: Information about Online Hearings for Practitioners (Web Page, 6 October 2022) 15, 19 <https://www.mcv.vic.gov.au/lawyers/online-magistrates-court>.

  129. Tania Sourdin, ‘Technology and Judges in Australia’ (2023) 97 Australian Law Journal 636, 637.

  130. NSW Government, ‘NSW Online Court’, NSW Government, Communities and Justice (Web Page, 3 December 2023) <https://courts.nsw.gov.au/courts-and-tribunals/going-to-court/online-services/online-court.html>; Felicity Parkhill and Melissa Fenton, ‘Watch Your Inbox!: Online Court Has Arrived’ (February 2019) (52) Law Society Journal 90 <https://search.informit.org/doi/epdf/10.3316/agis.20190226007215>.

  131. As recognised by UNESCO, AI and the Rule of Law: Capacity Building for Judicial Systems (Web Page) <https://www.unesco.org/en/artificial-intelligence/rule-law/mooc-judges>. See also Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration, December 2023) 32–34.

  132. The Rt. Hon Lord Justice Birss, Deputy Head of Civil Justice, ‘Future Visions of Justice’ (Speech, King’s College London Law School, 18 March 2024) <https://www.judiciary.uk/speech-by-the-deputy-head-of-civil-justice-future-visions-of-justice/>.

  133. For example, see iCrowdNewswire, ‘CaseMark Unveils Revolutionary AI-Assisted Page-Line Deposition Summary Workflow: Transforming Legal Analysis with Unmatched Efficiency and Precision’ (Web Page, 13 February 2024) <https://icrowdnewswire.com/2024/02/13/casemark-unveils-revolutionary-ai-assisted-page-line-deposition-summary-workflow-transforming-legal-analysis-with-unmatched-efficiency-and-precision/>.

  134. Felicity Bell et al, AI Decision-Making and the Courts: A Guide for Judges, Tribunal Members and Court Administrators (Report, Australasian Institute of Judicial Administration, December 2023) 28–31.

  135. State of Wisconsin v Loomis 371 Wis.2d 235 (2016), [16]–[20]. Notably, the judgment stated that COMPAS scores could not be used to determine whether an offender is to be incarcerated or to determine the length of a sentence: [18].

  136. Ibid [20].

  137. ‘Recent Cases: State v. Loomis’ (2017) 130(5) Harvard Law Review 1530 <https://harvardlawreview.org/print/vol-130/state-v-loomis/>.

  138. State of Wisconsin v Loomis 371 Wis.2d 235 (2016), [14].

  139. Julia Angwin et al, ‘Machine Bias’, ProPublica (online, 23 May 2016) <https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing>; Jeff Larson et al, ‘How We Analyzed the COMPAS Recidivism Algorithm’, ProPublica (online, 23 May 2016) <https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm>.

  140. Julia Angwin et al, ‘Machine Bias’, ProPublica (online, 23 May 2016) <https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing>; Jeff Larson et al, ‘How We Analyzed the COMPAS Recidivism Algorithm’, ProPublica (online, 23 May 2016) <https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm>.

  141. Ben Green, ‘The Flaws of Policies Requiring Human Oversight of Government Algorithms’ (2022) 45 Computer Law & Security Review 105681, 8 <https://www.sciencedirect.com/science/article/pii/S0267364922000292>.

  142. Ibid 2–3.

  143. Harriet Green, ‘Consciousness over Code: How Judicial Review Can Address Algorithmic Decision-Making in Policing’ (2024) 5(8) York Law Review 8, 17–19.

  144. Ibid 18–19.

  145. ANI, ‘In a First, Punjab and Haryana High Court Uses Chat GPT to Decide Bail Plea’, The Times of India (online, 28 March 2023) <https://timesofindia.indiatimes.com/india/in-a-first-punjab-and-haryana-high-court-uses-chat-gpt-for-deciding-upon-bail-plea/articleshow/99070238.cms>.

  146. Kimberlee Weatherall et al, Automated Decision-Making in New South Wales: Mapping and Analysis of the Use of ADM Systems by State and Local Governments (Research Report, ARC Centre of Excellence on Automated Decision-Making and Society (ADM+S), March 2024) 21 <https://apo.org.au/node/325901>. In total, 206 departments and agencies were surveyed, with 77 responding.

  147. Ibid 23.

  148. Ibid.

  149. Ibid 26.

  150. Regulation (EU) 2024/1689 (Artificial Intelligence Act) [2024] OJ L 2024/1689, Recital (61).

  151. Ibid.

  152. Tania Sourdin, ‘Judge v Robot? Artificial Intelligence and Judicial Decision-Making’ (2018) 41(4) University of New South Wales Law Journal 1114, 1118 <https://www.unswlawjournal.unsw.edu.au/article/judge-v-robot-artificial-intelligence-and-judicial-decision-making/>.

  153. Ibid 1119.

  154. Changqing Shi, Tania Sourdin and Bin Li, ‘The Smart Court – A New Pathway to Justice in China?’ (2021) 12(1) International Journal for Court Administration 1, 9–10 <https://iacajournal.org/articles/10.36745/ijca.367>.

  155. Ibid.

  156. Maria-Elisa Tuulik, ‘Estonia Does Not Develop AI Judge’, Republic of Estonia, Ministry of Justice (Web Page, 16 February 2022) <https://www.just.ee/en/news/estonia-does-not-develop-ai-judge>.

  157. Republic of Estonia, Ministry of Justice, ‘AI Solutions’, RIK: Centre of Registers and Information Systems (Web Page)

    <https://www.rik.ee/en/international/ai-solutions>.

  158. Federal Court of Canada, Federal Court Strategic Plan 2020–2025 (Report, 15 July 2020) 16 <https://www.fct-cf.gc.ca/content/assets/pdf/base/2020-07-15%20Strategic%20Plan%202020-2025.pdf>.