Artificial Intelligence in Victoria’s Courts and Tribunals: Consultation Paper

9. Support for effective use of principles and guidelines about AI

Overview

• In this section, we explore the importance of appropriate governance arrangements within courts and tribunals to effectively implement AI principles and/or guidelines.

• Robust and ongoing governance is important to ensure the safe and responsible use of AI by courts and tribunals.

• Judicial independence is important. Embedding governance arrangements within courts, rather than with external bodies, would help to maintain it.

Governance

9.1 If guidelines or principles on the safe use of AI in Victoria’s courts and tribunals are to be effective, appropriate governance is needed. Governance may include designating a person or people within an organisation with responsibility and accountability for implementing principles and guidelines, monitoring use and overseeing risk mitigation.

9.2 Governance arrangements should be ongoing. This will ensure appropriate and continual monitoring of AI applications in courts and tribunals. Principles and guidelines need to be monitored and revised as new technology emerges. This is particularly important if mandatory ‘guardrails’ proposed by the Australian Government are implemented.[1]

9.3 Governance is necessary to implement and manage the safe use of AI systems over the AI lifecycle. The ongoing management of AI systems is complex, as outlined in international standards,[2] which require appropriate procurement, periodic risk assessment and contract management.

9.4 The independence of courts is central to governance. If governments or external bodies had oversight of courts’ AI tools, this could unduly impact judicial independence. Tania Sourdin has described judicial officers as ‘guardians’ of the justice system and emphasised their role in guiding its design.[3] Embedding robust governance arrangements within courts would help to maintain judicial independence.

9.5 In some jurisdictions the judiciary has led the development of guidelines about AI in courts and tribunals. In New Zealand, the Office of the Chief Justice led development of the Digital Strategy for Courts and Tribunals.[4] The strategy outlines a governance framework focused on periodic review by the Heads of Bench.[5] In Singapore, the Supreme Court established the Court of the Future Taskforce which has led to the strategic implementation of automation and AI initiatives since 2017.[6]

9.6 Some Victorian courts and tribunals already have AI advisory groups, which may provide the basis for governance. If governance arrangements were applied across the Victorian court system, frameworks could be applied consistently. This would allow rule changes to be considered as a whole and for lessons to be shared across courts and tribunals. A coordinated approach would also assist in greater interoperability of AI systems across Victorian courts and jurisdictions.

9.7 An appropriate governance approach would need to be coordinated by organisations across the Victorian courts and tribunal system. We are interested in whether bodies such as the following have adequate organisational structures and capabilities and their potential role in the governance of AI in courts and tribunals:

• Court Services Victoria—provides services and facilities to Victoria’s courts, Victorian Civil and Administrative Tribunal, the Judicial College of Victoria and the Judicial Commission of Victoria.

• Courts Council—the governing body of Court Services Victoria. It includes the head of each Victorian court jurisdiction and has responsibility for directing strategy, governance and risk management.

• Judicial College of Victoria—provides education and professional development to the Victorian judiciary.

• Judicial Commission of Victoria—sets ethical standards for judicial officers and investigates complaints against judicial officers.

Questions

30. Are there appropriate governance structures in courts and tribunals to support safe use of AI?

31. What governance tools could be used to support the effective use of AI in courts and tribunals, such as:

a. an AI register for AI systems used in the justice system?

b. accreditation of AI systems?

32. Who should be responsible for developing and maintaining these systems?

Education and training

9.8 Education is important in raising awareness of guidelines and regulatory responses. The novelty and complexity of AI suggests that court staff and administrators, counsel and judicial decision makers, and the legal profession generally need extensive and ongoing education.

9.9 Education will be significant in developing a safe approach to the use of AI in courts and tribunals and in meeting the complexity and range of issues they will face in the future.

9.10 The Judicial College of Victoria could be given responsibility for developing and coordinating educational programs and materials for courts and tribunals, covering:

• understanding AI technologies and terminology, including basic AI concepts

• awareness of how AI is currently used in courts and the legal profession

• the risks and limitations of AI

• opportunities for using AI responsibly, and sharing best practice

• legal, regulatory or professional obligations.

9.11 Responsibility for educating the profession more broadly requires a coordinated approach. This could include education and training by professional bodies such as the Law Institute of Victoria, the Victorian Legal Services Board and Commissioner, the Victorian Bar and the Judicial College of Victoria.

9.12 There may also be opportunities to integrate AI education into the regulation and professional development requirements of legal practitioners.

Questions

33. How can education support the safe use of AI in courts and tribunals?

34. Are there opportunities to improve the current continuing professional development system for legal professionals in relation to AI?


  1. Department of Industry, Science and Resources (Cth), Safe and Responsible AI in Australia: Proposals Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings (Proposals Paper, September 2024) 2, 4.

  2. Standards Australia Limited, ‘AS ISO/IEC 42001:2023 Information Technology – Artificial Intelligence – Management System’ <https://www.standards.org.au/standards-catalogue/standard-details?designation=as-iso-iec-42001-2023>; National Institute of Standards and Technology (U.S.), Artificial Intelligence Risk Management Framework (AI RMF 1.0) (Report No NIST AI 100-1, 26 January 2023) <http://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.100-1.pdf>.

  3. Tania Sourdin, Judges, Technology and Artificial Intelligence: The Artificial Judge (Edward Elgar Publishing, 2021) 206.

  4. Office of the Chief Justice of New Zealand, Digital Strategy for Courts and Tribunals (Report, Courts of New Zealand, March 2023) <https://www.courtsofnz.govt.nz/assets/7-Publications/2-Reports/20230329-Digital-Strategy-Report.pdf>.

  5. Ibid 32.

  6. Supreme Court Singapore, A Future-Ready Judiciary: Supreme Court Annual Report 2017 (Report, 2017) 4 <https://www.judiciary.gov.sg/news-and-resources/publications/publication-details/supreme-court-annual-report-2017>.