‘Ethical’ artificial intelligence no silver bullet to due process issues, says panel

The Data & Design Symposium was put on by The Action Group on Access to Justice

While so-called ethical AI is often touted as a solution to the issues posed by technology, fairness is “context specific” and cannot be generalized between forums like criminal courts and administrative tribunals, said the Law Commission of Ontario’s Nye Thomas.

“You don’t have to be on the Supreme Court to know that technology raises numerous due process issues,” said Thomas, the LCO’s executive director, who spoke at an event at the Law Society of Ontario on Wednesday.

The session where Thomas appeared also featured Linda Rothstein, chair of the Law Foundation of Ontario; CLEO executive director Julie Mathews; Suzanne Stein of OCAD University; and Meredith Brown of Calibrate Solutions.

Bail is the most common area of AI integration in the U.S., where such tools are used in jurisdictions housing more than a quarter of the population, said Thomas as part of the Data & Design Symposium put on by The Action Group on Access to Justice. The technology — used to measure risk factors such as re-offending or failure to appear — has been controversial.

“Criminal justice data has got generations of historic bias — systemic bias — against African Americans and other communities in the U.S.,” said Thomas. Some people argue that such technologies should not be used at all, said Thomas, because “if the data is inherently discriminatory it means the outcome is inevitably discriminatory.”

However, the reality is not so simple, Thomas said. Such systems were initially praised across the criminal justice system as a way to reform the bail process away from money bail and toward evidence-based pre-trial risk assessments, he noted.

“Not all data is discriminatory, but all data is biased. Data is not neutral,” he said.

Although Canada does not have money bail, it’s fair to ask whether and how the discrimination issues could be surmounted, said Thomas. To start, people procuring such technology for use in Canada should look at whether a private manufacturer will offer the same level of disclosure as a public manufacturer. Sometimes, he said, the use of the technology will only come to light through Freedom of Information requests, press reports or litigation, and private companies take a “black box” approach.

Thomas also noted that while data can have a “transfixing” quality, it’s important that decision makers don’t allow technology-based predictions to become policies without considering the full context of each accused person.

Bail is not the only area where technology is impacting the practice of law, with artificial intelligence seeping into legal information, advice and research as well as e-discovery, smart contracts, facial recognition and predictive policing, said Thomas. But he said the bail system is notable because it represents a growing trend toward using technology for decision-making.

CLEO has been working on “Steps to Justice,” which provides digital tools for people facing legal issues and can be shared through buttons and content on other websites as well. The project is constantly adding high-quality legal resources for groups such as Indigenous people charged with a crime and victims of crime, and on topics such as immigration and refugee problems and income assistance.

“Steps to Justice” also uses “guided pathways” to help people fill out forms for family law and housing law issues.

Mathews said at the event that the Steps to Justice website has received two million visits, and that it has helped visitors by tracking which devices they use and how most of them reach the “answer page.”

Stein, like Thomas, noted that the access to justice conversation must move to include technologists who may not have previously heard the issue framed in the same way. The Super Ordinary Lab within OCAD University has previously explored legal issues through a project called “Domestic Abuse and the Law: Confronting Systemic Impacts.” The project looked at ideas such as giving women cell phones linked directly to 911.

Now, Calibrate Solutions and OCAD are working with the Law Foundation on a survey on “the opportunities and challenges facing legal tech entrepreneurs in access to justice.”

Access to justice, said Stein’s handout at the close of her talk, is “still waiting for disruption.”

“Legal technologies today are disrupting law firms and how they work, creating efficiencies, new services and new ways of working,” the survey website said. “However, Canada is largely still waiting to see this innovation address the deep access to justice challenges of people facing everyday legal problems.”
