The biggest lesson to be learned from what has happened with technology in the recent past is that online tech needs to be subjected to close scrutiny. Data privacy and cybersecurity are obvious concerns: at least 29 million Facebook users were victims of data breaches, and 13.4 million files were leaked from law offices in the Paradise Papers leak. However, where legal tech is concerned, in my view, data security isn’t even our biggest problem. Instead, the larger threat posed to the integrity of our legal institutions is the construction of categories in code and AI’s reliance on those categories. Mechanistic and rigid modes of organizing data threaten to fundamentally undermine the case-by-case, nuanced decision-making that is foundational to the common law tradition.
Ontario lawyers need to pay close attention to how data is organized in legal tech, or the results could be catastrophic to the continuation of the common law tradition. In 2019, we are emerging into a moment where legal tech, analytics, and artificial intelligence are increasingly central to the practice of law and the administration of justice, just as serious concerns are being raised about the dangers of that technology. Emergent legal tech now provides mechanisms for more efficient practice, such as the secure ledger facilitated by blockchain, and for more effective, collaborative, mobile, and faster ways of doing legal work, such as AI-augmented legal research. The platforms enabling this are constructed through a labyrinthine matrix of details, where the operation of languages of code, and manipulations of algorithms, mix with the discourses of formal law. To protect the common law tradition, in the coming year and beyond, new levels of scrutiny should also be applied to how legal tech platforms organize, and make sense of, data.
While it is taking on new forms, reliance on technology in law itself is not new. Across sectors, we are now experiencing a rapid mobile technological revolution of disruptive innovation that is driving unprecedented change to professional fields and personal lives. The legal sector has lagged behind other fields in engaging in this mobile revolution, but disruptive mobile legal technologies are going mainstream in the legal profession.
Lawyers now depend upon online technologies to survive in practice in new and different ways. It is simply not a viable option to reject legal tech at this point. AI and online tech are part of our day-to-day practice. A central concern arises from the presence of increasing levels of privately designed and operated technologies in the public spaces of our social institutions of law, especially courts.
Regulatory gaps always exist because laws respond to advances in society and technology only after change has taken place. The current moment makes this problematic because the gaps widen as technology advances more rapidly. In every legal space that technology touches, there are quandaries and conundrums to be resolved and attended to. One important problem that our regulatory regimes need to work through is that the common law is founded on nuanced, individualized decision-making, while AI and online tech work with big data and mechanistic decision-making. Cautious and active engagement is the way forward to preserve the integrity of common law social institutions in the face of technologies that facilitate mechanistic decisions and actions.
At the same time that online and mobile tech is emerging into the mainstream of legal practice, so too are concerns about dystopic levels of surveillance. The privacy and data security concerns we have seen writ large in the context of the Google and Facebook/Cambridge Analytica scandals, together with the Panama and Paradise Papers, signal that close attention must be paid to the security of data in legal databases and legal tech apps. More notice should also be taken of how that data is collected and organized in the legal milieu.
We need to closely scrutinize not just how securely tech apps store data but also how databases categorize, define, describe, and code. The logical frameworks these technologies employ not only create online worlds; they will affect the physical world in important ways. In the tech-enabled world of online apps and artificial intelligence, definitions and the details of how code is written for legal tech are often addressed privately, by non-lawyers, in an opaque context within corporate tech companies. The languages and definitions upon which these platforms rely should be more closely engaged with in the public interest. In these online platforms, definitions and algorithms have powerful influences on how we understand each other.
Clearly, avoiding tech is no longer a viable option for lawyers. Further, an analog legal system is not a panacea. As Yuval Noah Harari wrote in 21 Lessons for the 21st Century, “The current technological and scientific revolution implies not that authentic individuals and authentic realities can be manipulated by algorithms and TV cameras, but rather that authenticity is a myth.”
Whether or not it is tech-enabled, we exist in a context of social systems and legal institutions produced by our definitions and descriptions. Just as we need to know the stories of the parties before the court, and to appreciate the facts in order to properly apply the law, we must be wary of how the calculations embedded in online algorithms — all of those ones and zeros, with their binary, black-and-white thinking — can oppressively undermine the individualized approach that is so foundational to the common law tradition. Humanistic discretion in case-by-case decision-making remains important. We must continue to recognize that maps are not the same thing as territory.
Rebecca Bromwich is a practising lawyer and legal academic. She teaches at Carleton University’s Department of Law and Legal Studies and is the recipient of a Law Foundation of Ontario research grant to study how legal tech is affecting access to justice in the family law context.