It’s nothing but a paradox. We often talk about the justice system as an antidote to human failings - the duty of the state to punish perpetrators of crime who give in to intense passions of jealousy, deceit and malice. But what if this very system of justice is unable to maintain fairness and reeks of another human failing – bias? What if the system, to begin with, is not even diverse enough to represent society as a whole? It is undeniable that as humans we are imbued with the prejudices that abound in our society. The manifestation of these biases in legal practice, judicial decision-making and policing can shake the very foundation of the justice system, which rests firmly on public confidence. This essay argues that Artificial Intelligence (AI), as a possible remedy for these ills, can make the legal profession more diverse.
The magnitude of the problem can be illustrated with a few examples. The first concerns witness misidentification. A well-known report titled “What the Brain Saw” revealed some disquieting findings. When subjects were shown an image of two Caucasian men fighting, they could correctly recall which man was holding the knife. But when subjects were shown another image in which a Caucasian man was armed and an African-American man was unarmed, a majority of them misremembered the African-American man as the one holding the knife.
Secondly, judges have been shown to be at high risk of being swayed by the “gambler’s fallacy” – the belief that when a series of trials produces the same outcome, an opposite outcome must soon follow. Under its influence, a judge might hand down a harsher punishment to a defendant after a series of lenient sentences, simply to break the chain. As if that were not enough, studies have also found that judges’ moods, and hence their rulings, can depend on factors as arbitrary as the outcome of a football match or the temperature of the courtroom.
The reality of law firms is no different. According to a report by The American Lawyer, the number of minority attorneys grew by an average of 0.9% per year. Another report, “Walking Out the Door” by the American Bar Association, paints a similar picture: 45-50% of law school graduates are women. The obvious question, then, is that if women and minority groups are gaining greater representation, perhaps the reality is not as grim as we are made to believe. But here’s the catch. These same reports find that there has been only a 3.9% increase in minority attorneys at the largest law firms, and that women make up a meagre 20% of law firm equity partners. Another daunting finding, borne out consistently by these reports, is that women and minorities leave big law firms at a far greater rate than their peers. But why are we unable to retain diversity at the top? Why is the retention of minorities still such a bleak prospect? These questions may seem perplexing, but the answer to them is straightforward: biases. The same biases that ultimately show up as social inequalities in our society and plague our justice system by denying equal access to justice.
These reports show that biases need not be explicit. It is the implicit, elusive ones that pose the greater threat. Consequently, there has been a growing demand for a system that is fairer, more equitable and better equipped to make objective decisions. AI ticks all of these boxes. Using a judge’s identity, past record and the relevant details of the defendant, predictive models can forecast how the judge is likely to decide a case, and can then alert the judge whenever he appears to be relying on heuristics rather than the facts before him. Lawyers, law firms, police and eyewitnesses can be assisted in a similar way. AI can also do wonders in the hiring process: by detecting patterns that exhibit bias and by screening out any information related to race, gender or ethnicity, as sketched below, it can make our legal workplaces significantly more diverse.
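How such screening might look in practice is easy to illustrate. The short Python sketch below simply drops protected attributes from a hypothetical applicant file before any screening model ever sees them; the file name and column names are assumptions for illustration, and a real “blind recruitment” pipeline would also have to guard against proxy variables such as postcode or school name.

```python
import pandas as pd

# Hypothetical column names for protected or identifying attributes (illustrative only).
PROTECTED_COLUMNS = ["race", "gender", "ethnicity", "name", "date_of_birth"]

def screen_applicants(path: str) -> pd.DataFrame:
    """Return applicant records with protected attributes removed,
    so that a downstream screening model never sees them."""
    applicants = pd.read_csv(path)
    present = [col for col in PROTECTED_COLUMNS if col in applicants.columns]
    return applicants.drop(columns=present)

if __name__ == "__main__":
    # "applicants.csv" is a hypothetical input file.
    blind_pool = screen_applicants("applicants.csv")
    print(blind_pool.columns.tolist())
```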
It is telling that when a character in a Bollywood cult film made a passionate speech about how courts keep handing out new dates for proceedings instead of actually deciding the case at hand, his words resonated with all of us – probably because we have all been victims of delayed court cases at some point in our lives. In India, a whopping 4.4 crore cases are pending across all levels of courts. Indefinite delay in justice erodes public confidence in the judiciary, denies justice to the poor and increases the misery of undertrials. These facts behove us to rethink our current judicial practices and incorporate the latest technological advancements that expedite justice and reduce the workload of judges. It is here that Artificial Intelligence fits the bill. In fact, applications like SUVAS and SUPACE are already being used by the Supreme Court to deliver speedy justice. SUVAS, which stands for Supreme Court Vidhik Anuvaad Software, is a translation tool that helps render judicial documents from English into 7 vernacular languages. The Supreme Court Portal for Assistance in Courts Efficiency, or SUPACE, as the name suggests, is another tool that helps in reading case files, conducting legal research and extracting the most relevant information.
Opponents of AI often suggest that since the problem at hand stems from a very low judge-to-case ratio, it should be tackled by increasing the number of judges rather than spending an astronomical amount on technology. But this is not a viable option. According to the Economic Survey of 2018-19, clearing the backlog of pending cases by 2024 would require going beyond merely filling the sanctioned strength of judges – and as it is, our courts are 20 percent short of the sanctioned numbers. Moreover, Lok Adalats, which were primarily set up to reduce the courts’ workload, are not a reliable alternative either. There is a growing concern that factors like unnecessary delay and mounting expenses deter people from challenging the decisions of such courts, making it hard to ascertain whether people have voluntarily accepted a ruling.
It is true that Artificial Intelligence is only as good as its input data. It is, however, far more flexible than human decision-makers: it can be updated regularly as flaws are detected, takes no offence when called out for having prejudices and, most importantly, is not afraid to highlight biases if it finds any in the underlying data. Tools like IBM Watson OpenScale, AI Fairness 360 and many others can also be used to detect and minimize such unfairness.
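To give a flavour of how such a toolkit works, the sketch below uses the open-source AI Fairness 360 library to measure disparate impact on a small, made-up hiring dataset; the data and the 0.8 threshold used to flag concern (the common “four-fifths rule”) are illustrative assumptions, not findings from the reports cited above.

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Made-up hiring outcomes: hired = 1/0; gender = 1 marks the privileged group.
df = pd.DataFrame({
    "gender":     [1, 1, 1, 1, 0, 0, 0, 0],
    "experience": [5, 2, 7, 3, 6, 2, 8, 4],
    "hired":      [1, 1, 1, 0, 1, 0, 0, 0],
})

# Wrap the frame in AIF360's dataset structure, marking the protected attribute.
dataset = BinaryLabelDataset(
    df=df,
    label_names=["hired"],
    protected_attribute_names=["gender"],
    favorable_label=1,
    unfavorable_label=0,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{"gender": 1}],
    unprivileged_groups=[{"gender": 0}],
)

# Disparate impact = selection rate of the unprivileged group / privileged group.
di = metric.disparate_impact()
print(f"Disparate impact: {di:.2f}")
if di < 0.8:  # illustrative threshold, not a legal standard endorsed by the essay
    print("Warning: outcomes may be biased against the unprivileged group.")
```

A value well below 1 on such a metric is exactly the kind of signal that a law firm or court administration could use as a prompt to re-examine its processes.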
There is no doubt that AI can play a significant role in diversifying the legal profession, but it is equally important to understand the role diversity plays in improving AI systems. Diverse teams make bias detection easier, since the people who first spot a bias are most often members of the very community it affects. Working in tandem, these two forces can give our legal system the much-needed overhaul and transform it for the better.
Views are personal.