I worked for the Law Society of England and Wales in the dead days as it abdicated – pretty shamefully – its regulatory function. Since then, it has struggled to find a role as a combination of representative and public interest institution. However, the publication this week of its paper on algorithms in the criminal justice system should do something to reassure its members that it has a valuable function in supporting the rule of law.
The report has three great values – which should be appreciated both internationally and domestically. First, it summarises the state of play in England and Wales of the use of artificial intelligence in the criminal justice system. Hitherto, much of the debate has taken place in the context, not entirely the same, of the United States. Second, it provides – particularly in chapter 5 – a sound discussion of the principles involved. Third, its recommendations are sufficiently precise that they could pretty easily be moulded into a draft bill which would be a possible next stage in the lobbying process for principled control of burgeoning algorithms.
Criminal justice algorithms have received much publicity in the wake of a case on which the US Supreme Court refused to take an appeal. This was State of Wisconsin v Eric Loomis. It is sometimes misrepresented, though not by the Society in its document, as a major setback in the regulation of algorithmic justice because the Supreme Court refused to consider a further appeal. To an English lawyer, actually, that may not have been a bad thing.
It is worth looking at the facts of the case because they illustrate some of the practical complexities. Mr Loomis was sentenced in the light of a report produced by a commercial proprietary product COMPAS – Correctional Offender Management Profiling for Alternative Sanctions. A research study by ProPublica suggested that this contained racial bias in terms of overestimating likely recidivism by black subjects. Mr Loomis’ case was, however, that, applied to him, the use of COMPAS was objectionable because it ‘violates a defendant’s right to due process, either because the proprietary nature of COMPAS prevents defendants from challenging the COMPAS assessment’s scientific validity, or because COMPAS assessments take gender into account’.
From a one-time litigator’s point of view, the Loomis case falls into the category of good argument, lousy facts – always a dangerous combination for test case litigation. And the uncomfortable (and, no doubt, technically irrelevant) facts have surely helped to curtail consideration of the legal issues involved to the extent that the Supreme Court’s indifference should not be taken – an English lawyer would hope – as the last word.
Mr Loomis ducked charges of perpetrating a drive-by shooting with a plea deal on lesser charges (fleeing a traffic officer and operating a motor vehicle without the owner’s consent): ‘The plea agreement stated that the other counts would be dismissed but read in’. He did not get the probation for which he hoped. The judge said, ‘You’re identified, through the COMPAS assessment, as an individual who is at high risk to the community. In terms of weighing the various factors, I’m ruling out probation because of the seriousness of the crime and because your history, your history on supervision, and the risk assessment tools that have been utilized, suggest that you’re extremely high risk to re-offend.’ He got six years – the maximum for the cases to which he pleaded but well below the maximum for the ones to which he did not.
The judge referred to the COMPAS assessment but explicitly gave additional reasons for his sentence. Mr Loomis had, after all, been returned to custody more than five times while on parole and had four new charges or arrests during probation. There was evidence that the COMPAS system is vulnerable to challenge on grounds of racial bias, but the argument on gender seems frankly a bit far-fetched – males and females do have different offending patterns.
What actually did for Mr Loomis more than COMPAS was probably the ‘read in’ offences which were taken into account in sentencing – ‘You’re agreeing,’ the court put it to him, ‘as the Supreme Court decision indicates, that the charges can be read in and considered, and that has the effect of increasing the likelihood, the likelihood of a higher sentence within the sentencing range. You understand that?’. So, Loomis is a bit of a diversion and, actually, the decision of the Wisconsin Supreme Court contains plenty of caution about algorithmic justice. The Supreme Court’s refusal to take the matter further should not, therefore, be interpreted as the end of the matter, even in the US.
The Law Society report calls for public disclosure of the algorithms developed for COMPAS. It wants a ‘National Register of Algorithmic Systems [to] be created as a crucial initial scaffold for further openness, cross-sector learning and scrutiny.’ This seems desirable and a reasonable trade-off for any private entity having access to the public realm. It is true that, given the nature of machine learning, this might not add that much. But, ‘it is key that in-house capacity is built and retained for overseeing and steering these systems, and that coordination occurs across the justice system to ensure this capacity is world-class.’
Overall, Mr Loomis cannot have been that surprised by the sentence that he received and it would seem, on such facts as one can see from the judgement, pretty much what he might have expected whatever COMPAS said. However, it is foreseeable that judges may be tempted to be unduly influenced by a COMPAS-style recommendation. That is why the Society argues for a statutory amendment that requires that there be ‘meaningful human intervention’ between an automated prediction and a decision.
The Society’s paper produces a nice automated map of England and Wales showing where algorithms are being used in the criminal justice system. In her speech to the Law Society’s launch conference, President Christina Blacklaws set out the wide areas in criminal justice where algorithms were being deployed in different ways and to different standards: ‘current applications encompass: photographic and video analysis, including facial recognition; DNA profiling; individual risk assessment and prediction; predictive crime mapping; mobile phone data extraction tools; data mining and social media intelligence’. She pointed to the pressure from budget cuts to deploy such technologies. Crucially, there was ‘a lack of explicit standards and a lawful basis’, so that ‘algorithms are not being critically assessed’, with consequent ‘risks to the justice system and the rule of law’.
The Society’s recommendations are not to halt but to regulate the use of algorithms. ‘Firstly, there is a need for a range of new mechanisms and institutional arrangements to improve the oversight of algorithms used in the justice system … Secondly, we recommend the clarification and strengthening of protections relating to algorithms … Thirdly, consideration must also be given to the procurement of algorithmic systems to ensure that at all stages they are subject to appropriate control, and that due consideration is given to human rights concerns … Fourthly, our report also makes clear that all algorithms used in the justice system must have a clear and explicit basis in law … And finally, significant investment is needed to allow public bodies to develop the in-house capabilities necessary to understand, scrutinise and coordinate the appropriate use of algorithms.’
Ms Blacklaws’ final conclusion was that ‘We believe that the United Kingdom has a window of opportunity to become a beacon for a justice system that is trusted to use technology well, with a social licence to operate, and in line with the values and human rights underpinning criminal justice. It must take proactive steps to seize that window now.’ In the past, such demands might have been deflected through referral to the European Union. Brexit, should it ever happen, would allow us to go it alone. How refreshing if we could become such a beacon.
Well done, the Law Society, for this report, which it should seek to push further; creating a draft bill would be one way of bringing out the detail. This is exactly what a responsible legal professional body should be doing in terms of using the expertise of its members on a politically and constitutionally important topic. Let’s hope the members agree – they were pretty alienated when I was an employee.