Court-based ODR: the need for an access to justice audit

Followers of the online court proposals in England and Wales will realise that the proposals are currently facing some domestic concern. A National Audit Office (NAO) report has been critical. The Infrastructure and Projects Authority has warned. The House of Commons Public Accounts Committee is inquiring. But all these are little local difficulties with marginal relevance internationally, except perhaps as warnings to other jurisdictions against hubris and undue haste. A more universal – and, in a way, altogether more interesting – issue is, however, emerging. Court-integrated ODR raises specific access to justice issues. Success cannot be measured by savings figures alone. The Government makes claims for increasing access to justice through digitalisation of the courts, but it has been shy of setting explicit targets and spelling out objective, measurable gains. How might this be done?

In truth, the Ministry of Justice has been a little evasive about admitting this access to justice role of courts and tribunals. However, it gets pretty close. It expressly accepts that access to justice is a fundamental principle underlying its linked court closure and court modernisation programme (and uses it as a criterion in consultations on court closure). Access to justice is implicit in the ‘reform principles’ underlying the court modernisation programme, and the first two were explicitly quoted by Lord Justice Briggs in the Interim report that opened the modernisation process. They are:

‘A model built around the needs of those who use it (citizens, business users, visitors and overseas users, victims, witnesses and state users).

A system which is accessible – easy to use, digital by design and default and well supported for non-digital users.’

Let’s take these larger goals as including a commitment to a justice system accessible to all users – the provision of access to justice. 

Tellingly, the NAO report opens with a summary of ‘key facts’ without a single mention of access to justice among them. Its key facts are financial. They are: £265m of expected savings from 2023-24 onward; a 5,000 reduction in staff in the same year; and a planned reduction of 2.4m court hearings each year. What screams out from these figures is the absence of any key fact at all that relates to measuring performance against the needs of potential users. And, as has been drilled into managers for generations and taken to heart by many governments to the point of tyranny: ‘You can’t manage what you don’t measure.’ This, in turn, has spawned a number of variants – among them ‘What you don’t measure doesn’t count’ – something that will prove all too true to anyone who has worked within a modern management structure.

The NAO is primarily concerned with financial coherence, so that might excuse its selective vision. But its key facts are insufficient overall. We need some way of summarising predicted access to justice outcomes. This brings us to the point of this post: how do we set the criteria for the access to justice element of a comprehensive audit of a proposed court modernisation programme?

Here are my suggestions to get the ball rolling. These four aims would provide my overall structure. 

First, the goals must conform with modern management thinking and be SMART – specific, measurable, achievable, relevant and targeted. Something aspirational but unmeasurable is useless. 

Second, the goals can’t be endless. The optimum number is probably something like ten in all. We are aiming for something easily applicable. These are intended as management tools, not an academic research agenda.

Third, they need to be capable of being summarised as coherently as the NAO has put the financial goals, so that they can sit alongside them. We are looking for measures which can be summarised in a form no longer than this example – ‘the resolution of 100,000 cases at £25 a head within an average of three months for adequately satisfied users representative of the population as a whole’.

Fourth, the audit goals need to incorporate a process of, in effect, reverse engineering the thinking that has already been done about the need for after-the-event research – expressed, for example, by Professor Hazel Genn in a recent speech or by the Public Law Project in a recent publication. Professor Genn had nine points; the PLP thirty. So this is going to be a challenge.

We might divide the prospective audit into three parts. We need to be able to interrogate a court digital project’s conception; its practical implementation; and its monitoring. If you accept an overall practical limit of ten questions, then each section gets about three. That implies one limitation. A further limitation comes from the fact that, in many jurisdictions, we actually know very little about existing use of the courts and may also lack any means of calculating need. We will have to do the best we can. Here are my starters under the three headings.

Conception

1. Are there measurements available on the need for court and tribunal services by various elements of the population and how is it proposed that this digitalisation project will prove that it meets those needs?

2. Are there coherent plans for integrating digital court and tribunal services with existing advice and information provision and how will this be measured?

3. What provision will be in place to help those unable to use digital provision; how will its success be measured; and will users have the choice to use non-digital provision?

Implementation

1. How will individual users, or their representatives such as the major advice providers, be involved in the testing of provision; how will the project be divided up so that feedback can effectively be used; and does the planned timetable allow for that?

2. Are there satisfactory arrangements for the integration of representatives into the process?

3. Do the proposed digital procedures comply with the principles of procedural fairness and fair trial?

4. What is the target cost for user fees; what are the target numbers for each stage of the digital procedure; and by how much is it intended that the new procedures will increase usage?

Monitoring

1. Are there satisfactory ways of monitoring the experience and satisfaction of users and the quality of procedures?

2. Are satisfactory provisions in place for the oversight, supervision and research of digital procedures?

3. How will the new arrangements be publicly reported upon?

I have spent a lifetime putting up ideas which are shot down by others as inadequate. So I don’t mind if you think that these are rubbish, but you have to say why and what would be better. I am, however, unrepentant in believing that we would benefit from an international debate on ideas that challenge the dominance of savings as the goal of what is a major reform of the courts and, thereby, the constitution. These ideas can be refined in subsequent discussion, and you can contribute by tweet to @lawtech_a2j or by email to rsmith@rogersmith.info, for publication or not. It will be interesting to see how this goes.