Online Courts: identifying and evaluating the impact on access to justice

The Online Courts programme for England and Wales has, once again, come under informed criticism from a Parliamentary Committee. This time it is the Public Accounts Committee. Its damning criticisms echo those of the National Audit Office reported in May, and the headline comment was: ‘We have little confidence that HMCTS [Her Majesty’s Courts and Tribunals Service] can deliver this hugely ambitious programme’. The main objections are undue haste; lack of consultation; gaps in planned funding; and failure to understand the full implications of its reforms for itself and other institutions. But one major additional issue merits particularly urgent attention. HMCTS has been told to let the Public Accounts Committee know by January ‘how it will identify and evaluate the impact of changes on people’s access to, and the fairness of, the justice system, particularly in relation to those who are vulnerable.’

One stream of concern on access to justice is the well-aired issue of court closures. The report covers the example of Chichester, where court users are now being sent 80 miles away. HMCTS’s partial answer to this problem is the installation of video links – and we must await evaluation of these by a team at the London School of Economics.

However, the committee also reported (at paragraph 13): ‘When we asked the Ministry and HMCTS to explain what successfully transformed services would look like, they spoke in broad terms about the need to ensure that the justice system works better for the public. They were not able to be more concrete about how they would measure this or determine whether the programme had been a success. HMCTS cited examples of user satisfaction scores from digital services but not how it would measure how the reforms affected access to justice or the fairness with which it is administered. The published principles for reform cite the need for the courts and tribunals system to be proportionate, accessible and just. Neither HMCTS nor the Ministry could tell us how government intends to measure the extent to which this is the case.’

How helpful, therefore, that a previous contribution to this blog has suggested exactly how HMCTS might evaluate its reforms – and has had the benefit of external contributions, in particular from Darin Thompson of the Civil Resolution Tribunal in British Columbia. As a result, I would now update my earlier drafts and suggest three headings under which targets and key performance indicators should be set – conception, implementation and (this is new and thank you, Darin) commitment to continual improvement. Further useful comment on how the suggested indicators might be implemented was made by UCL’s Professor Richard Moorhead.

This basic list (to be supplemented by Richard’s comments on methodology) would then read as follows, with the first two categories unaltered from the original and the third a redraft based on Darin’s suggestions. We might note his general observation: ‘continuous improvement entrenches evidence-based decision making. Evidence in the form of analytics, business intelligence and user feedback should all combine to feed and inform the continuous improvement process. The age of anecdotes won’t stand a chance against it.’

Conception

1. Are there measurements available on the need for court and tribunal services by various elements of the population and how is it proposed that this digitalisation project will prove that it meets those needs?

2. Are there coherent plans for integrating digital court and tribunal services with existing advice and information provision and how will this be measured?

3. What provision will be in place to help those unable to use digital provision; how will its success be measured; and will users have the choice to use non-digital provision?

Implementation

1. How will individual users or their representatives, such as the major advice providers, be involved in the testing of provision; how will the project be divided up so that feedback can effectively be used; and does the planned timetable allow for that?

2. Are there satisfactory arrangements for the integration of representatives into the process?

3. Do the proposed digital procedures comply with the principles of procedural fairness and fair trial?

4. What is the target cost for user fees; what are the target numbers for each stage of the digital procedure; and by how much is it intended that the new procedures increase usage?

Commitment to continual improvement

1. Does the project design display sufficient commitment to the principles of continual improvement, ie:

  • never assuming that whatever is built is 100 per cent right;
  • not letting the initiative reach a static or stagnant state;
  • always looking for potential improvements based on user data, staff input and other sources of evidence;
  • acting on the evidence to implement changes and improvements;
  • collecting evidence to measure the impact of its improvement activities.

2. Is there provision in the forward budget and the planning process to facilitate these goals?

HMCTS’s response to the Public Accounts Committee was in its usual style – breezy optimism: ‘Today’s report highlights the ambitious and transformational nature of our reform programme. We will study the committee’s recommendations and respond in detail. Significant progress is being made to deliver the programme, including new digital services which have seen high take-up and satisfaction rates.’ It explicitly accepted the need to engage better with its stakeholders. There was no reference to the need for better access to justice indicators, but perhaps they will emerge in its response to the Public Accounts Committee in January.