Back to the Engine Room: Eight Lessons on Implementation

The recently published Engine Room report has some helpful content on the implementation of tech projects in an access to justice context. If you are involved in this field, you need to read it yourself. This is a summary of eight lessons inspired by its content. The Engine Room analysis is mainly in quotes; much of what is not is mine.

Lesson 1: technology, like therapy, can transform you, not just your services

Properly executed technology projects have the potential to transform not only service delivery but service delivery organisations: ‘Developing and implementing a technology-enabled legal empowerment project can have a profound effect on the way an organisation works. Thorough user testing and research can help organisations to rethink how to support people dealing with legal problems, and encourage a new, more collaborative way of working – both within legal empowerment-focused organisations, and with people themselves.’ This is the promise of the legal design movement, which is not really specific to technology but is encouraged by the reorientation that it requires. Focus your provision laser-like on the needs of your users and, perhaps unsurprisingly, you may be jolted out of the traditional lawyer-expert paradigm.

Lesson 2: define your problem; do your user research; state your objectives and predict specific targets

Nothing inherently technological about all this either. It is a classic all-purpose management orthodoxy: ‘Clearly defining the problem that needs to be solved and then assessing how technology could help was an important strategy for many initiatives. In this way, organisations can use technology to supplement – rather than substitute – their existing in-person support.’ 

User research is a process that involves several stages: defining users, assessing their use of technology, creating content, testing prototypes, and conducting continued testing and iteration on the results. Without overdoing the point, the Engine Room indicates that this can take a substantial amount of time if done properly. It was often helpful to create dummy user profiles and personas to test provision. Research needs to include users’ patterns of technology use: ‘even initiatives with in-depth knowledge of their users’ legal needs said they did not fully understand their users’ ability to access technology until they had conducted in-depth user research. As an interviewee at [Brazilian project] Themis put it, based on their experience testing the app: “What works for some people in one place, won’t work for others in another place.”’
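The dummy user profiles the report describes can be as simple as a small data structure that drives testing. A minimal sketch – the fields and example values here are my own illustration, not taken from the report:

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """A dummy user profile for testing a legal-information service.
    All fields are illustrative assumptions, not from the Engine Room report."""
    label: str          # a label, not a real person
    legal_need: str     # the problem the service should address
    device: str         # how this user reaches the service
    reading_age: int    # target comprehension level
    connectivity: str   # data-access constraint

# Two deliberately contrasting personas: as the Themis interviewee put it,
# what works for some people in one place won't work for others in another.
personas = [
    Persona("A", "eviction notice", "shared smartphone", 9, "pay-as-you-go data"),
    Persona("B", "employment dispute", "library desktop", 13, "broadband, 1h slots"),
]

for p in personas:
    print(f"Test against {p.label}: {p.legal_need}, via {p.device}, "
          f"reading age {p.reading_age}, {p.connectivity}")
```

Running every prototype past each persona in turn is a cheap first pass before in-depth research with real users.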

A crucial addition to the Engine Room analysis is the need to specify goals which are, in management jargon, SMART – specific, measurable, achievable, relevant and targeted. Think of the guy who was set the objective of creating a 99p lightbulb for IKEA: he had to deliver it for about two thirds of the price at acceptable quality standards. That was his only performance indicator. No one should start on a technology (or other) project without setting similar specific goals against which their performance can be measured. And, despite understandable fears, failure to meet the project’s pre-stated target is by no means failure of the project itself. Indeed, classic management theory would be that you can learn more from mistakes than from successes.

Lesson 3: content is king and you must be clear

‘Creating and presenting clear, easily comprehensible content is key – the structure, style and tone of the legal information itself is as important as the technology used to deliver legal information. Many interviewees described repeatedly revising their content after learning more about the style and type of content that was needed.’

The report has some helpful tips on the creation of content. Some basic rules of good writing apply. I was taught to write by a fearsome woman called Hilary. She had some simple rules. You start each paragraph with a sentence summing up what follows; no sentence is ever more than 24 words; you always reverse the usual lawyers’ order of conditions first and substantive proposition second; you decide on a reading age for your readers and stick to it.
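Hilary’s 24-word rule is mechanical enough to check automatically over a draft. A rough sketch – the sentence-splitting heuristic is deliberately crude and my own, not anything the report proposes:

```python
import re

MAX_WORDS = 24  # Hilary's ceiling per sentence

def long_sentences(text: str, limit: int = MAX_WORDS) -> list[str]:
    """Return sentences whose word count exceeds the limit."""
    # Naive split on sentence-ending punctuation; good enough for a draft check.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [s for s in sentences if len(s.split()) > limit]

draft = ("You start each paragraph with a sentence summing up what follows. "
         "If the condition applies, state the substantive proposition first and "
         "put the condition after it, because readers want the answer before the "
         "caveats and will otherwise lose the thread entirely before they arrive at it.")

for s in long_sentences(draft):
    print(f"{len(s.split())} words: {s[:60]}...")
```

Reading-age scoring is harder to do well, but established readability formulas (Flesch–Kincaid and similar) can be bolted on in the same way.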

Lesson 4: test and test again: it ain’t fire and forget

‘Many initiatives said that it was crucial to build in space in the project design for flexibility and iteration. For example, one interviewee described frustration that every time they tested their tool with users, they received feedback “that it didn’t look good.” They described a process that took more than one year, noting that although it was frustrating: “It was definitely worth the time. I wish we’d known that patience would pay off.”’

An important consequence is that ‘Project budgets that include funds to make alterations as the project progresses are likely to be better equipped to manage this need for iteration.’ Funders are resistant to this but it is crucial. Just think how often Apple or any of the major providers fiddle with their products. 

Lesson 5: remember that you have got to sell your product even if you are a not-for-profit

‘Several interviewees felt that initiatives that aimed to produce technology tools aimed at the public did not invest enough funds in marketing the tool – with one US-based interviewee even suggesting that marketing should make up as much as 50% of the budget for an initial version of a tool.’ 

This is so important. You won’t get 50 per cent of the budget for marketing from any funder that I have ever encountered, but it invites a good exercise: what would you do if you had that money, and what would be the minimum acceptable?

Lesson 6: sort out your position on privacy

Privacy increasingly raises questions for technology sites – even ones with beneficial goals like access to justice. One project cited in the report found that users did not want to give personal information and changed its approach. If you do collect data on users then you need to comply with best practices in relation to keeping it – particularly if it could conceivably be privileged.
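The report does not prescribe a technique, but one common way to honour users’ reluctance to hand over personal information is to collect the minimum and pseudonymise what you must keep. A sketch under my own assumptions – the field names are invented for illustration:

```python
import hashlib

# Fields we choose never to store; everything here is an illustrative assumption.
SENSITIVE = {"name", "address", "phone"}

def minimise(record: dict, salt: str) -> dict:
    """Keep only non-sensitive fields; replace the user id with a salted hash.

    A salted hash of a small id space is weak pseudonymisation at best, so
    the salt must itself be kept secret and rotated.
    """
    kept = {k: v for k, v in record.items() if k not in SENSITIVE}
    if "user_id" in kept:
        digest = hashlib.sha256((salt + str(kept["user_id"])).encode()).hexdigest()
        kept["user_id"] = digest[:12]
    return kept

raw = {"user_id": 42, "name": "Jane", "phone": "07700 900000", "issue": "housing"}
print(minimise(raw, salt="rotate-me"))
```

If the material could conceivably be privileged, this kind of minimisation is a floor, not a ceiling: take proper advice on retention and access controls.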

Lesson 7: benchmark your ideas against comparable projects in your own and other jurisdictions

There is no point in reinventing the wheel. You have to identify what other providers in your jurisdiction are doing and similar projects abroad. ‘Most initiatives are designing and implementing their projects without an awareness of comparable projects outside their country, either as a source of ideas or to provide lessons on good practices …  Interviewees were often unaware of networks led by organisations such as the Hague Institute for Innovation of Law and Namati … There is now an opportunity to highlight and share experiences of successful technology use in legal empowerment more widely.’

Lesson 8: measure your impact

This is crucial and often incompletely done. The process is helped by setting clear objectives in the first place because you can then measure performance against them and you have done the hard work on methodology at the beginning. The Engine Room reported ‘Haqdarshak’s [an Indian project designed to increase welfare take up] evaluation framework tracks every application made for government benefits, as well as every application’s progress and outcome. Haqdarshak’s founder mentioned that tracking these data helps when talking with funders and potential investors. The organisation has a support centre with a dedicated three-person team who randomly call people who have had their cases logged in the system, to get data on their experience and the progress of cases.’
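Haqdarshak-style tracking, reduced to its core, is just recording a status for every application and aggregating the results. A minimal sketch – the statuses and figures are invented for illustration, not Haqdarshak’s data:

```python
from collections import Counter

# One row per application for government benefits, each tracked to an outcome.
applications = [
    {"id": 1, "status": "granted"},
    {"id": 2, "status": "pending"},
    {"id": 3, "status": "granted"},
    {"id": 4, "status": "rejected"},
]

by_status = Counter(a["status"] for a in applications)
total = len(applications)
summary = ", ".join(f"{k}: {v} ({v/total:.0%})" for k, v in by_status.items())
print(f"{total} applications – {summary}")
```

Even a summary this simple is the kind of figure that, as Haqdarshak’s founder notes, helps when talking with funders and potential investors; the follow-up calls to users add the qualitative layer on top.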
