Victoria Legal Aid (VLA) has done us all a favour in publishing a ‘warts and all’ analysis of a tech project which did not work. The report is an honest appraisal of a project to develop an app which cost just under $50,000 and proved a dud. So rare is such transparency that it should be prized. We can all learn inestimably more from it than from the usual relentless optimism and plausible excuses of most end-of-project reports.
VLA decided to build an app, Below the Belt, for young people aged 12-18, covering issues relating to ‘consent and age of consent, sexting and cyberbullying’. The content was nicely set out and contained things like age of consent calculators, tests, quizzes and a messaging function that allowed registered users to communicate with each other. It was scoped in 2011-12; planned and implemented in 2012-13; went live in November 2013; and closed in September 2014. At the beginning, it clearly attracted some enthusiasm among those concerned with community legal education. It was VLA’s first attempt at an app: ‘there was excitement …’ The consortium behind it contained five legal aid commissions and two community legal centres. During its short life, 1095 people installed the app (which I thought might be a pretty good response rate but was short of the 5000 planned) but only 40 created accounts. Damningly, 849 of the installers uninstalled. That left a net cost per remaining instal of $42. Alas, the app was ‘relatively cost inefficient’. Indeed, it was a flop.
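The engagement arithmetic behind those figures is worth making explicit. Here is a hypothetical back-of-envelope calculation (not part of the VLA report) using the numbers quoted above:

```python
# Engagement figures quoted from the Below the Belt evaluation.
installs = 1095      # total installs (the plan was 5000)
uninstalls = 849     # installs later removed by users
accounts = 40        # users who registered an account

remaining = installs - uninstalls     # net installs still on a phone
retention = remaining / installs      # share who kept the app
signup_rate = accounts / installs     # share who registered

print(f"net installs:  {remaining}")       # 246
print(f"retention:     {retention:.1%}")   # 22.5%
print(f"sign-up rate:  {signup_rate:.1%}") # 3.7%
```

Fewer than a quarter of installers kept the app, and under four per cent ever registered; these are exactly the kinds of figures a printed booklet could never have yielded.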
What makes VLA’s transparency about the difficulties the more remarkable is that there was an easy excuse. The app was deliberately created for the Android operating system. However, upgrades to Android fragmented the platform and the app became unusable on upgraded phones. This was recognised in advance as a potential risk, but it was assumed that ‘young people would use older and cheaper Android devices’ running older operating systems. Alas, the users appear to have upgraded and the app was, literally, a waste of space.
To its credit, VLA admits that the difficulties lay deeper than the fragmentation of the app. Some related to the state of the emerging market for social media. On the messaging front, the app could not compete with the increasing dominance of products like WhatsApp, Instagram and Facebook during the project’s gestation. It also became apparent that apps have to be produced for both Android and iOS. In addition, the advertising budget was pitifully low compared to the commercial norm.
The evaluation calls for greater attention to what it terms the ‘value proposition for the client’ – or, in plainer English, the point. It argues that more attention should be given to identifying the basic purpose of the app. So, you should:
‘* identify the issue the client is facing;
* confirm if education is a solution;
* scope options, including [their] viability;
* test options and assumptions;
* decide on an option or decide not to proceed with any options.’
These are restatements of the classic lesson from any failed project and, indeed, many successful ones – spend more time at the beginning working out what you are doing and why. But they are as valuable to remember in relation to technology as anything else.
VLA is now asking whether an app was the best way to communicate the kind of information which was its subject in this case. The evaluation notes that most young people associate apps with entertainment, not education. There may be a wider question about how suitable apps are, as a form, for education more generally. The report notes that some of the content has been recycled in an age calculator for VLA’s website and for a young person’s programme. There is some really good material on bullying and sexting on the web already (for example, that from the UK National Crime Agency) and it is difficult, perhaps, to see how an app might be superior to these existing sites. But the report is right to emphasise that we need to know from experience what works in an app or on the web and what does not.
Crucial to this evaluation are the precise numbers provided by technology. They give you nowhere to hide. If this had been a booklet, the good news of distributing over 1000 copies would have been untarnished by the knowledge that 800 of them ended in the bin. You might have guessed that, but you couldn’t prove it. Now, argue that you need a Twitter account and we can all see how many followers you get. Persuade a funder of the need for a website and Google Analytics means that your performance can be measured with precision. Set up a blog and you are as good as the numbers who read your last post. And we can see not only the gross number of your hits but the net figures (less those who don’t stay) and how long, and where, visitors spend their time. We need, internationally, to be able to predict with relative accuracy how much trade we can expect from a website or an app. This is going to affect where money should be spent. And, crucially, it will assist us in the tricky judgement about how good technology really is compared with other methods of communication.
Those involved in access to justice have limited budgets. We tend to have one shot at success in a way that rarely limits commercial operations. So, the sharing of evaluations like this is really valuable both for the organisations involved and for a wider audience. And at the heart of the evaluation are the numbers. Every nerve of every experienced project pitcher will scream at the prospect of publicly proclaiming detailed targets and performance against them. But, it is really essential. Under all the guff, how did you really do? Congratulations to VLA for telling us so transparently. You might almost say that the value of the project was saved by the evaluation of its failure.
PS Since you clearly have to put your money where your mouth is, watch out for a forthcoming post on the predicted and actual figures for this website and the associated Twitter account.