Remote Courts and the consequences of ending ‘practical obscurity’

A report from a US privacy advocacy group – the Surveillance Technology Oversight Project, or STOP – picks up on some of the concerns Katherine Alteneder expressed in an interview published earlier this week: the balance between the conflicting demands of open justice and privacy.

Practical Obscurity: an odd but powerful notion?

The report points out that ‘in making court proceedings easier to access remotely, there is a loss of practical obscurity – an idea recognizing that “there is a privacy interest in information that is not secret but is otherwise difficult to obtain.”’ This observation demonstrates that jurisdictions must do more to examine the privacy/open justice balance in relation to remote courts. In the rush to adapt to Covid-19 or, in our domestic case, to make savings, we cannot sweep difficult issues under the carpet: they need to be addressed at the point of initial implementation.

Open justice is historically an important constitutional principle, but it developed, as the report puts it, with ‘natural barriers to third-party observation such as time and travel built in, which are missing when courts utilize platforms like YouTube and Zoom to broadcast proceedings’. Online, anyone can watch ‘and nothing technically prevents a viewer from recording a hearing for personal use.’ This presents new problems for us to consider.

Privacy

‘If a virtual court is open to the public for remote viewing and security cannot control for viewers’ uses of video capturing technology, there is potential for rebroadcasting, recording testimony, and photographing shared evidence by anyone with internet access, which violates party and witness privacy rights. Improper recordings can occur even if public access is controlled and monitored, like in New York, because judges or court officers may not be able to determine or limit the participants involved in a proceeding. With any number of participants, a virtual court may struggle to determine if someone is making an unauthorized recording, let alone identify whom and impose proper sanctions.’

As a result, the correct response to the demands of open justice is not simply to give the press and public unconstrained access to video proceedings. And there are other privacy issues as well – such as the obvious need for some confidential form of lawyer-client interviewing capacity.

Deepfakes and digital manipulation

Less explored, but probably increasingly possible, is the potential use of ‘deepfake’ technology to manipulate the apparent identity of witnesses or parties. ‘Programs such as Avatarify – publicly available as code on Github – superimpose another’s face onto a user in real time and are already being utilized on conferencing platforms’. And ‘AI software companies like SenseTime can create deepfakes from audio sources by using a third party’s audio clip and video of the user to generate footage of the user saying the words from the recording. This can not only allow a person to fabricate their identity but can allow a litigant or witness to use their own voice to make the claim that they said something different than what the opposing party claims.’ There is considerable potential here for the fabrication of online evidence by unscrupulous litigants or dishonest defendants.

Some ways of addressing false video identities may not be universally palatable. The Beijing Internet Court apparently requires litigants to set up an online account using their biometric information and data on their identity card. That won’t stop deepfakes, but it does put some constraints on who is involved. Such rigorous state regulation would not be possible in most other jurisdictions, so we need to think through alternatives.

A further potential problem comes with uploading evidence. ‘Litigants must ask themselves if they are prepared to upload personal files … via online court websites run both by government agencies and an opaque web of private vendors.’ There absolutely must be cast-iron assurances that all data about court proceedings belongs to the courts, not to the vendors of any technical equipment.

The Digital Divide

The US report considers much the same practical issues of the impact of the digital divide as have come up in the UK: ‘Perhaps the most obvious area of concern in moving court hearings and trials online is the digital divide, which perpetuates unfairness in access to proceedings or timely case resolutions due to disparities in tech ownership or familiarity. A low-quality internet connection or outdated hardware can result in transmission delays, degraded sound and image quality, and loss of connectivity, making a litigant look less truthful and persuasive.’

The report argues that there are, as would now surely be generally accepted, two aspects to the divide – what it calls an access divide and a skills divide: ‘Poor literacy in internet use and digital technology may affect not only procedural efficacy, but the perception of fairness in virtual court. Studies have shown that both case outcome and the ease of use of an online system correlate to litigants’ perceived fairness of court proceedings and their emotions towards court officials.’

Best Practice

And, finally, STOP has a list of suggested best practices – which we might begin to accumulate from around the world into one overall list to guide us everywhere:

1. ‘Courts should clearly communicate what technologies they use and how individuals’ personal information will be impacted, empowering participants to hold operators of virtual court to account for errors and abuses. New technology should also ensure that mistakes can be quickly detected and rectified.’

2. ‘Courts must go beyond conventional terms of service, ensuring that every person whose privacy is impacted by virtual courts can provide truly informed consent. … As an example of what not to do, Michigan Cyber Court’s user agreement long stated that “… parties should assume that information provided through the course of the mediation will not be kept confidential, unless otherwise agreed.”’

3. ‘Courts should be especially sensitive to the confidentiality of litigants and evidence, such as conversations protected by the attorney-client privilege and evidence subject to a protective order.’

4. ‘An independent government watchdog must conduct routine and impartial security audits. There must be contingency plans for malfunctions and system failures, both during virtual proceedings and in the long term.’ In the UK, the monitoring body could be the Information Commissioner.

5. ‘Attorneys must also assess potential privilege issues triggered by remote proceedings. Courts have held in the past that third-party electronic monitoring may reduce a party’s reasonable expectation of privacy, and courts should address these issues on the front end by agreement with participants and the technology provider before engaging in remote proceedings to mitigate any risk.’

Now not later

The issue is no longer whether we proceed to remote courts. It is the terms on which we do so. And it would be far better to address difficult and messy areas now rather than in the context of post-implementation litigation and scandal.
