Automatic Facial Recognition (AFR) was the focus of a recent High Court ruling on a challenge by Mr Bridges to its use by South Wales Police. The court ruled that – in this specific case – the use was lawful, but the judgment raises interesting questions for public authorities considering the use of this technology on their land or in their buildings.
AFR is a sophisticated form of capturing and analysing facial features that goes much further than the passive, indiscriminate scanning we are familiar with from the use of CCTV. In the Bridges case, facial data was extracted from the scanned image and compared against a number of different watchlists.
So far AFR has mostly been used in a discreet and targeted way by the police at events such as football matches, festivals and parades, and the Bridges case recognised that there are many potential positives to its use: in that case it led to a number of arrests (including of someone who had previously made a bomb threat) and freed up police resources. However, the case also raised legal issues in three distinct areas:
1. Human Rights
The Bridges case confirmed that use of AFR by a public body does engage – and infringe – the 'right to privacy' under Article 8 of the European Convention on Human Rights, even though the images were captured in a public space and retained for a very short time. In this case the use was held to be justified because the police were transparent about it, the use was for a specific purpose, for a limited time and over a limited spatial footprint, and the data was not retained. The court also recognised that the positive outcomes from the use of AFR could not have been achieved with CCTV.
2. Data protection
The court considered the specific scope of data protection powers and responsibilities of a law enforcement body under Part 3 of the Data Protection Act 2018, so much of the ruling is simply not relevant in a wider context. However, the court did establish that AFR data 'clearly does comprise personal data' for the purposes of Part 3 of the Data Protection Act 2018, and that 'it is beyond argument' that it is also 'biometric data', requiring specific consent to be given for its use.
3. Public Sector Equality Duty
Although the court said that there was an 'air of unreality' about this part of the claim, it is interesting to note that the police carried out an Equalities Impact Assessment as part of the decision to use AFR. In addition, they carried out a human check on all potential matches – which the court described as an 'important failsafe' (156). Finally, the court acknowledged that there is some evidence that algorithmic datasets can be biased, and noted that, in view of the evidence produced, the police might consider 'whether the NeoFace Watch software may produce discriminatory impacts'.
Conclusion: think before installing
As local authorities seek to do more with fewer resources, and the scope of technology improves, the use of AFR on publicly owned land and buildings will seem increasingly attractive. However, this case - perhaps the first of many - highlights the need to think hard before using such technology: whether the benefits justify the interference, whether adequate safeguards are in place, and whether equalities impacts have been given the 'due regard' that the law demands.
Sue Chadwick is strategic planning advisor for Pinsent Masons LLP