
In this article I discuss the increasing prevalence of Police Facial Recognition Cameras, whether their use infringes civil liberties and rights, and what you can do if you have been wrongfully arrested or detained as a result of Facial Recognition technology.
This is a continuation of last week’s blog post on the increasing number of arrests based on Police use of Facial Recognition Technology, and the wrongful arrests which may arise as a result of its use.
Is there Ethnic Imbalance in the way Live Facial Recognition Works?
A Live Facial Recognition (LFR) camera takes digital images of a crowd moving through the “zone of recognition” in real time.
The LFR software then automatically detects individual human faces amongst the crowd, and extracts facial features from the image to create a “biometric template”.
The LFR software (the Genie in this particular bottle) then compares the biometric template with those of the faces on the pre-determined Police watch list.
The facial images from the crowd are compared against the facial images from the watch list, and the LFR system generates a “similarity score”, a numerical value indicating the extent of similarity between the faces. The operators of the system will have set a “threshold value” to determine when the LFR software will generate an alert indicating a possible match.
The decision-making process then reverts from the robot to the human: Police Officers must review the alerts and make a decision as to what, if any, further action to take.
The Police guidance set out in the Authorised Professional Practice for live facial recognition seeks to reassure us – “In this way, the LFR system works to assist Police personnel to make identifications, rather than the identification process being conducted solely by an algorithm.”
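For the technically curious, the process described above can be boiled down to a few lines of code. The sketch below is purely illustrative and uses invented names and values (WatchlistEntry, check_face, a cosine-similarity score standing in for a vendor’s proprietary scoring); it is not any Force’s or supplier’s actual software, but it makes concrete what an “alert” really is: a similarity score clearing a configurable bar.

```python
from dataclasses import dataclass

# Illustrative only: real LFR systems use proprietary face-embedding
# models. Here a "biometric template" is simply a vector of floats.

@dataclass
class WatchlistEntry:
    name: str
    template: list[float]  # hypothetical biometric template


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity, standing in for a vendor's similarity score."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)


def check_face(crowd_template: list[float],
               watchlist: list[WatchlistEntry],
               threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return alerts for watchlist faces scoring at or above the threshold.

    An alert is only a *possible* match: under the APP guidance, a human
    officer must review it before any further action is taken.
    """
    alerts = []
    for entry in watchlist:
        score = similarity(crowd_template, entry.template)
        if score >= threshold:
            alerts.append((entry.name, score))
    return alerts
```

On this sketch, raising the threshold towards 1.0 makes alerts rarer but more reliable, while lowering it widens the net. Where that bar is set is, as we shall see, where the equality questions begin.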
Early, legitimate concerns that LFR algorithms were mis-identifying black people’s faces more often than white people’s, and women’s more than men’s (see the Bridges judgment of the Court of Appeal, 2020), have to some degree been assuaged, and the pace of technological change is such that these algorithms are surely only going to become more accurate with each passing year. The Met Police point to a report commissioned from the National Physical Laboratory, which seems to confirm that LFR systems, when used at the highest settings, now produce no statistically significant difference in outcomes between demographic groups (i.e. people of different genders and ethnicities).
But the devil, as always, is in the detail. LFR algorithms have configurable settings for face detection, with good-quality frontal face images of sufficient size giving the most accurate identification results. The settings can be ‘lowered’ to increase the number of faces processed, but such a relaxation of the criteria, allowing poorer, less accurate images to be used by the system, increases the “false match rate”.
As I highlighted above, at the highest ‘face match’ settings there were no false positives, but at lower settings (casting the net wider) not only did “false positives” occur, they were disproportionately higher for black subjects than for Asian or white subjects: “The demographic variation in the nominated score distribution does not affect equitability if settings are such that the chance of a false alert is very low. However, if settings allow for a higher number of false alerts, these are likely to occur disproportionately within black or Asian ethnicities.” (NPL report, 9.3).
The NPL report goes on to state that the “False Positive Identification Rate” (FPIR) is equitable between gender, ethnicity and age at a face-match threshold of 0.6 and above. At face-match thresholds lower than 0.6, FPIR ‘equality of outcomes’ varies between demographic groups, dependent on the settings of the operational deployment, including the size and composition of the watch list and the number of crowd subjects passing through the zone of recognition during deployment. This led the NPL to caution: “Given our observations on the demographic variation in FPIR, we would recommend, where operationally possible, the use of a face match of 0.6 or above to minimise the likelihood of any false positive and adverse impact on equitability” (NPL report, 1.4.6).
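The effect the NPL is describing can be demonstrated with a toy simulation. The numbers below are entirely invented for illustration (they are not the NPL’s data, and real score distributions are proprietary), but they show the underlying mechanic: because the scores of genuine matches and of innocent passers-by overlap, each step down in the threshold lets through many more innocent faces.

```python
import random

random.seed(42)

# Invented similarity-score distributions, for illustration only:
# genuine matches tend to score high, innocent passers-by score low,
# but the two distributions overlap at the margins.
genuine_scores = [random.gauss(0.75, 0.08) for _ in range(100)]
impostor_scores = [random.gauss(0.40, 0.12) for _ in range(10_000)]

def false_positive_rate(threshold: float) -> float:
    """Fraction of innocent faces that would wrongly trigger an alert."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

for threshold in (0.70, 0.60, 0.50, 0.45):
    print(f"threshold {threshold:.2f}: "
          f"false positive rate {false_positive_rate(threshold):.2%}")
```

On these made-up figures the false positive rate climbs steeply as the threshold drops below 0.6, which is precisely the trade-off behind the NPL’s caution: casting the net wider catches more genuine matches, but at the cost of more innocent faces being flagged, and, at lower settings, flagged unevenly across demographic groups.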
All of this must be considered through the lens of the Equality Act 2010 – Chief Officers must demonstrate compliance with their non-delegable Public Sector Equality Duty (PSED) under Section 149 of the Equality Act 2010, particularly in terms of taking steps to “rigorously” understand and monitor their LFR system’s algorithmic performance in relation to statistical accuracy and demographic variation (which, as highlighted above, depends on the settings that its operators have implemented).
As the APP guidance enjoins, Forces must “Satisfy themselves that everything reasonable that could be done has been done to ensure that the software does not have an unacceptable bias on any basis, including on the grounds of race, sex, religion or belief. No system is ever 100% non-biased. There is always something within the system (and operator). Forces need to identify and understand the degree to which this occurs and then mitigate against this.”
Clearly, therefore, this is still an area of some concern, and needs to remain on our own – shall we say – ‘watch list’.
Practice not Theory: Legal Remedies for a ‘False Positive’ Arrest
As the views I have expressed in these posts make clear, I think a good argument has been made for the usefulness of this technology in modern-day policing, and so I am prepared to concede the theory of the case. But that doesn’t alter the fact that, whenever a person consults me because they have been unlawfully arrested or detained by the Police, I will bring to their case an analytical attention to detail, a questioning of the facts to find out what went wrong, and a refusal to stop until the full truth has been uncovered. This is because whenever theory is put into practice, mistakes and abuses can occur; and when they do, I offer my 30 years of success in litigation against the Police as proof that, if there is a path for a wrongfully arrested person to achieve restitution and compensation, I will find it.
The checks and balances on this most recent of Policing tools already exist, and so, if you have been wrongly arrested on the basis of a supposed facial recognition match, don’t hesitate to contact me for advice.
My daily job is holding Police power to account, and in regard to this new technological power, the following considerations are particularly pertinent –
- have the LFR cameras been used in an overt way, or have they been deployed in a manner constituting covert surveillance, thereby potentially breaching the Regulation of Investigatory Powers Act 2000 (RIPA)?
- have the algorithmic settings unfairly increased the risk of non-white faces being misidentified (clearly a problem which still exists, despite significant improvements); or indeed have they been set so low, or provided with such inadequate material in the form of the ‘watch list’ (blurry/low-resolution or older images), that the risk of “false positives” for all demographic groups is too high?
- all the hazards and risks inherent in the human part of the process, i.e. decision-making and the deployment of Officers to respond to potential face matches:
- notwithstanding what the algorithm has indicated – is it reasonable to believe that a person identified by the LFR system actually is the person on the watch list?
- is it reasonable, in all the circumstances of the case, to use force or threats of force to detain the suspected ‘match’?
- have the Officers on the ground actually got the right person? – errors in this regard are bound to occur, just as much as they do in the deployment of ‘stingers’ against innocent people’s vehicles.
- the Police and Criminal Evidence Act 1984 – in particular Code G (governing the rules of a lawful arrest) and Code D (identification procedures)
- the Human Rights Act 1998 – use of LFR may engage Article 8 (the right to respect for private and family life), Article 9 (freedom of thought, conscience and religion), Article 10 (freedom of expression) and/or Article 11 (freedom of assembly and association)
- the Data Protection Act 2018
- has the authorisation for deployment of LFR been given by an Officer of senior rank (generally, not below the rank of Superintendent), defining the boundaries of time/ geography for the deployment and sufficiently justifying the same in writing?
The Authorised Professional Practice also sets out the ‘paper trail’ of documentation which must be created in support of each LFR operation, and which a lawyer such as myself would seek to obtain and interrogate in the event of a wrongful arrest occurring. These include –
- LFR Standard Operating Procedure, including the criteria for watch lists; sources of imagery; guidance for when an alert is generated; arrangements to ensure that the deployment is overt (e.g. signage); and the setting of the Force’s false alert rate in policy, so that the success of each deployment can be assessed against these metrics to ensure ongoing proportionality of use and reassurance to the public
- the written authority document for each LFR operation, outlining the aim of the deployment and, in compliance with the Human Rights Act, explaining how and why the deployment is necessary (not just desirable) and proportionate
- Data Protection Impact Assessment (DPIA): explaining what the “pressing social needs” are for each particular LFR deployment, why sensitive processing is needed to achieve the legitimate aim, and why the purpose cannot be achieved through less intrusive means
- Equality Impact Assessment (EIA)
- Community Impact Assessment (CIA)
- LFR training materials so that those Officers and staff using the technology fully understand its technical capabilities (and limitations) and how to properly respond to an alert.
In Conclusion
We can’t alter the pace of technological change, but we can ensure the integrity of fundamental rights, upon which we should accept no encroachment.
From here on the front lines, I am confident that Police misconduct experts such as myself, who:
- know what they are doing
- know how to obtain the necessary disclosure from the Police; and
- can read between the lines to build our clients’ cases
will be able to secure justice and win compensation for those wrongfully arrested as a result of Facial Recognition technology, using the existing laws and policy safeguards which ringfence its deployment, as set out above.
But as ever, those laws and rights must be exercised in practical terms to ensure that they are not lost, and that they fulfil both their primary purpose of compensating the wronged individual and their secondary purpose of protecting others from similar harm, by policing the Police.
Let the Police have the best modern tools they want; lawyers like me will use the best traditions of the law to police them when they misuse those tools. If you’ve been wrongly identified by facial recognition, seek expert legal advice as soon as possible: hold power to account, don’t unplug it.
How you can help me
I hope that you have enjoyed reading this week’s blog post and the many others available on this website. If you have, then I would like to ask you a favour – in a world in which large and non-specialist law firms (generally from a personal injury background) are increasingly throwing huge marketing budgets into online advertising in order to ‘capture’ Actions Against the Police clients – I need your help to ensure that those in need of real expert advice come to the best place for representation. If you value the insights and expertise which I share on this blog, and the results which I have achieved for the people whose stories are recounted here, please post a positive review on Trustpilot to get the word out. Every 5 star review I receive makes a big difference in helping those who need the right advice to come to the right place. Thank you!

















