[Image: CCTV camera. Credit: Nathy Dog for Unsplash]

Minderoo Centre responds to the publication of the National Physical Laboratory's report on the Met's deployment of facial recognition.

We welcome the publication of Operational Testing of Facial Recognition Technology by the Metropolitan Police Service (MPS) and South Wales Police (SWP) with the National Physical Laboratory (NPL).

The tests highlight a substantial improvement in Live Facial Recognition accuracy and answer calls from civil society groups, community leaders and policy makers for AI systems used in public-agency decision-making to be subject to independent scrutiny and testing.

However, there is still work to be done in ensuring that the use of these systems is fair, transparent, and accountable.  

The study shows that the live facial recognition system that the Metropolitan Police uses has a very low false-positive rate, with statistically insignificant variation across race, age, and gender, when used at a face-match threshold setting of 0.6 or above.

At lower thresholds, the system's performance differs across demographic groups.

Knowing these threshold effects is valuable and should guide Met policy going forward. However, it is concerning that this research happened only after many facial recognition deployments, rather than proactively before them. For example, the ten Met Police trial deployments from 2016 to 2019 used a threshold setting of 0.55, which, based on the NPL's research findings, would have produced worse performance for Black individuals than for White individuals.
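To make the role of the threshold concrete, the sketch below shows how a face-match threshold filters watchlist alerts. The similarity scores here are entirely hypothetical illustrations, not data from the NPL study; the point is only that raising the threshold from 0.55 to 0.6 suppresses lower-confidence matches, which is where the demographic disparities were found.

```python
# Hypothetical similarity scores between detected faces and watchlist
# entries (illustrative values only, not drawn from the NPL report).
candidate_scores = {"person_a": 0.72, "person_b": 0.57, "person_c": 0.61}

def alerts(scores, threshold):
    """Return watchlist entries whose similarity score meets the threshold."""
    return sorted(name for name, score in scores.items() if score >= threshold)

# At the 0.55 threshold used in the 2016-2019 trials, all three faces
# would trigger an alert; at 0.6, only the higher-confidence matches do.
print(alerts(candidate_scores, 0.55))  # ['person_a', 'person_b', 'person_c']
print(alerts(candidate_scores, 0.6))   # ['person_a', 'person_c']
```

The design point is that the threshold is a policy choice, not just an engineering parameter: a lower setting produces more alerts, but it is precisely those marginal, lower-confidence alerts that exhibited the demographic differences the NPL measured.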

Moreover, we need to consider ethical and legal concerns that go beyond the issues of accuracy and bias that the NPL evaluates. We need to understand and consider the impact of facial recognition on policing practice and on communities. This is why a socio-technical audit, like the one that we have developed, is so important.

Recently, we released a socio-technical audit tool to help outside stakeholders evaluate the ethics and legality of police use of facial recognition.

This audit tool can be used to understand the extent to which facial recognition deployments meet the minimum ethical and legal standards. The audit can be used to gather evidence and inform strategic litigation efforts to scrutinise police use of facial recognition. 

We encourage others to engage with the broad range of questions about ethics and legality that audit tools like ours bring together. For example, even if facial recognition is accurate, is it being used disproportionately on people of colour in a way that exacerbates over-policing? How does facial recognition impact the rights to privacy and freedom of assembly? Is there accountability and independent oversight over the use of facial recognition?

Considering the ethics and legality of facial recognition is essential if we are to protect fundamental rights.

Following the release of the NPL’s study, we call for three actions:

  Social capacity: We need to build social capacity, through community engagement and dialogue about potential deployments.

  Public understanding: We need to increase public understanding of facial recognition technology. Our socio-technical audit can help external stakeholders to assess the ethics and legality of police use of FRT.

  Regulation and oversight: We need further regulation. Much work is still to be done to address accountability and public capacity for oversight of facial recognition technology.

Furthermore, to protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology and translate high-level values and principles into practice.

We need to consider not just the technologies themselves, but the broader structures of our society.


For further information, please email minderoo@crassh.cam.ac.uk