A CCTV camera (Credit: Nathy Dog)

This research paper, presented at FAccT ’23, evaluates police use of facial recognition technology in the UK.

Algorithmic audits are increasingly used to hold people accountable for the algorithms they implement. However, much work remains to integrate ethical and legal evaluations of how algorithms are used into such audits.

In this paper, we present a sociotechnical audit to help external stakeholders evaluate the ethics and legality of police use of facial recognition technology.

We developed this audit for the specific legal context of England and Wales and to draw attention to broader concerns, such as whether police consult affected communities and comply with human rights law.

To design this audit, we compiled ethical and legal standards for governing facial recognition, based on existing literature and feedback from academia, government, civil society, and police organizations. We then applied the resulting audit tool to three facial recognition deployments by police forces in the UK and found that all three failed to meet these standards.

Developing this audit also yielded insights for researchers designing their own sociotechnical audits: how audits shift power, how to make audits context-specific, how audits reveal what is not transparent, and how audits lead to accountability.

Authors

Evani Radiya-Dixit

Gina Neff

FAccT ’23

Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency