Live Facial Recognition: A Necessary Debate That Demands Real‑World Accountability
The Home Office’s consultation on a new legal framework for live facial recognition (LFR) and broader biometric technologies is more than another policy exercise. It is, as the Biometrics and Surveillance Camera Commissioner recently described it, a “once‑in‑a‑generation opportunity” to get this right. And getting it right means placing governance, ethics, and public trust at the core, not bolting them on as an afterthought.
For those of us who have spent decades helping organisations navigate the messy reality of security, governance, and risk, one truth remains constant: if you rush the foundations, the structure will fail. LFR is no exception.
Principles First, Not Technology, Not Market Pressure
The Commissioner’s latest response to the consultation reiterates something many of us in the security world have been saying for years: a future‑proof framework must be purpose‑driven, principled, and technology‑agnostic. Regulation that chases specific tools is regulation that will be obsolete before the ink dries. At Advent IM, we see daily how quickly technology evolves, especially when AI is involved. The right approach isn’t to regulate the tool, but the intent, the boundaries, and the accountability mechanisms around its use. When governance focuses on principles like necessity, proportionality, and transparency, it remains relevant regardless of whether tomorrow’s tech looks like today’s or not.
Data Protection Is Not a Box to Tick—It Is the Bedrock
The ICO’s most recent response to the Home Office was crystal clear: data protection law is not something to be replaced, diluted, or worked around. The ICO has repeatedly emphasised that any new legal framework must build on, not supersede, existing GDPR and DPA safeguards, precisely because these safeguards remain essential to preventing misuse, mission creep, and discriminatory outcomes. From our vantage point at Advent IM, supporting organisations across the public and private sectors, we know too well how difficult it can be for agencies to consistently meet even basic information governance obligations. If you can’t maintain a robust DPIA or patch a critical system, you are not ready for high‑risk biometric processing without stringent oversight. This isn’t criticism; it’s reality. And real-world risk management must be grounded in reality.
Oversight: Coherent, Competent, and Independent
The idea of creating a “one‑stop‑shop” biometrics regulator has been floated—one that brings together the roles of the BSCC, ICO, Forensic Science Regulator, and others. Both the BSCC and ICO have warned that doing so poorly would risk fragmentation, contradictory decisions, and a dangerous dilution of expertise. Oversight must be:
Coherent: a single, joined‑up framework, not a patchwork of overlapping or contradictory mandates.
Competent: staffed with genuine expertise in biometrics, data protection, and forensic science.
Independent: free to scrutinise and challenge government and policing without conflict of interest.
Anything less is not oversight; it is administrative theatre. And theatre doesn’t safeguard rights or build trust.
Public Trust Will Determine Whether LFR Succeeds or Fails
Public trust cannot be mandated; it must be earned. And past deployments of LFR, particularly those with inadequate DPIAs, unclear watchlist governance, or poor transparency, have eroded that trust.
The regulators have been aligned on this: transparency is non‑negotiable.
That means:
Publishing DPIAs before deployment, not after challenge.
Clear, publicly available criteria for watchlist inclusion and removal.
Open communication about where, when, and why LFR is being used.
Advent IM has long advocated for transparency as a security control in itself. When organisations communicate openly, they reduce uncertainty, build credibility, and counter misinformation. The same applies here, but with far higher stakes.
A Child’s Face Should Never Be an Afterthought
Recent investigative findings and parliamentary scrutiny have highlighted something deeply concerning: children appearing on police LFR watchlists, sometimes without clear justification or consistent safeguards. Regulators, including the BSCC and Children’s Commissioner, have expressed strong concern about the proportionality and necessity of applying such intrusive capabilities to minors. Any framework that fails to protect children robustly is already failing.
Get the Governance Right—or Don’t Deploy at All
LFR and biometric technologies can absolutely support legitimate policing objectives. When deployed lawfully, transparently, and under rigorous oversight, they can provide operational value. No one disputes that.
But value cannot come at the cost of rights, trust, or accountability. The Commissioner’s message is clear: a strong regime must be principled, future‑proof, and restrictive by design, ensuring activities outside the explicit legal framework are prohibited unless later approved with full parliamentary scrutiny. And the ICO’s message is equally clear: existing data protection principles must remain at the heart of the system, strengthened rather than side‑lined. At Advent IM, we echo both positions. Security must enable public safety and protect public rights. The two are not mutually exclusive; they are inseparable.
Conclusion: If the Foundations Aren’t Sound, Don’t Start Building
Introducing powerful technologies into policing requires equally powerful governance. Not vague assurances. Not fragmented oversight. Not retroactive justification. If the UK is to embed LFR into mainstream policing, then the public deserves a framework as robust and trustworthy as the technology is potent. Anything less is an unacceptable risk.
By Mike Gillespie, Advent IM