Submission to Home Office consultation on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies

We welcome the opportunity to provide input to the Home Office’s Consultation on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies.

Despite its proliferation across law enforcement and the private sector, the use of facial recognition technology remains largely unregulated due to the lack of a legal framework governing its use. Statewatch’s position is that the use of facial recognition and similar technologies should be prohibited in most cases and otherwise restricted to very specific circumstances.

Where such technologies are in use, they must be subject to a strict legal framework that allows for proper statutory oversight. Any legal framework must recognise the far-reaching and inherently intrusive effects of facial recognition technology and be adapted accordingly. It is clear that existing regulations on biometrics and DNA are not fit for purpose and do not provide adequate safeguards for the use of facial recognition technology.

Live facial recognition (LFR)

The use of live facial recognition technology effectively inverts the long-held principle of innocent until proven guilty. Through the use of LFR, members of the public are unwillingly subjected to biometric identity checks; their facial images are then cross-checked against custody images held in the Police National Database (PND) to identify potential suspects. The deployment of LFR is problematic on several levels:

  • individuals are often unaware that they are being scanned as part of a process of elimination;
  • it is well known that the millions of custody images held in the PND include large numbers of individuals who have never been charged with a crime;
  • evidence has shown that the technology remains highly fallible and often leads to biased or discriminatory outcomes.

The cumulative effect of the use of such technology equates to mass surveillance, through which individuals are automatically and unwittingly subject to suspicion. It is a flagrant assault on individual civil liberties and constitutes a disproportionate interference with basic human rights.[1] The use of LFR is fundamentally incompatible with the protection of human rights and, as has been shown, is open to abuse and very often leads to unjust outcomes. The challenges posed by LFR make it clear that its use cannot be effectively regulated and must, therefore, be prohibited.

Retrospective facial recognition technology (RFR)

The use of retrospective facial recognition technology is equally problematic, with a concerning lack of transparency[2] having emerged with respect to its use by police. Of particular concern is the use of footage obtained by police at demonstrations and protests, which can be retained for anywhere between 31 days and 50 years.[3] Such practices and policies pose an explicit threat to the rights to peaceful assembly and association, and to freedom of expression.

Other uses of biometric technology

The use of other biometric technologies, in particular inferential technologies, must also be explicitly prohibited. The margin of error and the invasion of individual privacy are too great to warrant any kind of exception.

In line with many civil rights activists, racial justice and equality groups, and technology experts, Statewatch calls for an immediate stop to the use of biometric technologies for law enforcement purposes and urges the British government to reverse its expansion of and reliance on this technology. The far-reaching impact of such technology ultimately means that no legal framework, irrespective of how comprehensively it is conceived, can ever be adequate to ensure sufficient safeguards and protect against its abuse.

If a decision is taken to expand the use of biometric technologies for law enforcement, we would urge government to ensure that stringent safeguards and limitations on use are provided for in any upcoming legislation. Whilst the EU’s Artificial Intelligence Act may appear tempting as a reference point, Statewatch cautions against modelling a UK legal framework on the provisions of EU legislation that ultimately fail to offer adequate protections for human rights and civil liberties. The law’s supposed safeguards are at best vague and at worst circumventable in a myriad of situations.[4]

At the very least, the rollout of facial recognition technology by the Home Office should be halted until thorough and meaningful data protection and privacy audits of police forces have been carried out. There should not be an automatic presumption that individual police forces can provide or uphold the necessary safeguards and protections. Official reviews have found, and senior officials have admitted, that police forces in England and Wales are in some cases institutionally sexist, racist and homophobic. These findings call for an urgent reassessment of the way policing works, including in relation to the protection of privacy and personal data. Rather than providing the police with new powers and access to new, invasive technologies, the government should be taking meaningful action to ensure that these deep-rooted issues are meaningfully addressed.

The use of facial recognition and similar technologies is, in and of itself, a disproportionate interference with an individual’s rights. Any justification for its use must, therefore, be set at an extremely high threshold. Broad prohibitions must apply to any use of LFR, be it in the private or public sector. Any exceptions or derogations from the prohibitions must equally be applied to all users.

If use is allowed in exceptional cases, it can only be for law enforcement purposes and must be subject to strict regulatory procedures by an independent oversight body. The oversight body must have the power to review, approve or refuse any request for the use of LFR technology as well as to investigate failures of law enforcement or other entities to comply with relevant procedures or uphold legal safeguards.

That body should be required to maintain statistics on the number of requests received from law enforcement bodies, the number and type of decisions made, and details of any investigations carried out. Those numbers and other information on the oversight body’s work should be made public in an annual report, with a presumption in favour of maximum disclosure of information on what is a matter of substantial public interest and of great importance for civil liberties.

Data obtained through biometric technology is highly sensitive and, if acquired, must be adequately protected. In this regard, we consider it highly unfortunate that recent changes to data protection law (through the Data Use and Access Act) have weakened previously-existing safeguards on law enforcement use of data.[5] If there is to be a legal framework to regulate law enforcement use of biometric technologies, those changes should be reversed. This is particularly so with regard to the removal of the requirement to maintain logs, the easing of automated decision-making and the introduction of a new national security exemption.

More generally, it is imperative that the ICO be provided with sufficient resources to deal with complaints regarding the use of biometric technologies for law enforcement quickly and effectively.

Notes

[1]

[2]

[3]

[4] See here for more information.

