Facial Recognition Technology and the European Court of Human Rights: A rights-based analysis required for State use of this “Highly Intrusive” Technology

Written by: Schona Jolly KC and Dee Masters

Overview

In Glukhin v Russia, the European Court of Human Rights (ECtHR) considered the Convention implications of States using Facial Recognition Technology (FRT), in an important judgment handed down on 4 July 2023. FRT has become widely used by law enforcement agencies in public spaces across Europe. As yet, there has been limited judicial consideration of its limits and legitimacy within the UK, but there is a growing body of international standards on biometric technologies. In Glukhin, the Court considered, in particular, the Article 8 impact of FRT in the context of mass surveillance and the Article 10 right to freedom of expression through peaceful protest, and relied on those international standards to conclude that the Russian State had breached the applicant’s Convention rights. Although the Court emphasised that sophisticated technologies, such as FRT, can be used to radically improve the efficiency of law enforcement agencies, it highlighted that “highly intrusive” technology is not exempt from careful scrutiny by the courts; a human rights-based analysis must remain front and centre. The impact of this judgment will be relevant in the UK.

What is FRT?

FRT is a biometric technology that has the aim of “recognising” a person by their face.  The FRT process has been summarised by the European Data Protection Board as follows:

“8. … by taking the image of a face (a photograph or video) called a biometric “sample”, it is possible to extract a digital representation of distinct characteristics of this face (this is called a "template").

9. A biometric template is a digital representation of the unique features that have been extracted from a biometric sample and can be stored in a biometric database. This template is supposed to be unique and specific to each person and it is, in principle, permanent over time. In the recognition phase, the device compares this template with other templates previously produced or calculated directly from biometric samples such as faces found on an image, photo or video. "Facial recognition" is therefore a two-step process: the collection of the facial image and its transformation into a template, followed by the recognition of this face by comparing the corresponding template with one or more other templates.” (Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, 26 April 2023).

One purpose of FRT is to identify persons of interest to law enforcement agencies. That is, FRT tools can process thousands of faces in order to predict whether an individual is a probable match to the image of a person of interest. This functionality explains why it is often referred to as “live FRT” – it can identify in “real time” whether an individual is likely to be of interest to law enforcement agencies.
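To make the two-step process described by the European Data Protection Board concrete, the sketch below illustrates the recognition phase only: comparing a “probe” template against stored reference templates. This is a deliberately simplified illustration, not an implementation of any real FRT system – in practice, templates are high-dimensional embeddings produced by trained neural networks, and the function names, vectors, and threshold here are all hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two equal-length template vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_probable_match(probe, reference, threshold=0.9):
    """Flag a probable match when similarity exceeds a tuned threshold.

    The threshold is a policy choice: lowering it catches more true matches
    but also produces more false positives - the accuracy trade-off that
    underlies many of the rights concerns discussed in this post.
    """
    return cosine_similarity(probe, reference) >= threshold

# Hypothetical toy templates (real templates have hundreds of dimensions).
probe = [0.9, 0.1, 0.4]
reference_same_person = [0.9, 0.1, 0.4]
reference_stranger = [0.1, 0.9, 0.2]
```

The key point for the legal analysis is that a “match” is never a certainty but a probabilistic judgment against a tunable threshold, which is why accuracy and safeguards feature so heavily in the standards the Court relied on.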

How was FRT used in Glukhin v Russia?

In 2019, the applicant travelled to Moscow and conducted a one-person protest in which he held a life-size cardboard figure of Konstantin Kotov (a political figure) holding a banner stating:

“You must be f**king kidding me. I’m Konstantin Kotov. I’m facing up to five years [in prison] … for peaceful protests.” 

Photographs of the protest were posted on social media; these were screenshotted and then added to a police database. The images were used in conjunction with FRT and CCTV surveillance footage to identify the applicant as the man who had staged the protest. Live FRT was subsequently used to identify the whereabouts of the applicant in order to arrest him.

What did the ECtHR say about the Article 8 implications of using FRT?

Article 8 ensures that everyone has the right to respect for their private and family life, home and correspondence.  There can be no interference in that right by a public authority unless it is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

There was no doubt that the use of FRT – which involved the storing and processing of the applicant’s personal data, namely his biometric data for purposes such as identification – engaged Article 8. The public nature of his protest did not exclude the application of Article 8. The Court emphasised that Article 8 protection does not exclude activities taking place in a public context, stating that it “is not limited to an “inner circle” in which the individual may live his or her own personal life without outside interference, but also encompasses the right to lead a “private social life”; that is, the possibility of establishing and developing relationships with others and the outside world”. The Court recognised that there is a zone of interaction of a person with others, even in a public context, which may fall within the scope of “private life”.

Indeed, the Court held that the systematic storage and processing of an individual’s image through FRT fell into that zone where an individual had a reasonable expectation of privacy notwithstanding the public nature of their actions: “Private-life considerations may arise, however, once any systematic or permanent record of such personal data comes into existence, particularly pictures of an identified person. A person’s image constitutes one of the chief attributes of his or her personality, as it reveals the person’s unique characteristics and distinguishes the person from his or her peers. The right of each person to the protection of his or her image is thus one of the essential components of personal development and presupposes the right to control the use of that image”.

Accordingly, the Court found that there was interference with the applicant’s Article 8 right.  The central issue was whether the interference could be justified. 

The Court had little hesitation in concluding that the interference could not be justified. In so doing, it emphasised that a rights-based analysis was critical to ensure that sophisticated technology did not undermine the essence of the right afforded by Article 8:

“The protection of personal data is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life, as guaranteed by Article 8 of the Convention. The domestic law must afford appropriate safeguards to prevent any such use of personal data as may be inconsistent with the guarantees of this Article. The need for such safeguards is all the greater where the protection of personal data undergoing automatic processing is concerned, not least when such data are used for police purposes …, and especially where the technology available is continually becoming more sophisticated …  The protection afforded by Article 8 of the Convention would be unacceptably weakened if the use of modern scientific techniques in the criminal-justice system were allowed at any cost and without carefully balancing the potential benefits of the extensive use of such techniques against important private-life interests …” (para 75, emphasis added)

Moreover, the context of the use of this technology related to the applicant’s attempts to peacefully protest his political opinion. The Court accepted it as “beyond dispute that the fight against crime, and in particular against organised crime and terrorism, which is one of the challenges faced by today’s European societies, depends to a great extent on the use of modern scientific techniques of investigation and identification.”

Notwithstanding that “importance in the detection and investigation of crime”, and although the Court sought to emphasise that it was not making any general ruling on whether the use of FRT was justified, it went on to hold that “the use of highly intrusive facial recognition technology to identify and arrest participants of peaceful protest actions could have a chilling effect in regard of the rights to freedom of expression and assembly.” In doing so, it placed particular emphasis on the report of the UN High Commissioner for Human Rights, ‘The Impact of New Technologies on the Promotion and Protection of Human Rights’.

The Court noted that “the use of facial recognition technology to identify the applicant from the photographs and the video published on Telegram – and a fortiori the use of live facial recognition technology to locate and arrest him while he was travelling on the Moscow underground – did not correspond to a “pressing social need””. It therefore concluded that “the use of highly intrusive facial recognition technology in the context of the applicant exercising his Convention right to freedom of expression is incompatible with the ideals and values of a democratic society governed by the rule of law, which the Convention was designed to maintain and promote.”

What are the implications of Glukhin v Russia in the UK?

FRT is being used in the UK by police forces: see, for example, R (Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058.

There are relevant lessons for law enforcement agencies in the UK following on from Glukhin v Russia.

Firstly, the substance of an individual’s Convention rights is not downgraded simply because the technology is too sophisticated or valuable to State agencies. New technologies will need to be analysed through a human rights prism, and their efficiency or utility will not provide any carte blanche for complacency. As the ECtHR memorably highlighted, modern tools cannot be allowed to be used “at any cost”.

It is noteworthy that the Court relied on the growing body of international standards applicable to the use of biometric technologies, such as the Council of Europe Convention 108, as well as data protection standards and those developed by the EU, in particular the GDPR. These are likely to become increasingly relevant as national jurisdictions consider the impact of technology on their public spaces and public life; this judgment positions those international standards as an important lens through which to consider the use of technology by State agencies in the UK.

Secondly, as new, and increasingly invasive, technologies are developed, they will need detailed frameworks and safeguards to be applied in order for them to be compatible with Article 8. The Court explicitly set out the need to have “detailed rules governing the scope and application of measures as well as strong safeguards against the risk of abuse and arbitrariness.” It underlined that “the need for safeguards will be all the greater where the use of live facial recognition technology is concerned.”   

This point will resonate with law enforcement agencies in the UK since it was also a theme of Bridges. In that case, the Court of Appeal concluded that the South Wales Police had breached Article 8 when using FRT because of the lack of policies setting out the terms on which the technology could be used i.e. who could use the technology and when (paras 91 to 96).

There are many other rights which FRT may engage. In particular, equality and non-discrimination issues are now well-documented, as are accuracy issues which engage other fundamental rights. But that’s a blog for another day…

Cloisters Technology team have been at the forefront of important work examining the interplay between artificial intelligence, technology, equality, human rights, employment law and data protection. Please contact us for further information.
