Facial Recognition and Data Protection: What you need to know

The prevalence of facial recognition technology (FRT) is rapidly increasing. There are estimated to be over one billion surveillance cameras in operation globally, many of which now have FRT capability. But what exactly is FRT, and what are the data protection implications of using it?

If you use, or are planning to use, technology like this in your organisation for any purpose, it is crucial that you understand how to implement it ethically, safely, responsibly, and legally. Failing to do so means the privacy, autonomy and dignity of those whose data you process could be put at risk, and your organisation could be on the receiving end of regulatory or other legal action.

What is facial recognition technology?

In an opinion piece on the use of FRT in public places, the Information Commissioner’s Office (ICO) defined FRT as the process by which a person can be identified or otherwise recognised from a digital facial image. Cameras capture these images, and FRT software produces a biometric template from each one. The system then typically estimates the degree of similarity between two facial templates to identify a match, or to place a template in a particular category.
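As a rough illustration of that comparison step, the sketch below compares hypothetical face templates (represented as plain vectors of numbers) using cosine similarity, a common way of scoring how alike two templates are. The vectors, dimensions and scores here are invented for illustration and do not come from any real FRT product:

```python
import math
import random

def cosine_similarity(a, b):
    """Score how alike two face templates (vectors) are: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

random.seed(0)
# Hypothetical 128-dimensional templates, as an FRT pipeline might produce.
probe = [random.gauss(0, 1) for _ in range(128)]
enrolled = [x + random.gauss(0, 0.1) for x in probe]  # noisy capture of the same face
stranger = [random.gauss(0, 1) for _ in range(128)]   # an unrelated face

print(cosine_similarity(probe, enrolled))  # close to 1: likely the same person
print(cosine_similarity(probe, stranger))  # close to 0: likely different people
```

The system never sees a "yes/no" answer directly; it sees a score like these, and a threshold (discussed below) turns that score into a decision.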

Biometric recognition and special category data

The ICO has also recently published guidance on the use of biometric data and biometric recognition. It explains that biometric recognition, as defined by the International Organization for Standardization (ISO) in ISO/IEC 2382-37:2022(E), refers to the automated recognition of people based on their biological or behavioural characteristics. This aligns closely with the definition of special category biometric data in the UK GDPR. Biometric recognition is not limited to faces: it could equally be applied to a recording of someone talking, or a video of them walking.

By its nature, FRT relies on processing people’s biometric data, which falls under the Article 9(1) category of special category data: “biometric data for the purpose of uniquely identifying a natural person”. This means extra protections are afforded to the data, and organisations that wish to use FRT must ensure they meet their compliance obligations. The importance of understanding these obligations was highlighted in a recent ICO investigation, which found that Serco had unlawfully processed the biometric data of more than 2,000 of its employees after failing to evidence why it was necessary or proportionate to use FRT for employee attendance checks.

Important considerations regarding the use of FRT

What does the law say?

As mentioned above, biometric data is considered ‘special category data’ and as such, special protections are afforded to its processing. Article 9 of the UK GDPR prohibits the processing of special category data unless one or more of the following 10 exceptions, usually referred to as ‘conditions for processing special category data’, apply:

(a) Explicit consent

(b) Employment, social security and social protection (if authorised by law)

(c) Vital interests

(d) Not-for-profit bodies

(e) Made public by the data subject

(f) Legal claims or judicial acts

(g) Reasons of substantial public interest (with a basis in law)

(h) Health or social care (with a basis in law)

(i) Public health (with a basis in law)

(j) Archiving, research and statistics (with a basis in law)


Remember that an Article 9 condition is needed in addition to an Article 6 lawful basis. Once you are clear on both for your processing of biometric data, it is important to ensure all the other legal compliance requirements, including data subject rights, are carefully considered. Find out more about your data protection obligations here.

Steps to take BEFORE processing FRT data

  • Get your DPO involved – from the beginning.
  • Conduct a thorough Data Protection Impact Assessment (DPIA).
  • Understand and document your data protection compliance.
  • Be transparent and accountable in all your processing.
  • Understand and mitigate the weaknesses and vulnerabilities of FRT. Like any technology, it won’t get everything right. The greater the potential impact of an error, the greater the effort you must put into preventing harm. And be aware of unintended consequences.


Additional considerations

Before implementing any FRT, it is important to understand what ‘thresholds’ are being used. A threshold is the similarity score at or above which the system treats two templates as a match. The lower the threshold, the greater the chance that a declared match does not in fact relate to the same person; the higher the threshold, the greater the chance that a genuine match is missed. Both kinds of error create risks: a false match can give someone unauthorised access to sensitive information, while a false rejection can deny people access to services or opportunities. For this reason, if you do use FRT, you should monitor the system to assess its performance and how accurate its matches are.
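The effect of the threshold choice can be sketched with a few invented similarity scores (the numbers below are illustrative only, not taken from any real system):

```python
def is_match(similarity, threshold):
    """Declare a match only when the similarity score meets the threshold."""
    return similarity >= threshold

# Hypothetical scores: a genuine pair and a look-alike impostor pair.
same_person = 0.93
different_people = 0.72

strict_threshold = 0.90
lax_threshold = 0.60

print(is_match(same_person, strict_threshold))       # True: genuine pair accepted
print(is_match(different_people, strict_threshold))  # False: impostor correctly rejected
print(is_match(different_people, lax_threshold))     # True: lower threshold produces a false match
```

A higher threshold would instead risk rejecting the genuine pair; this trade-off between false matches and false rejections is exactly what the ongoing monitoring described above should track.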

It is also crucial to understand whether there is any bias in the FRT, and you must make sure your system does not result in discrimination. FRT use that results in discrimination not only has data protection implications but could also breach the right to non-discrimination under Article 14 of the European Convention on Human Rights, given effect in the UK by the Human Rights Act 1998.

Are you thinking about using this technology in your organisation? 

If you are considering this technology, a thorough data protection audit can give you peace of mind. You can use your own DPO, or consider the services we offer in this area.

Don’t simply assume your organisation is compliant, especially when considering FRT. Click here to read more about our audit services.

If you would like support from the DPAS team, give us a call on 0203 3013384, send us an email at info@dataprivacyadvisory.com, or fill in a contact form and we’ll get in touch with you.
