As the use of biometric recognition technologies becomes more common, concerns about their privacy and safety implications are more pressing than ever.
While many organisations gravitate towards biometric recognition to streamline their operations, doing so risks falling foul of the UK GDPR, particularly if a robust assessment of the proposed processing is not conducted. Recent guidance and decisions published by the UK Information Commissioner’s Office (ICO) set out the considerations which organisations must bear in mind when preparing to use biometric recognition systems.
Legal restrictions on using biometric data
Organisations processing personal data must always have a lawful basis to do so under Article 6 of the UK GDPR (e.g. contract, legitimate interests, compliance with legal obligation). However, where biometric data is used for the purpose of uniquely identifying someone, one of the following additional conditions (under Article 9 of the UK GDPR) must also be met:
- The data subject has provided explicit, free, and informed consent;
- The data subject has made the personal data manifestly public; or
- The processing is necessary for:
- Employment, social security and social protection (where authorised by law);
- The legitimate interests of not-for-profit bodies;
- Legal claims or judicial acts;
- Protecting vital interests;
- Reasons of public interest in public health (with a basis in law);
- Reasons of substantial public interest (with a basis in law);
- Archiving, research and statistics (with a basis in law); or
- The purposes of health or social care (with a basis in law).
In many cases, the only available Article 9 condition will be explicit consent. This can present some difficulty because, if consent is withdrawn, the processing must immediately cease. To rely on consent, organisations must offer a genuine alternative to the processing which does not disadvantage any data subject who refuses or withdraws consent.
Safeguarding the security of biometric data
All personal data must be processed in a manner which ensures appropriate security and protection. This is known as the “integrity and confidentiality” principle under the UK GDPR. For biometric data, this will necessarily require additional layers of protection. ICO guidance sets out the Commissioner’s expectations and recommendations in terms of how to achieve this. For example, recognition systems that use privacy-enhancing technologies (PETs) are recommended. Organisations looking to secure biometric recognition services from an outsourced provider should explore whether the provider can offer that level of security. Even where technical protections like PETs are in place, organisations must still regularly review and test their security measures.
Risk of bias and discrimination
As with all data-driven, automated processes, biometric recognition technologies bring with them a risk of bias. When inherent bias remains unmitigated, it can lead to discrimination. For example:
- Recognition systems which are inaccessible (or less accessible) to those with a physical or other disability;
- Fingerprint recognition which is less accurate for those over 70 or under 12; and
- A higher risk of false positives or false negatives relating to a specific group.
Organisations should test all biometric recognition systems for bias and mitigate against that bias to prevent potential discrimination.
Lessons learned from recent ICO decisions
Enforcing acceptable standards regarding the use of biometric recognition technology is one of the ICO’s key priorities at present, as demonstrated by two recent decisions:
Chelmer Valley High School began using facial recognition technology in March 2023 for the purpose of cashless catering. No Data Protection Impact Assessment (DPIA) had been conducted or submitted prior to commencing use of the technology. In November 2023, the school’s Data Protection Officer (DPO) submitted a DPIA to the ICO for review.
The school had been relying on assumed consent as the lawful basis for using facial recognition, except where children had been opted out of the process by their parents. This was considered unacceptable for two reasons. Firstly, consent on an opt-out basis is not lawful pursuant to Article 4(11) of the UK GDPR, which requires a clear, affirmative action. Secondly, as the students were aged between 11 and 18 years old, most were capable of providing their own consent. Using a parental opt-out deprived some students of the ability to exercise their own rights and freedoms.
The ICO found that the school had not sought advice from its DPO, nor consulted parents or students, prior to the use of the technology. The ICO noted that many of the compliance issues would have been identified beforehand, had the DPO been properly consulted. The ICO also found that the absence of a DPIA prior to the processing demonstrated that the school had failed to properly assess the risks posed to data subjects and how to manage consent lawfully.
In light of the above, the ICO issued a formal reprimand to the school. The ICO also made a number of recommendations, including:
- improved use of DPIAs;
- providing better privacy information to students; and
- reviewing the ICO’s guidance and case study.
Serco Leisure and its community leisure trusts had used facial recognition technology and fingerprint scanning to monitor the attendance of over 2,000 employees at 38 leisure facilities. Serco introduced the technology because it considered the previous systems (manual sign-in sheets and radio-frequency identification cards) to be open to abuse by employees and prone to human error.
The ICO found that Serco Leisure and the trusts had been unlawfully processing biometric data, as they had failed to demonstrate why it was necessary or proportionate to use facial recognition technology or fingerprint scanning when less intrusive means were available. Serco had not provided any evidence of employees misusing the pre-existing methods for monitoring attendance. It had also failed to proactively offer employees an alternative method of clocking in and out of work. Serco unsuccessfully argued that the processing was necessary for exercising rights conferred by law in connection with employment (one of the special conditions under Article 9 of the UK GDPR). However, the ICO held that this condition does not extend to processing data in order to meet purely contractual employment obligations or rights.
The ICO issued enforcement notices requiring Serco and the trusts to stop processing biometric data as well as to destroy biometric data that they were not legally obliged to retain.
Practical tips for using biometric recognition
If you are considering implementing biometric recognition technologies, we recommend that you:
- Engage at an early stage with your Data Protection Officer;
- Undertake a DPIA in advance (and again if there are changes to the nature or scope of the processing);
- Consider whether it is necessary and proportionate, or whether there is a less invasive way to achieve the same purpose;
- Ensure the proposed system is free from bias and does not discriminate against individuals;
- Consult with data subjects in advance where possible;
- Offer a genuine alternative to data subjects, which does not place them at a disadvantage;
- If relying on consent, ensure that consent is explicit, informed, and freely given;
- Make arrangements to delete data where consent is withdrawn;
- Ensure that appropriate security arrangements (e.g. PETs) are in place to protect biometric data; and
- Delete biometric data once it is no longer required for the original purpose.
For further information on using biometric recognition or on any other aspect of data privacy or cyber security, please contact our dedicated team of cyber experts.