Facial Recognition Technology and Data Protection Law

Julian Hayes and Andrew Watson look at the legal framework around Facial Recognition Technology as set out in the Data Protection Act 2018.

Since the General Data Protection Regulation (‘GDPR’) and the related Data Protection Act 2018 (‘DPA 2018’) came into force, the world of technology has unsurprisingly continued its onward march. Facial recognition technology (‘FRT’) has, of course, been around for quite some time but the imposition of new data protection laws has meant that companies which provide FRT services must take extra care when it comes to the processing of such data.

FRT is ubiquitous. It is integral to law enforcement’s ability to combat criminality, it is a common feature on many smartphones for authentication purposes, and, together with other data gathered in the course of social media use, it presents a valuable opportunity for targeted advertising. It is also the subject of common misunderstanding. Take, for example, the Apple iPhone X’s use of FRT as a form of biometric authentication. It is often assumed that, because an Apple product is taking the image, that image will automatically find its way into the Cloud. That is true of everyday photographs, provided you are connected to the Cloud and consent to their being stored there, but the data generated by the FRT is stored on the phone itself (within the ‘Secure Enclave’ – a separate, inaccessible hardware subsystem of the device) and never leaves it.

To dispel consumers’ concerns that companies can collect personal data in nefarious, or at least unapproved, ways and then misuse it, we now have an effective data protection regime (in Europe at least) capable of grappling with this understandably contentious issue.

The images produced by FRT are, for the most part, personal data for the purposes of the governing law; that is, data capable of identifying an individual. In general terms, once data is identified as personal data, the GDPR and the DPA 2018 provide the framework for how it can be processed lawfully. In ordinary circumstances, for example the processing by a bank of your credit card details, the bank may process the data only if it satisfies at least one of the lawful bases in Article 6 GDPR. These include, inter alia, the consent of the data subject (the natural person to whom the data relates), compliance with a legal obligation (for example, a Production Order), performance of a contract, and processing necessary for a task carried out in the public interest (arguably a somewhat nebulous ground for lawful processing).

Under the GDPR, however, there are further obligations on individuals and companies when it comes to processing what the GDPR describes as ‘special categories of personal data’. In short, special category data is data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, together with genetic data, biometric (including dactyloscopic) data processed for the purpose of uniquely identifying a natural person, and data concerning health, sex life or sexual orientation. The data produced by FRT therefore falls squarely into this category. Under the GDPR, the processing of such data is prohibited outright unless one of the exemptions under Article 9(2) GDPR applies. The list is lengthy and includes a number of exemptions, some rather vague, for example where the ‘processing relates to personal data which are manifestly made public by the data subject’. Helpfully (or not so helpfully, owing to its Delphic drafting), the DPA 2018, and Schedule 1 in particular, sets out how the provisions of Article 9 GDPR are to be interpreted in the UK. Despite the manifold exemptions under Article 9, the one on which an individual or company is most likely to rely is ‘explicit consent’ under Article 9(2)(a).
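The two-stage gate described above – at least one Article 6 lawful basis for any personal data, plus an Article 9(2) exemption where the data is special category – can be sketched as a simple compliance check. This is an illustrative model only: the enum members and function name are hypothetical shorthand, not terms drawn from the legislation, and the lists are deliberately non-exhaustive.

```python
from enum import Enum, auto

class Article6Basis(Enum):
    """A few of the lawful bases for processing under Article 6 GDPR."""
    CONSENT = auto()
    LEGAL_OBLIGATION = auto()
    CONTRACT = auto()
    PUBLIC_INTEREST = auto()

class Article9Exemption(Enum):
    """A few of the exemptions under Article 9(2) GDPR."""
    EXPLICIT_CONSENT = auto()
    MANIFESTLY_MADE_PUBLIC = auto()

def processing_is_lawful(special_category: bool,
                         article6_bases: set,
                         article9_exemptions: set) -> bool:
    """Processing requires at least one Article 6 basis; special category
    data (such as FRT biometric data) additionally requires an Article 9(2)
    exemption, otherwise processing is prohibited outright."""
    if not article6_bases:
        return False
    if special_category and not article9_exemptions:
        return False
    return True

# A bank processing credit card details: ordinary personal data,
# so performance of a contract alone suffices.
assert processing_is_lawful(False, {Article6Basis.CONTRACT}, set())

# FRT biometric data with an Article 6 basis but no Article 9
# exemption: processing remains prohibited.
assert not processing_is_lawful(True, {Article6Basis.CONSENT}, set())
```

The point the sketch makes is that, for special category data, an Article 6 basis alone is never enough: both gates must be passed.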

The requirement of ‘explicit consent’ under Article 9, rather than mere consent under Article 6, reinforces the importance the GDPR attaches to special category data. Although the GDPR does not define explicit consent, it does define consent at Article 4(11): in essence, any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her. Article 7 adds that, where processing is based on consent, the controller must be able to demonstrate that the data subject has consented to the processing of his or her personal data. What is clearly envisaged here is documentation of the consent process. The corollary is that explicit consent means taking even greater steps to ensure that the data subject is entirely aware of what they are consenting to.
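Article 7’s demonstrability requirement implies keeping an auditable record of each consent. A minimal sketch of such a record might look as follows; the field names are hypothetical, chosen only to mirror the ‘freely given, specific, informed and unambiguous’ elements of Article 4(11) and the fact that consent can later be withdrawn.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """An auditable record evidencing a data subject's consent, of the
    kind a controller could use to 'demonstrate' consent under Article 7."""
    data_subject_id: str
    purpose: str                 # the specific processing consented to
    statement_shown: str         # the wording the subject saw (informed)
    affirmative_action: str      # what the subject did (unambiguous)
    explicit: bool               # True where Article 9(2)(a) is relied on
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None  # consent may be withdrawn

    def is_active(self) -> bool:
        """Consent supports processing only while it has not been withdrawn."""
        return self.withdrawn_at is None

# Hypothetical example: explicit consent to FRT-based authentication.
record = ConsentRecord(
    data_subject_id="subject-001",
    purpose="biometric authentication via FRT",
    statement_shown="We will process a mathematical model of your face...",
    affirmative_action="tapped 'Enable Face Unlock'",
    explicit=True,
)
assert record.is_active()
```

The design choice worth noting is that the record captures not just the fact of consent but the statement shown and the action taken, since it is that evidence, rather than a bare flag, that demonstrates consent was informed and unambiguous.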

The law in this area is at times vague, and compliance with it is often complicated. Because FRT data is, by its very nature, among the most invasive and revealing special category data, the penalties for its misuse (intended or accidental) can be severe. It is therefore vital for companies to have appropriate policies and training in place to guard against personal data breaches (especially when dealing with special category data), and to seek expert advice at an early stage in order to minimise risk or, in any event, mitigate the fallout of any such misuse.


Julian Hayes is a Partner specialising in all aspects of corporate crime and regulatory work. As well as dealing with high profile fraud and corruption matters, including investigations with an international dimension, he has considerable experience of advising corporates on data protection and cybercrime issues.

Andrew Watson is a legal assistant and has been involved in a number of matters concerning HMRC, Trading Standards and the SFO and has a particular interest in relation to cash seizure and forfeiture under the relevant provisions of POCA 2002 and the Criminal Finances Act 2017. Recent data protection work has included advising on the obligations placed on a data controller by the DPA 2018/GDPR when considering whether to comply with a non-mandatory ‘Request for Information’.