Information Rights Tribunal Overturns Penalty Notice Issued by ICO

Introduction

The Information Rights Tribunal has overturned a £7.5 million penalty notice issued by the UK Information Commissioner against Clearview (CV) – a firm based in Delaware, USA – in a ruling which will come as an embarrassment to the lawyers at the ICO.

In simple terms, the ICO did not have the power to issue a penalty notice against Clearview because the alleged and inferred processing was not in scope for the GDPR: it did not meet the post-Brexit UK GDPR definition of territorial scope.

This case, while complex, is a goldmine for scholars over and above the complex material scope issue, which is bound up with the interplay between the Brexit Implementation Period (IP) completion day, the UK GDPR and the EU GDPR.

This adds some interesting dimensions to legal scholarship, in particular:

  • An attempt by the Tribunal to define what constitutes behavioural monitoring.
  • The different drafting of the UK GDPR that allowed CV to be declared out of scope.
  • The loss of a case on a legal technicality that an ICO lawyer or external counsel should have identified as a risk of defeat.

The decision is not binding, as the judgement turned on a point of law concerning material scope and Brexit implementation.

Who are Clearview AI?

Clearview AI is a contentious technology firm specialising in facial recognition. Key aspects of the company:

  1. Facial Recognition: Clearview AI has developed real-time facial recognition technology using billions of scraped images from websites and social media.
  2. Law Enforcement Focus: The company primarily markets its technology to law enforcement agencies for identifying individuals through its extensive image database.
  3. Privacy Concerns: Clearview AI’s practices have raised significant privacy concerns due to its collection of images without individuals’ consent, leading to one of the world’s largest and most controversial facial recognition databases.
  4. Legal and Ethical Controversies: The company faces legal challenges and backlash from privacy advocates, regulators, and tech firms over its data collection and use of facial recognition technology.

Clearview AI has attracted attention from authorities in various countries, sparking investigations and discussions regarding the regulation of facial recognition technology and protection of individual privacy.

What was the case?

In May 2022, the ICO confirmed a fine, stating that Clearview AI Inc had collected more than 20 billion images of people’s faces, together with associated data, from publicly available information and created an online database. This information was scraped from across the internet, for example from social media platforms, without the subjects’ knowledge; they were completely unaware that their data was being collected and used in this manner.

The company provides a service that allows customers (including the police) to upload an image of a person to the company’s app, which is then checked for a match against all the images in the database.

This followed an initial intention, announced in November 2021, to issue a penalty of £17 million, after the ICO undertook a joint investigation with the Australian Information Commissioner.

The ICO, in its own press release, stated:

“Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge”.

The UK Information Commissioner, John Edwards, a New Zealand-trained lawyer, said at the time:

“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice”.

Alas for John Edwards and his team, the First-tier Tribunal – led by Judge Griffin – disagreed with the ICO’s legal eagles, ruling:

  • At the time of the processing, and when the decision was made to issue the notice to fine Clearview, no relevant processing of personal data of people in the UK was being undertaken, and the processing was therefore outside the scope of the UK and EU GDPR. This was linked to the UK GDPR being materially different from the EU GDPR after Brexit withdrawal day.
  • Thus the ICO was acting ultra vires: it did not have the power to sanction CV, as the processing was not in scope of the GDPR when the notice was issued.

In legal terms, the Tribunal summed this up in s152:

“As we have pointed out above (in paragraph 97) the UK GDPR is constructed differently and it is Article 3(2A) that removes processing in the course of an activity which fell outside the scope of Union law before IP completion day (Brexit implementation day) from the scope of the Regulation by excluding such processing from the definition of relevant processing in Article 3 UK GDPR.”

Scholars are advised to read s145 onwards for this excellent breakdown of the territorial scope determination by the Tribunal.

What is a First-tier Tribunal?

In the UK, the First-tier Tribunal is a court of first instance. Scholars would recognise this as equivalent to a County Court in a civil case or a Magistrates’ Court in criminal cases. The decisions of tribunals are not binding on others but act as a persuasive influence.

A tribunal is chaired by a judge or a trained barrister/solicitor, supported by two lay members experienced in information rights law.

The Data Protection Act 2018 allows – under sections 162(1)(c) and (d) respectively – an appeal to the First-tier Tribunal where the ICO has issued an enforcement notice (EN) or a monetary penalty notice (MPN). Where an appeal concerns an MPN, the appeal can be against the notice itself or the level of the penalty.

At an appeal, an initial evidential burden is imposed upon the decision maker, who is required to prove that the infringement took place. In this case, therefore, the ICO had to prove on the balance of probabilities that it had the power to issue the notices.

Judgements of the FTT can only be appealed to the Upper Tribunal, which hears such appeals and has the status of a “High Court”, on the basis that the Tribunal made an error of law. Considering the £7.5 million of taxpayers’ cash at risk, the ICO’s legal reputation given the joint investigation, the Commissioner’s bullish words, and the costs to date (including the legal team at the Tribunal), an appeal is likely.

Key takeaways from the case

The Tribunal looked at five key areas – a great example of how scholars and data protection law practitioners should break down such an issue.

  • There must be processing of personal data. 
  • The personal data must be that of data subjects in the UK. 
  • The processing must be carried out by a controller or processor not established in the UK. 
  • The processing must be “related to” the monitoring of the behaviour of data subjects in the UK as far as their behaviour takes place within the UK. 

The Tribunal and both parties were in broad agreement that all of these boxes were ticked.

The fifth and final element was the analysis of Article 3 of the UK GDPR:

(2) This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to:

(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom;

A detailed discussion on these points can be found from s70 onwards.

The Tribunal undertook a seminal, though not binding but persuasive, determination when making a judgement on the behavioural monitoring of data subjects. Section 115 onwards is essential reading, as it attempts to grapple with this thorny issue, coming up with what was, in my opinion, a reasonable set of criteria for determining the legal term "behavioural".

The Tribunal’s view was that a person’s behaviour in a picture would include:

  • Where they are; 
  • What they are doing – including what they are saying/have said or what they have written as well as their employment or playing of a sport or their pastimes; 
  • Who they associate with in terms of relationships; 
  • What they are holding or carrying; 
  • What they are wearing – including any items indicating cultural or religious background or belief. 

The Tribunal in s199 indicated that the evidence presented supported the view that "behavioural" also included inferences drawn from a picture showing:

  • relationship status; 
  • parental status; 
  • associates; 
  • location or residence; 
  • use of social media; 
  • habits e.g. whether they smoke/drink alcohol; 
  • occupation or pastime(s); 
  • ability to drive a car; 
  • activity and whether that is legal; and 
  • whether the person has been arrested. 

And in s121, the Tribunal went so far as to define monitoring, which it said included:

  • Establishing where a person is/was at a particular point in time; 
  • Watching an individual data subject over time by repeated submission of the same Probe Image of a known person; 
  • Using the matched images produced in response to a single search of a Probe Image to provide a narrative about the person in the images at the different times shown in those search results; 
  • Combining these results with information obtained from other forms of monitoring or surveillance. 

We have a court’s first attempt to establish the intention of lawmakers in determining whether behavioural monitoring has taken place. The Tribunal linked this to the term "tracking", providing practitioners with a clear steer.

Concluding thoughts

  • A tricky case of legal interpretation for the Tribunal, arising from the drafting of the UK GDPR post-Brexit.
  • An appeal is likely, as this case – and its interpretation – is unhelpful in gaining a clear understanding of the legal position on the material scope of the UK GDPR.
  • The credibility of the ICO is somewhat tarnished. The case saw considerable financial resources invested in the investigation and subsequent enforcement, plus the loss of £7.5 million of taxpayers’ cash (this is a UK GDPR issue, so the assumption is that the UK government would have kept all the revenue).
  • A great piece of legal scholarship for Information Rights law scholars and practitioners.

The full determination can be found here.

By Nigel Gooding LLM, FBCS
