Welcome to the Spring edition of the Freeths Data Protection Update.

In this edition, we look at the ICO’s guidance for developers and users of AI, TikTok’s £12.7 million fine for its misuse of children’s personal data, and the ICO’s prioritisation of FOIA and EIR public interest complaints.

“I, GDPR-compliant Robot”: UK Information Commissioner’s Office publishes eight questions everyone should ask themselves about AI and data protection

OVERVIEW:

The emergence of ChatGPT late last year has sparked widespread commentary and debate about the opportunities and challenges that Artificial Intelligence (“AI”) presents.

Businesses across a wide range of sectors are seeking to embrace the possibilities of this exciting technology. However, there have also been calls to temper this revolution with legislation, or even to pause the development of AI altogether.

This debate extends to the world of data protection compliance. For its part, the UK Information Commissioner’s Office (the “ICO”) has taken steps to ensure that organisations using AI do so in a responsible manner that does not conflict with privacy rights and obligations.

In March 2023, the ICO issued its guidance on AI and Data Protection. In doing so, it provided organisations with a “toolkit” to support their compliance with their data protection by design and default obligations when undertaking new projects involving the use of AI.

The ICO has subsequently published a blog setting out eight data compliance questions that developers and users of AI should ask themselves when developing or deploying the technology. In this article, we take a look at those questions and the issues around them.

KEY TAKEAWAYS:

We summarise the eight questions as follows:

  • What is your lawful basis for processing personal data? A developer or user will need to identify a lawful basis for processing personal data with AI. This might be (for example) consent or legitimate interests. The organisation will also need to satisfy an additional condition where it processes special category data.
  • Are you a controller, joint controller or a processor? You will need to identify which of these you are, to understand your compliance obligations. Identifying your role might be difficult. The ICO anticipates that a developer will be a controller, but that a user of AI might be a controller, joint controller or a processor.
  • Have you prepared a Data Protection Impact Assessment (“DPIA”)? You will need to assess and mitigate any impacts on privacy rights by conducting a DPIA. You should do this before you begin processing using AI. This will help you comply with your data protection by design and default obligations, in addition to others under UK/EU GDPR.
  • How will you ensure transparency? An organisation will need to make privacy notices covering the AI processing available to individuals, unless an exemption applies.
  • How will you mitigate security risks? You should consider a range of security risks, from common risks such as leakage of information to “data poisoning”.
  • How will you limit unnecessary processing? An organisation should limit its processing of personal data using AI to that which is necessary for its purposes.
  • How will you comply with individual rights requests? Individuals have various rights under UK/EU GDPR, from accessing their data to requesting its correction or erasure. How will your AI processing deal with these requests?
  • Will you use generative AI to make solely automated decisions? If so, and those decisions have legal or similarly significant effects on the relevant individuals, the affected individuals will have additional rights under UK/EU GDPR.

OUR VIEW:

The ICO has taken these steps to support organisations with using AI solutions in a compliant manner. However, it warns that it will act where organisations fail to follow data protection law or to consider the impact that their AI processing has on individuals.

Before a business develops or uses AI, it should ask itself these eight questions. It may also leverage the other helpful ICO guidance and resources on this topic.

We recommend that organisations take data protection compliance into account from the outset when embarking on processing using AI, taking specialist legal advice where appropriate. In doing so, they will be best placed to exploit the exciting benefits of this technology, whilst mitigating the compliance challenges it might pose. 

TikTok fined by the ICO for failure to protect children’s data

OVERVIEW:

The Information Commissioner’s Office (ICO) has issued a £12.7 million fine to the short-form video-sharing platform TikTok for breaches of UK data protection law concerning the personal data of 1.4 million UK children.

THE ICO’S FINDINGS:

In 2022, the ICO issued a notice of intent to TikTok, relating to breaches of UK data protection law (in particular, the UK GDPR) that took place between 2018 and 2020 and arose out of various non-compliant activities, including:

  • failing to obtain parent/carer consent for the processing of personal data relating to children under the age of 13;
  • failing to provide information to data subjects (specifically children) who used the platform about how their data was processed, so that they could make “informed choices” about whether and how to engage with the platform; and
  • failing to process the personal data of UK users of the platform lawfully, fairly and in a transparent manner.

The initial notice of intent set out a proposed sanction of £27 million, but this was later significantly reduced following representations by TikTok and the ICO’s decision not to pursue its provisional finding that TikTok had also unlawfully processed special category data.

This is not the only investigation TikTok has faced, with the platform coming under scrutiny in a number of jurisdictions, including Ireland, Portugal, the US and South Korea, for data privacy-related incidents.

NEXT STEPS:

The ICO has also published its Age Appropriate Design Code (the “Children’s Code”), which seeks to protect children within the digital world.

The Code comprises fifteen standards, covering key issues including data sharing, transparency, profiling, parental controls, and the best interests of the child. The Code is intended to operate in tandem with parent/carer guidance when children use and access digital services.

KEY TAKEAWAYS:

The fine, which represents one of the largest that the ICO has issued to a company, underlines:

  • the seriousness with which the ICO treats the misuse of personal data; and
  • the importance of implementing and maintaining policies, processes, and safeguards, particularly where children are involved, to meet safety and security requirements in an ever-developing digital landscape.

Indeed, the coming into effect of the Children’s Code also highlights the ICO’s increasing expectations of companies’ compliance with data protection standards, with a particular emphasis on children’s personal data.

If you have any queries, or would like further information on the appropriate policies, procedures, and safeguards to put in place where children access your service, please get in touch with our Data Protection team.

FOIA and EIR public interest complaints to be prioritised by the ICO

OVERVIEW:

Following its consultation on prioritisation in November 2022, the ICO is to prioritise complaints made against public authorities under the Freedom of Information Act 2000 (FOIA) and the Environmental Information Regulations 2004 (EIRs) where there is significant public interest in the information requested.

The decision was announced at the end of March 2023. The ICO has now developed a “prioritisation framework” that will allow it to investigate public interest complaints at pace; it estimates that such cases could account for up to 10-15% of the complaints it receives.

In an effort to improve FOI services, public authorities will be required to respond within a shorter timeframe to priority complaints, namely those where there is significant public interest in the information requested and the criteria below are met.

KEY TAKEAWAYS:

Here are some key points to note:

  • The prioritisation criteria include:
    • The level of interest in the information and whether it raises a novel or high-profile issue, for example one attracting significant media interest or involving a large amount of public money;
    • Whether the requester supports vulnerable groups or is raising awareness of significant public interest issues such as information rights;
    • If the information affects vulnerable individuals;
    • Whether prioritising the case would have significant operational benefits or support other public authorities.
  • Public authorities will have shorter time periods to respond and will not be consulted on whether the criteria apply to the particular case.
  • ICO caseworkers will ask complainants whether the criteria above apply to their case, and the decision on prioritisation will be final.


OUR VIEW:

The ICO has now put the new prioritisation framework in place, and public authorities will be required to comply with decision notices from the ICO within shorter timescales in public interest cases. This increases the administrative burden of providing a speedier response, which public authorities may or may not be able to achieve due to various factors. The impact of these ambitious timescales may take time to materialise fully once the system is in place.

Public authorities should familiarise themselves with the criteria for such cases and ensure they are in a position to respond within the reduced timescales where relevant. They should also refer any queries to the ICO while the new framework is put in place, to better understand their obligations under FOIA and the EIRs and to enable them to comply quickly with any requests where required.

The ICO is also considering lowering the thresholds for refusing to investigate complaints under section 50(2) of FOIA.

It considers that "undue delay" in making a complaint should be six weeks after the public authority's substantive response, rather than the current three months (section 50(2)(b)).

A "frivolous" complaint will become one where there is such a low public interest in the underlying information that an ICO investigation would require a disproportionate use of resources.

The ICO also proposes not to investigate complaints against a public authority's decision that a request is vexatious under section 14 where the public authority has clearly followed the ICO's guidance on vexatious requests; the complaint itself will be deemed vexatious in accordance with section 50(2)(c).

Issues may arise from these proposals, both where the ICO will not investigate complaints about public authorities’ frivolous or vexatious decisions, and in the weight given to complainants in deciding whether the prioritisation criteria apply to their particular case, without recourse for public authorities.


If you have any queries on the topics discussed, please get in touch with our Data Protection team.

The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.
