Privacy Partnership Law

Insights

The UK Parliament has passed the Data (Use and Access) Bill, which now awaits Royal Assent before entering into law. While the Bill encountered delays during its final stages—primarily over contentious provisions related to AI model training and copyright—it ultimately secured cross-party backing.


Though the legislation introduces a range of changes, it stops short of the sweeping reform agenda previously pursued by the Conservative government. The foundations of the UK’s current data protection regime remain largely intact, with the Bill seen more as a recalibration than a departure from the GDPR framework.


One notable structural change is the planned creation of a statutory Information Commission, which will replace the individual role of the Information Commissioner. This move aims to bring greater continuity and institutional resilience, ensuring a more consistent regulatory approach over time—particularly when leadership changes.


Nicola McKilligan, Senior Partner at Privacy Partnership, commented:


“This Bill marks a technical but significant shift. It refines the UK’s approach to areas like automated decision-making, legitimate interests, and cookie rules—providing more clarity for organisations without dismantling core privacy rights. The shift to an Information Commission is especially important, as it supports stability and consistent enforcement as regulatory priorities evolve.”


Industry groups have welcomed clearer guidance on the use of legitimate interests and the extension of soft opt-in provisions to charities. However, critics in the EU have warned that the reforms could erode individual privacy rights and urged the European Commission to scrutinise the UK’s adequacy status when it comes up for review later this year.


1. “We’re just using a general AI tool like ChatGPT/Gemini — do we really need to comply?”

Possibly, yes. If you're using a generative AI tool in the EU or for EU-facing users, you may have duties under the AI Act, even if you didn't build or fine-tune the model yourself.

Generative AI applications such as ChatGPT, Claude, Gemini, and Llama are built on "general-purpose AI (GPAI)" models as defined in the AI Act, and using GPAI systems carries specific obligations in certain contexts.


2. “We didn’t train or fine-tune the model — does that mean we’re not responsible?”

Not quite. If you're deploying the model — e.g. using it or integrating it into a product, chatbot, analytics tool, or customer-facing service — you may still have obligations as a deployer under the AI Act.


3. “Does it matter how we’re using it?”

Yes — context is everything. You have more obligations if the model is:
- Used in high-risk contexts (e.g. hiring, education, law enforcement).
- Used in a user-facing way (e.g. customer support, synthetic media).
- Producing outputs that people rely on or act on.
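The factors above can be sketched as a rough triage helper. This is illustrative only: the tier labels and category names are our own shorthand for the bullet points, not terms defined in the AI Act.

```python
# Illustrative triage sketch only. The use-case labels and "tiers" below are
# our own shorthand for the factors listed above, not AI Act categories.

HIGH_RISK_CONTEXTS = {"hiring", "education", "law_enforcement"}

def triage(context: str, user_facing: bool, outputs_relied_on: bool) -> str:
    """Return a rough indication of how much diligence a use case needs."""
    if context in HIGH_RISK_CONTEXTS:
        return "high: risk assessment and deployer obligations likely apply"
    if user_facing or outputs_relied_on:
        return "medium: transparency and labelling duties likely apply"
    return "low: review provider documentation and use responsibly"
```

In practice the assessment is more nuanced than three tiers, but mapping each use case against these factors is a reasonable first pass.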


4. “Do we need to label AI-generated content?”

Often, yes.

If your users interact with the AI (e.g. a chatbot) or see its output (e.g. AI-generated images, video, audio, or text), then from 2 August 2026 you may need to:

- Inform users if they’re dealing with an AI chatbot.

- Check that the provider has put a “watermarking” system in place to ensure content can be detected as AI-generated.

- Label the content yourself in certain contexts (e.g. if you’re using generative AI to inform the public about matters of public interest or to produce “deepfakes”, with some exceptions).
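For a user-facing chatbot, the duties above might look like the following sketch. The function and field names are hypothetical, not taken from the AI Act or any provider SDK:

```python
# Hypothetical sketch of the transparency steps above for a customer-facing
# chatbot. Function and field names are ours, not from the AI Act or any SDK.

def chatbot_greeting() -> str:
    # Inform users up front that they are dealing with an AI system.
    return "Hi! I'm an automated AI assistant. How can I help?"

def tag_output(content: str) -> dict:
    # Attach a human-readable label and a machine-readable marker so the
    # content can be identified as AI-generated (alongside any provider
    # watermarking already embedded in the content itself).
    return {
        "content": content,
        "ai_generated": True,
        "label": "AI-generated content",
    }
```

The disclosure belongs at the start of the interaction, and the marker should travel with the content wherever it is reused.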


5. “What if we just use it internally — like for editing or writing?”

It depends. If no one outside the organisation sees or relies on the AI’s output, transparency duties may not apply — but if it’s used to make important decisions about people (e.g. employee appraisals, job candidates), then risk assessment rules apply. You still need to review documentation from the provider and use the system responsibly.


6. “What documents or controls do we need?”

You should have:

- An “AI literacy” programme to ensure people can use generative AI responsibly
- A copy of the provider’s documentation
- A risk assessment if using it in a sensitive context
- A transparency and labelling policy if people interact with the model or see its output
- A record of compliance showing you reviewed and followed the provider’s usage instructions
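One way to keep the last item, a record of compliance, is a simple structured entry per AI system. The fields below are our own illustrative invention, not a format prescribed by the AI Act:

```python
# Illustrative sketch of a per-system compliance record; every field name
# here is our own invention, not a prescribed format.
from dataclasses import dataclass
from datetime import date

@dataclass
class AIUseRecord:
    system_name: str                # which generative AI system is in use
    provider_docs_reviewed: bool    # did we read the provider's documentation?
    review_date: date               # when the review took place
    risk_assessment_done: bool      # needed if used in a sensitive context
    labelling_policy_applied: bool  # needed if people see or interact with output
    notes: str = ""

record = AIUseRecord(
    system_name="Example GPAI chatbot",
    provider_docs_reviewed=True,
    review_date=date(2025, 6, 1),
    risk_assessment_done=True,
    labelling_policy_applied=True,
)
```

Keeping one such entry per system, refreshed whenever the provider's usage instructions change, makes it straightforward to evidence compliance later.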



Don't miss out! Subscribe to our newsletters on LinkedIn.

Monthly News

May 2025

Download PDF

April 2025

Download PDF

March 2025

Download PDF

February 2025

Download PDF

Livestream

Follow us to watch our regular livestreams

Livestream coming soon


Privacy Partnership Law Ltd is regulated by the Solicitors Regulation Authority (registration number 829686).

Privacy Partnership Law Ltd is a registered company in England and Wales (company number 13211514) with a registered office at 7 Eland Rd, London SW11 5JX. VAT number 401788010. It forms part of the Privacy Partnership Group of Companies.


Copyright © 2025 Privacy Partnership Law Ltd. All rights reserved. No part of this website may be copied or reproduced without permission.
