Privacy Partnership Law
Home
Services
AI Governance
Data Protection
Insights
Find people
Industries
DPO Services
Training
Subject Access Support

Smart Privacy & Safer Artificial Intelligence

Specialist legal advice and compliance solutions for all aspects of Data Protection, AI and Digital Governance

Find a Consultant
Find out more about our new Subject Access support services

Keep up to date with Privacy Partnership Insights and Podcasts

AI in recruitment: ICO highlights poor practices as UK overhauls automated decision-making rules

Are your hiring managers quietly letting an algorithm bin hundreds of job applications while claiming a human is technically in charge? This week on the Privacy Partnership Podcast, Rob unpacks a massive structural shift in the UK’s framework for Automated Decision-Making (ADM).

Listen now

Invisible, indestructible signatures: The AI Act’s text watermarking problem

Can you hide an indestructible, imperceptible signature inside a basic marketing blog post? The European Commission seems to think you should try. Following the release of the Second Draft of the Code of Practice on Transparency of AI-Generated Content this Tuesday, Robert Bateman dives into the technical and regulatory headaches awaiting providers of generative AI text systems.

Listen now

1. “We’re just using a general AI tool like ChatGPT/Gemini — do we really need to comply?”

Quite possibly. If you're using a generative AI system in the EU or for EU-facing users, you may have duties under the AI Act, even if you didn't build or fine-tune the model yourself.

Generative AI applications, like ChatGPT, Claude, Gemini, and Llama, are a type of “General Purpose AI (GPAI)” under the AI Act, and using GPAI systems carries specific obligations in certain contexts.


2. “We didn’t train or fine-tune the model — does that mean we’re not responsible?”

Not quite. If you're deploying the model in the EU — e.g. using it or integrating it into a product, chatbot, analytics tool, or customer-facing service — you may still have obligations as a deployer under the AI Act.


3. “Does it matter how we’re using it?”

Yes — context is everything. You have more obligations if the model is:
- Used in high-risk contexts (e.g. hiring, education, law enforcement).
- Used in a user-facing way (e.g. customer support, synthetic media).
- Producing outputs that people rely on or act on.


4. “Do we need to label AI-generated content?”

Often, yes.

If your users interact with the AI (e.g. a chatbot) or see its output (e.g. AI-generated images, videos, audio, or text), then from 2 August 2026 you may need to:

- Inform users if they’re dealing with an AI chatbot.

- Check that the provider has put a “watermarking” system in place to ensure content can be detected as AI-generated.

- Label the content yourself in certain contexts (e.g. if you’re using generative AI to inform the public about matters of public interest or to produce “deepfakes”, with some exceptions).

5. “What if we just use it internally — like for editing or writing?”

It depends. If no one outside the organisation sees or relies on the AI’s output, transparency duties may not apply — but if it’s used to make important decisions about people (e.g. employee appraisals, job candidates), then risk assessment rules apply. You still need to review documentation from the provider and use the system responsibly.


6. “What documents or controls may we need?”

You should have:

- An “AI literacy” programme to ensure people can use generative AI responsibly
- A copy of the provider’s documentation
- A risk assessment if using it in a sensitive context
- A transparency and labelling policy if people interact with the model or see its output
- A record of compliance showing you reviewed and followed the provider’s usage instructions



Are you a 'data broker'? Maybe under the EDPB's definition.

Are you a data broker? You might not think so, but Data Protection Regulators could soon be looking at your business model and concluding otherwise. Robert Bateman breaks down a revealing market study commissioned by the Belgian Data Protection Authority through the EDPB's Support Pool of Experts. Designed to identify and map the data broker ecosystem, the report provides a fascinating look at how the regulatory definition of data brokerage is expanding.

Catch up with our Senior Partner Rob Bateman on LinkedIn to find out more

Find out more about our Services

Artificial Intelligence

Data Protection and Privacy

DPO Services

Our People

Privacy Partnership Legal Services are provided by Privacy Partnership Law Ltd, a regulated law firm established in England and Wales.

  • Privacy Policy
  • Legal
  • Contact Us
  • Careers
  • Website Terms
  • Terms of Business
  • Nominated Representatives
  • Complaints
  • About us
  • Terms and Conditions

Privacy Partnership Law Ltd is regulated by the Solicitors Regulation Authority with registration number 829686.

Privacy Partnership Law Ltd is a registered company based in England and Wales with registration number 13211514 and a registered office at 7 Eland Rd, London SW11 5JX. VAT number 401788010. It forms part of the Privacy Partnership Group of Companies.


Copyright © 2025 Privacy Partnership Law Ltd. All rights reserved. No part of this website may be copied or reproduced without permission.
