The new UK government remains committed to introducing data-related laws, despite the DPDI Bill not passing in the parliamentary wash-up session.
The King’s Speech outlined several new data-related legislative plans for the next parliamentary session.
Although a new AI bill was not introduced, the government plans to develop AI regulations, focusing on overseeing the development of advanced AI models.
Businesses will need to await further details to understand the impact of these prospective laws and closely monitor any legislative changes which impact their compliance obligations.
On May 21, 2024, the Information Commissioner’s Office (ICO) concluded its investigation into Snap Inc.'s ‘My AI’ chatbot. The investigation was initiated following concerns that Snap had failed to meet its legal obligation to assess the data protection risks associated with the chatbot. The ICO's investigation revealed that Snap had not conducted a thorough data protection impact assessment (DPIA) as required under the UK GDPR.
As a result of the investigation, Snap implemented appropriate measures to mitigate the identified risks and ensure compliance with data protection laws. These measures included conducting a comprehensive DPIA, enhancing data security protocols, and improving transparency with users about how their data is processed.
The ICO used this opportunity to remind organisations of the critical importance of conducting DPIAs before launching AI products. The ICO emphasised that failure to comply with these requirements can result in substantial fines and other enforcement actions.
This case highlights the increasing necessity for companies to prioritise data protection compliance when developing and deploying AI technologies.
The European Data Protection Board (EDPB) has issued a long-awaited opinion concerning valid consent in relation to "consent or pay" models used by large online platforms (LOPs) for behavioural advertising. In short, the EDPB has cast doubt on consent or pay as a valid method for obtaining user consent under the GDPR. This opinion could have significant implications for businesses utilising behavioural advertising in Europe.
Whilst the EDPB has criticised consent or pay, there may be alternative approaches that offer users a more genuine choice, and these alternatives may help address concerns with the consent or pay model.
The opinion followed a request from the Dutch, Norwegian, and Hamburg data protection authorities regarding the implementation of consent or pay models for behavioural advertising by LOPs and the valid consent requirements under the GDPR. The EDPB emphasised that when users are presented with a choice between consenting to the processing of their personal data for behavioural advertising or paying a fee, it is generally not possible for LOPs to meet the valid consent standards set out in the GDPR. Further, the EDPB stated that the fundamental right to data protection should not become a tradeable commodity that individuals must pay for, and stressed the importance of users having genuine choice: consent cannot be deemed freely given if a user faces detriment for withholding or withdrawing it.
The legality around consent or pay models has been strongly debated, and the EDPB’s opinion raises significant concerns. The EDPB directed that offering a paid alternative alone is not sufficient; platforms should also provide a free alternative without behavioural advertising to ensure fairness and compliance with GDPR principles. Controllers must also consider factors such as necessity, proportionality, and the imbalance of power between users and platforms.
This critical opinion holds significant implications for apps offering a "go premium, no ads" model and serves as a timely reminder for online platforms to reassess their strategies concerning user consent and data processing. The EDPB’s position highlights the urgency for platforms to align their practices with GDPR standards to maintain compliance and uphold user privacy rights effectively. While the opinion is neither settled nor binding, it reflects how European regulators may tackle this issue and the risks inherent in the consent or pay model. The UK Information Commissioner’s Office has also recently launched a consultation on this topic, and companies should stay informed about these developments and consider the potential impacts on their business models.
On March 6, 2024, the Information Commissioner’s Office (ICO) initiated a 'call for views' concerning its approach to a proposed 'consent or pay' model for online advertising cookies. This model offers users the option to consent to personalised advertising or pay a fee to opt out of tracking. The ICO has communicated its standpoint to the Association of Online Publishers and Internet Advertising Bureau UK and now seeks input from stakeholders on this model.
This initiative follows the ICO’s existing enforcement efforts against websites with non-compliant cookie consent mechanisms, emphasising the importance of providing users with a fair choice. Through the development of digital tools and collaboration with technical experts, the ICO is increasing its efforts to assess website cookie compliance. Organisations found to be non-compliant may face enforcement action as the ICO reviews the responses it receives and prioritises cases. This announcement serves as a final warning for organisations to ensure compliance with cookie regulations.
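Automated cookie-compliance tools of the kind the ICO describes typically start by checking which cookies a site sets before any consent is given. Below is a simplified, hypothetical Python sketch of that idea; the cookie names and the strictly-necessary allow-list are illustrative assumptions, not the ICO's actual tooling.

```python
# Hypothetical check: which cookies observed on a first, consent-free
# page load fall outside a strictly-necessary allow-list and would
# therefore need prior consent under PECR / UK GDPR?

STRICTLY_NECESSARY = {"session_id", "csrf_token"}  # illustrative allow-list

def flag_cookies(observed_cookies):
    """Return, sorted, the cookies that appear to require prior consent."""
    return sorted(set(observed_cookies) - STRICTLY_NECESSARY)

# Cookies seen before the user interacted with any consent banner.
observed = ["session_id", "_ga", "_gid", "csrf_token"]
needs_consent = flag_cookies(observed)  # the analytics cookies are flagged
```

Here the analytics cookies `_ga` and `_gid` would be flagged because they were set before any consent was given; a real assessment would also consider cookie lifetimes, third-party origins, and the design of the consent banner itself.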
NHS Lanarkshire Reprimanded for Unauthorised Use of WhatsApp to Share Patient Data

On 1 August 2023, the Information Commissioner’s Office (ICO) issued a formal reprimand to NHS Lanarkshire for the unauthorised use of WhatsApp to share patient data.
The ICO found that between April 2020 and April 2022, 26 staff members had access to a WhatsApp group where they shared names, phone numbers, and addresses of patients on more than 500 occasions.
Images and videos containing clinical information were also shared. The WhatsApp group was initially set up for basic communication during the pandemic.
However, it was not approved by NHS Lanarkshire for processing patient data and was used without the organisation's knowledge. A non-staff member was mistakenly added to the group, leading to the disclosure of personal information to an unauthorised individual.
The ICO's investigation concluded that NHS Lanarkshire did not have appropriate policies or clear guidance in place for using WhatsApp. John Edwards, the Information Commissioner, stated, "Patient data is highly sensitive information that must be handled carefully and securely.
There is no excuse for letting data protection standards slip." The ICO recommended several actions for NHS Lanarkshire to ensure compliance with data protection laws, including implementing a secure clinical image transfer system and reviewing all organisational policies relevant to the incident. The ICO has asked for an update on actions taken within six months.
On 24 February 2024, the Information Commissioner’s Office (ICO) published guidelines aimed at companies that use biometric recognition systems to process biometric data, as well as developers and vendors of these systems. For biometric data to be regarded as special category data, it must uniquely identify an individual. The use of biometric data to infer certain characteristics of individuals, such as ethnicity, race or health conditions, will also constitute the processing of other special category data. The guidance makes clear that the use of specific biometric recognition software also means that special category biometric data is being processed. Although “biometric recognition” is not defined in the UK GDPR or supplementary legislation, the ICO borrows the definition of this term from the International Organization for Standardization (ISO) in ISO/IEC 2382-37:2022(E), which defines the term as “automated recognition of people based on their biological or behavioural characteristics”. The ICO considers this definition to align with the definition of special category biometric data found in the UK GDPR and Data Protection Act 2018.
When biometric recognition software is used, organisations will need to demonstrate a privacy-by-design approach, which includes performing a comprehensive data protection impact assessment (DPIA) on the impact that the use of biometric recognition software will have on the people whose information is processed. Companies will also need to demonstrate how they adhere to core data protection principles, including accuracy, fairness, and transparency.
To ensure fairness, companies must use biometric information in reasonable ways so that the use meets data subjects’ expectations. Misleading individuals when obtaining their biometric data is unlawful. Steps must be taken to ensure the statistical accuracy of biometric algorithms; this involves monitoring how well the system performs under given conditions, such as the rate of false biometric acceptance or rejection. The ICO highlights that although biometric recognition systems do not need to be 100% statistically accurate, they must be sufficiently accurate for their intended purposes. It is essential to ensure that outcomes from these systems are understood as statistically informed judgments rather than absolute facts. Taking steps to ensure the accuracy of the systems is crucial, as errors could lead to real-world impacts on individuals, which could in turn result in legal claims being lodged against companies using or developing this technology. The guidance also highlights the need to test for bias and discrimination and to have mechanisms in place for error diagnosis and resolution.

The ICO highlights that transparency is essential, requiring companies to provide clear information about the use of the technology and to consider various methods for disseminating this information effectively. The notice to be provided to individuals depends on factors such as the relationship with the individuals whose data is being processed, the nature of the processing activities, and the specific use case.
Businesses are urged to consider how people will interact with the technology and the broader context in which it's used to determine the most effective way to communicate information. Tailoring information formats and levels of detail to individuals' knowledge levels may be necessary. The ICO states that some options when providing information include leaflets, digital techniques, staff assistance, visual or audio signals, and online platforms.
Organisations must provide information about the processing purposes, categories of data processed, recipients of the data, and any automated decision-making involved. Information must be provided on how individuals can exercise their rights under the UK GDPR, including the right to access, erasure and rectification. Individuals have the right to access a copy of their personal data, including biometric information. However, the ICO states that this right does not apply to suggested matches from biometric recognition systems, but rather to mislabelled records or decisions based on such matches. Individuals can request the correction of inaccurate or incomplete biometric data. Individuals can also request the deletion of their biometric data, especially if it's no longer necessary for its original purpose or if consent is withdrawn. Organisations may retain data if they are under a legal obligation or if it serves another purpose and the data subject consents to further processing.
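The statistical accuracy measures the guidance refers to, such as false acceptance and false rejection rates, can be monitored by straightforward counting over logged match attempts. The following is a minimal Python sketch; the threshold and score data are wholly illustrative assumptions, not figures from the guidance.

```python
def biometric_error_rates(attempts, threshold):
    """Compute the false acceptance rate (FAR) and false rejection
    rate (FRR) from logged match attempts.

    attempts:  list of (score, is_genuine) pairs, where score is the
               similarity the biometric system reported and is_genuine
               records whether the attempt came from the enrolled person.
    threshold: scores at or above this value count as a match.
    """
    impostor_scores = [s for s, g in attempts if not g]
    genuine_scores = [s for s, g in attempts if g]
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Illustrative scores: higher means a closer biometric match.
attempts = [
    (0.91, True), (0.85, True), (0.55, True),    # genuine users
    (0.30, False), (0.62, False), (0.20, False), # impostors
]
far, frr = biometric_error_rates(attempts, threshold=0.6)
# One impostor score (0.62) clears the threshold and one genuine
# score (0.55) falls below it, so both rates come out at 1/3 here.
```

Raising the threshold trades false acceptances for false rejections, which is why the guidance frames accuracy as being sufficient for the intended purpose rather than as an absolute figure.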