Europe’s new AI Act creates a regulatory framework for artificial intelligence, classifying AI systems according to their risk level, from minimal to unacceptable. High-risk areas include critical infrastructure, employment, essential private and public services, law enforcement, migration, and justice. AI systems in these areas must meet strict requirements for transparency, data quality, and oversight to mitigate risks. The Act also bans certain AI practices deemed a clear threat to people’s safety, livelihoods, and rights, emphasising the protection of fundamental EU rights and values.
The AI Act has extraterritorial implications for businesses in the UK. It applies to AI systems provided or used in the EU, regardless of the provider's or deployer's location. Non-European companies offering AI services in the EU, or whose AI systems impact EU citizens, must comply with the Act's regulations. This global reach is similar to the GDPR, emphasising the EU's commitment to setting high standards for technology regulation worldwide.
NHS Lanarkshire Reprimanded for Unauthorised Use of WhatsApp to Share Patient Data
On 1 August 2023, the Information Commissioner’s Office (ICO) issued a formal reprimand to NHS Lanarkshire for the unauthorised use of WhatsApp to share patient data.
The ICO found that between April 2020 and April 2022, 26 staff members had access to a WhatsApp group where they shared names, phone numbers, and addresses of patients on more than 500 occasions.
Images and videos containing clinical information were also shared. The WhatsApp group was initially set up for basic communication during the pandemic.
However, it was not approved by NHS Lanarkshire for processing patient data and was used without the organisation's knowledge. A non-staff member was mistakenly added to the group, leading to the disclosure of personal information to an unauthorised individual.
The ICO's investigation concluded that NHS Lanarkshire did not have appropriate policies or clear guidance in place for using WhatsApp. John Edwards, the Information Commissioner, stated, "Patient data is highly sensitive information that must be handled carefully and securely.
There is no excuse for letting data protection standards slip." The ICO recommended several actions for NHS Lanarkshire to ensure compliance with data protection laws, including implementing a secure clinical image transfer system and reviewing all organisational policies relevant to the incident. The ICO has asked for an update on actions taken within six months. You can read a BBC news article here.
On 24 February 2024, the Information Commissioner’s Office (ICO) published guidelines aimed at companies that use biometric recognition systems to process biometric data, as well as developers and vendors of these systems. For biometric data to be regarded as special category data, it must uniquely identify an individual. The use of biometric data to infer certain characteristics of individuals, such as ethnicity, race or health conditions, will also constitute the processing of other special category data. The guidance makes clear that the use of specific biometric recognition software also means that special category biometric data is being processed. Although “biometric recognition” is not defined in the UK GDPR or supplementary legislation, the ICO borrows the definition of this term from the International Organization for Standardization (ISO) in ISO/IEC 2382-37:2022(E), which defines it as the “automated recognition of people based on their biological or behavioural characteristics”. The ICO considers this definition to align with the definition of special category biometric data found in the UK GDPR and the Data Protection Act 2018.
When biometric recognition software is used, organisations will need to demonstrate a privacy-by-design approach, which includes performing a comprehensive data protection impact assessment (DPIA) covering the impact that the use of biometric recognition software will have on the people whose information is processed. Companies will also need to demonstrate how they adhere to core data protection principles, including accuracy, fairness, and transparency.
To ensure fairness, companies must use biometric information in reasonable ways so that the use meets the data subjects’ expectations. Misleading individuals when obtaining their biometric data is unlawful. Steps must be taken to ensure the statistical accuracy of biometric algorithms; this involves monitoring how well the system performs under given conditions, such as the rate of false biometric acceptance or rejection. The ICO highlights that although biometric recognition systems do not need to be 100% statistically accurate, they must be sufficiently accurate for their intended purposes. It is essential to ensure that outcomes from these systems are understood as statistically informed judgments rather than absolute facts. Taking steps to ensure the accuracy of the systems is crucial, as errors could lead to real-world impacts on individuals, which could in turn result in legal claims being lodged by individuals against companies using or developing this technology. The guidance also highlights the need to test for bias and discrimination and to have mechanisms in place for error diagnosis and resolution. The ICO highlights that transparency is essential, requiring companies to provide clear information about the use of the technology and to consider various methods for disseminating this information effectively. The notice required to be provided to individuals depends on factors such as the relationship with the individuals whose data is being processed, the nature of the processing activities, and the specific use case.
Businesses are urged to consider how people will interact with the technology and the broader context in which it's used to determine the most effective way to communicate information. Tailoring information formats and levels of detail to individuals' knowledge levels may be necessary. The ICO states that some options when providing information include leaflets, digital techniques, staff assistance, visual or audio signals, and online platforms.
Organisations must provide information about the processing purposes, categories of data processed, recipients of the data, and any automated decision-making involved. Information must be provided on how individuals can exercise their rights under the UK GDPR, including the right to access, erasure and rectification. Individuals have the right to access a copy of their personal data, including biometric information. However, the ICO states that this right does not apply to suggested matches from biometric recognition systems, but rather to mislabelled records or decisions based on such matches. Individuals can request the correction of inaccurate or incomplete biometric data. Individuals can also request the deletion of their biometric data, especially if it's no longer necessary for its original purpose or if consent is withdrawn. Organisations may retain data if they are under a legal obligation or if it serves another purpose and the data subject consents to further processing.
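The false acceptance and false rejection rates discussed in the ICO guidance are simple proportions over matching attempts. As an illustration only (the figures and data structure below are hypothetical, not drawn from the guidance), a minimal sketch of how an organisation might monitor these two error rates:

```python
# Hypothetical trial data: each entry is (is_genuine_user, system_accepted).
# Illustrative values only — not real biometric performance figures.
trials = [
    (True, True), (True, True), (True, False),     # genuine attempts
    (False, False), (False, False), (False, True), # impostor attempts
]

genuine = [accepted for is_genuine, accepted in trials if is_genuine]
impostor = [accepted for is_genuine, accepted in trials if not is_genuine]

# False rejection rate (FRR): share of genuine users wrongly rejected.
frr = genuine.count(False) / len(genuine)

# False acceptance rate (FAR): share of impostors wrongly accepted.
far = impostor.count(True) / len(impostor)

print(f"FRR = {frr:.1%}, FAR = {far:.1%}")
```

Tracking both rates matters because they trade off against each other: tightening a match threshold lowers the FAR but raises the FRR, so "sufficiently accurate for the intended purpose" is a judgment about that balance, not a single number.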