DPAS Bulletin - December 19
Welcome back to our monthly DPAS bulletin, where we cover the latest data protection news and developments from around the world.
Before Christmas comes along, what were the findings of the ICO’s “text pest” investigation? Why did NHS Fife receive a reprimand? And what are the implications of a Chinese court ruling that AI-generated images can be copyrighted?
So without further delay, let’s explore what’s been going on in the data protection world. Don’t forget to subscribe for these monthly breakdowns, and to visit our website for more news.
Sam Altman reinstated as CEO of OpenAI after short-lived firing
After being let go by OpenAI, Sam Altman (who co-founded the company in 2015) has been brought back into the fold following a few days of “internal turmoil” over his dismissal. Altman has rejoined the company as CEO with a new initial board of Bret Taylor, Larry Summers, and Adam D’Angelo. He was originally fired because the board had lost confidence in him, but after overwhelming backlash from OpenAI’s workforce, a new agreement was reached.
Read more about this here.
NHS awards US spy tech company £330m contract to create new data platform
In a controversial move, the NHS recently awarded a £330 million contract to Palantir, a US spy tech company, to create an ambitious new data platform. The announcement was immediately met with concerns about the security and privacy of patient data. Palantir has worked closely with military organisations and intelligence agencies worldwide, and its billionaire co-founder, Peter Thiel, backed Donald Trump in the 2016 election. Thiel has also previously said he believes the NHS should be privatised. For reasons such as these, there is an air of mistrust among the public, who are apprehensive about Palantir having access to such large amounts of patient data.
Read more about this here.
ICO releases findings of “text pest” investigation
The ICO recently launched an investigation into unsolicited messages received by customers of services such as taxis and food delivery. Specifically, the ICO wanted to look into workers who had made unwanted advances or propositions towards customers using the phone number they provided to order a pizza, book a taxi, and so on.
Fortunately, the ICO found that the most prominent companies providing delivery services, like Uber Eats and Just Eat, demonstrated good practice when it comes to protecting their customers’ data. For example, a customer’s real phone number will often be hidden from the driver. As well as this, their staff are trained well, and customers are able to easily raise complaints if unwanted contact of any sort should occur.
After receiving around 90 responses (mostly from women), the ICO concluded that the number of people who had experienced this kind of unsolicited contact was thankfully small compared to the overall number of customers and employees involved in these services, meaning such incidents were not as common as it had feared.
Read more about this investigation here.
Government Regulatory Activity
Beijing Internet Court awards copyright protection to AI-generated images
In a ruling that may surprise many, a court in China has decided that images created using AI (artificial intelligence) possess a level of originality and creativity, and should therefore be recognised as works with copyright protection. The case arose when an AI-generated image of a schoolgirl, created and posted by plaintiff Mr Li, was used without permission by somebody else on their blog. The court sided with Li, ruling that using AI to generate images is essentially like using tools to create, meaning the resulting work contains a certain amount of human expression and should be protected by copyright law.
Italy launches investigation into personal data gathering for AI training
Italy’s data protection authority recently began an investigation into how companies gather large amounts of personal data for the purposes of training AI. After briefly banning ChatGPT due to similar concerns in recent months, the authority is now closely assessing whether websites have adequate measures in place to prevent AI platforms from data scraping. AI experts, academics and consumer groups have all been invited to participate in finding information to support the investigation.
ICO publishes updated response to DPDI bill ahead of second House of Lords reading
The Information Commissioner’s Office has published a new response to the Data Protection and Digital Information Bill in the lead-up to its second reading in the House of Lords. The bill includes provisions to reform the Information Commissioner, and would, for example, allow the creation of data bridges with other countries and enable law enforcement to hold biometric data for longer. It also contains updates to the provisions of the Privacy and Electronic Communications Regulations (PECR), introducing increased fines for nuisance calls and reducing “user consent” banners and pop-ups.
The ICO states that they welcome the bill and have had “open and constructive dialogue” with the government throughout its development.
Read the ICO’s updated response here.
EU Data Act becomes law
The Data Act was formally adopted by the Council of the EU on 27th November 2023, completing the legislative process. The EU Data Act will now officially become enforceable in 2025. This regulation aims to give individuals and businesses better control over their data and sets rules to make data more accessible overall. For example, it allows people to access and reuse the data generated by their use of products or services.
Read more about the EU Data Act here.
ICO reprimands NHS Fife following incident involving stolen patient data
The Information Commissioner’s Office is urging hospitals to raise their data protection standards after a particularly shocking incident at NHS Fife. An unauthorised person was able to enter a ward, access the personal data of 14 patients, and take it off site. Prior to this, a member of staff had accidentally turned off the CCTV, meaning the individual walked away with over a dozen patients’ personal information and has not been identified. The ICO, unsurprisingly, deemed NHS Fife’s security measures inappropriate and its staff training inadequate.
Read more about this here.
Ministry of Defence fined for Afghan evacuation data breach
On 20th September 2021, the Ministry of Defence emailed Afghan nationals who were eligible for evacuation, but placed their addresses in the “To” field rather than “Bcc”, meaning that numerous email addresses (and in many cases, profile pictures) were exposed to all recipients. Sent by the team in charge of the UK’s Afghan Relocations and Assistance Policy (ARAP), the email could have resulted in a threat to life had the inadvertently disclosed personal information fallen into the hands of the Taliban.
For this mistake, the ICO fined the MoD £350,000, describing it as a “deeply regrettable” data breach that let down individuals to whom the country owes a great deal.
ICO reprimands Charnwood Borough Council for disclosing domestic abuse victim’s address to her ex-partner
The ICO has called on organisations to be more careful and to carry out appropriate procedures, following an incident at Charnwood Borough Council in which a letter detailing a domestic abuse victim’s new address was sent in error to her previous address, which she had shared with her ex-partner. It was confirmed that the ex-partner opened and read the letter, posing a significant risk to the victim’s wellbeing.
Read more about this incident here.
Former NHS secretary fined for illegally accessing patient data
A former NHS secretary at Worcestershire Acute Hospitals NHS Trust was found guilty last month of unlawfully accessing the personal data of more than 150 patients. Following a complaint from a patient, an investigation found that the secretary had accessed this patient’s data over 30 times without her consent or any business need to do so. Further investigation concluded that the secretary had accessed the records of 156 individuals in total, and she was found guilty and fined a total of £648. The ICO used this opportunity to remind organisations that the ability to access personal data does not mean there is a lawful reason to do so.
Get in touch with us!
If you need any support in ensuring your organisation is complying with the relevant legislation, or require training in the areas of data protection and information security, get in touch with us.
Either call us on 0203 3013384, email us at firstname.lastname@example.org, or visit our website at www.dataprivacyadvisory.com and fill out a contact form. Our dedicated team will get back to you as soon as possible.