OpenAI’s chatbot ChatGPT, which is powered by its large language models (LLMs), has been hit with a GDPR complaint filed with Austria’s data protection authority (DPA) by privacy activist group NOYB.
The complaint alleges that ChatGPT breaches GDPR by providing false information about public figures and by not allowing those figures to access or erase the personal data it has harvested about them.
LLMs generate text from prompts by ingesting vast swathes of training data to replicate human-like language. From this training data, they learn to predict the most likely next word in a sentence.
However, LLMs can “hallucinate”, generating responses that are factually incorrect.
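This next-word prediction can be illustrated with a minimal sketch using the open-source GPT-2 model via the Hugging Face transformers library; GPT-2 is used here purely for illustration and is not the model behind ChatGPT.

# Minimal sketch of next-word prediction with an open-source LLM (GPT-2),
# for illustration only -- not OpenAI's production system.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The public figure was born on"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every vocabulary token at each position

# Pick the single most likely continuation of the prompt.
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode([next_token_id]))

The model outputs whichever continuation is statistically most likely given its training data; it has no built-in notion of factual accuracy, which is why it can produce plausible-looking but false details such as a birthdate.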
NOYB said it had contacted OpenAI about inaccurate responses concerning personal data, but that the startup replied it was unable to correct ChatGPT’s false responses.
NOYB’s complaint to the DPA centres on the case of an unnamed public figure who said ChatGPT provided an inaccurate birthdate for him. According to the complaint, no information about his birthdate is available online.
Following ChatGPT’s inaccurate response, the public figure sent an access and erasure request to OpenAI.
The complaint stated that OpenAI’s response to this request covered only the user data stored on the public figure’s ChatGPT account, not the data about him that ChatGPT had ingested.
OpenAI said it could block ChatGPT from displaying any of the figure’s personal details, but that it could not remove the incorrect birthdate alone.
Maartje de Graaf, data protection lawyer at NOYB, stated that AI hallucinations about personal data could have serious consequences.
“It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law, when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around,” said de Graaf.
De Graaf added that GDPR compliance requirements apply to all companies.
“It is clearly possible to keep records of training data that was used, to at least have an idea about the sources of information. It seems that with each ‘innovation’, another group of companies thinks that its products don’t have to comply with the law,” she said.