OpenAI's inability to correct inaccurate information generated by ChatGPT has prompted a complaint from noyb, a European data protection advocacy group. The organization claims that OpenAI has violated the European Union's General Data Protection Regulation (GDPR) by failing to ensure the accuracy of personal data processed by the service.
“In and of itself, fabricating information is highly troublesome. But there can be grave repercussions when it comes to inaccurate information about specific people,” said Maartje de Graaf, a data protection attorney at noyb.
“It's evident that businesses are presently unable to ensure that chatbots such as ChatGPT comply with EU law when processing personal data. If a system cannot produce accurate and transparent results, it cannot be used to generate data about specific individuals. Technology has to follow the law, not the other way around.”
Under the GDPR, personal data must be accurate. Individuals also have the right to know what data about them is processed and where it comes from, as well as the right to have inaccurate data corrected. OpenAI, however, has acknowledged that it cannot correct inaccurate data produced by ChatGPT or disclose the sources of the data used to train the model.
According to OpenAI, “factual accuracy in large language models remains an area of active research.”
Citing New York Times reporting, the advocacy group notes that chatbots such as ChatGPT “invent information at least 3 percent of the time – and as high as 27 percent.” In its complaint against OpenAI, noyb recounts that ChatGPT repeatedly gave the complainant, a public figure, an incorrect date of birth despite requests for correction.
“OpenAI rejected the complainant’s request to correct or erase the data, arguing that it was not possible to correct data, even though the complainant’s date of birth provided by ChatGPT is incorrect,” noyb wrote.
OpenAI said it could filter or block data on certain prompts, such as the complainant's name, but only by blocking all information about the complainant rather than correcting the specific error. The company also failed to adequately respond to the complainant's access request, which the GDPR requires companies to honor.
“The obligation to comply with access requests applies to all companies,” de Graaf said. “It is clearly feasible to keep records of the training data that was used, to at least have an idea about the sources of information. It seems that with every ‘innovation,’ another group of companies believes that its products are exempt from legal requirements.”
ChatGPT's errors have already drawn scrutiny from European privacy watchdogs: the Italian Data Protection Authority imposed a temporary restriction on OpenAI's data processing in March 2023, and the European Data Protection Board subsequently set up a task force on ChatGPT.
Noyb has filed a complaint with the Austrian Data Protection Authority, asking it to investigate OpenAI's data processing practices and the measures taken to ensure the accuracy of personal data handled by the company's large language models. The advocacy group also asks the authority to order OpenAI to comply with the complainant's access request, bring its processing into line with the GDPR, and impose a fine to ensure future compliance.