Italy’s data protection agency, Garante, has set out a series of demands that OpenAI must meet before ChatGPT can resume its services in the country. Garante ordered the company to inform Italian users about the methods and logic behind ChatGPT’s data processing. The agency also demanded that OpenAI give users tools to request the correction of their personal data, or its deletion where correction is not possible. In addition, non-users must be able to object to the processing of their personal data, according to Garante.
Garante has also ordered OpenAI to introduce an age verification system by the end of September to keep users under 13 off the service. The company has until the end of April to meet these demands if it wants to resume ChatGPT services in Italy.
The chatbot’s developer, Microsoft-backed OpenAI, blocked access to ChatGPT in Italy almost two weeks ago after Garante launched an investigation into a suspected breach of privacy rules. While the company has pledged to cooperate with the investigation, it remains to be seen whether it will meet Garante’s demands in time to restore ChatGPT service in Italy.
This move by Garante reflects a growing concern among regulators worldwide about the use of artificial intelligence (AI) and machine learning algorithms, which are often opaque and difficult for users to understand. As AI becomes more prevalent in our daily lives, there is a growing need for transparency and accountability in how these systems process and use personal data.
The European Union has taken a proactive approach to data protection with the General Data Protection Regulation (GDPR), which requires companies to have a lawful basis, such as explicit user consent, before collecting and processing personal data. However, regulators still need to ensure that companies comply with these rules and to hold them accountable when they fall short.
The demands made by Garante in Italy are an important step in this direction, as they set a precedent for how AI developers must operate in the country. By requiring transparency, accountability, and user control over personal data, Garante is sending a clear message to OpenAI and other companies that AI must be developed and used in a responsible and ethical manner.