The court rejected users’ appeals against OpenAI’s retention of ChatGPT chats. Photo: still from the film «I, Robot», Depositphotos
In early June 2025, the court ordered OpenAI to «indefinitely» store the history of requests to ChatGPT (including deleted chats) as potential evidence in copyright infringement lawsuits. The decision, quite expectedly, drew opposition from both the company itself and users, who filed appeals.
Judge Ona Wang, who issued this harsh order, rejected the first such request in May, and now a second one has failed. In the latter, a ChatGPT user named Aidan Hunt argued that he uses the chatbot «from time to time», sometimes sending OpenAI «highly sensitive personal and commercial information», and that Wang’s preservation order creates a «nationwide mass surveillance program» that potentially harms «all ChatGPT users», who received no warning that their deleted and anonymous chats would be preserved.
Hunt added that he learned about ChatGPT’s information retention by accident — he came across the news of the court ruling on an online forum.
Corynne McSherry, Legal Director at the Electronic Frontier Foundation, told Ars Technica that users’ fears are not unfounded and that Wang’s ruling could trigger a series of lawsuits around the world once users are properly informed about it.
«The disclosure order itself poses real risks to user privacy and is a precedent for many other lawsuits across the country. And it symbolizes a broader issue: AI-powered chatbots open up another vector for corporate surveillance, especially if users have no real control over what happens to their chat histories and recordings».
Hunt urged the judge to reconsider the order to account for anonymous chats containing sensitive information, as well as those involving discussions of medical, financial, legal and personal topics that have nothing to do with the copyright claims.
Wang disagreed with the appeal and emphasized in a further ruling that she had not exceeded her authority and that her order could not be interpreted as authorizing mass surveillance.
«Petitioner fails to explain how a court-ordered document retention order that directs the segregation and retention of certain private data by a private company for limited litigation purposes is or could be a “nationwide mass surveillance program,”» Wang wrote. «It is not. The judiciary is not a law enforcement agency».
Meanwhile, lawyers say that «it is only a matter of time before law enforcement agencies and private plaintiffs start demanding that OpenAI hand over user chat histories for various purposes, just as they already do for search histories, social media posts, etc.».
On June 26, OpenAI will present its own oral arguments to the judge, although some doubt that the company will be able to properly protect users’ privacy, given that it has already failed to be «transparent enough» to inform users that their deleted chats would be stored.