If you wanted to build a mass surveillance program capable of monitoring 800 million people, where would you start? Ars Technica's Ashley Belanger found the answer: You order OpenAI's ChatGPT to indefinitely retain all of its regular customers' chat logs, upending the company's solemn promise of confidentiality for customers' prompts and chats. Which is what Ona Wang, U.S. Magistrate Judge for the Southern District of New York, did on May 13. From that date forward, OpenAI has had to keep everything, even chats its users thought they had deleted, stored "just in case" it's needed someday. We asked ChatGPT about this, and it told us:
So our lives – health, financial, and professional secrets – are now being stored in AI chats that Judge Wang thinks should be kept on file for any warrant or subpoena, not to mention any Russian or Chinese hacker. Not included in the judge’s order are ChatGPT Enterprise (used by businesses) and Edu data (used by universities). Problem: Many businesses and students use regular ChatGPT without being Enterprise subscribers, including entrepreneur Jason Bramble. He asked the judge to consider the impact of her ruling on – well, you name it – his company’s proprietary workflows, confidential information, trade secrets, competitive strategies, intellectual property, client data, patent applications, trademark requests, source code, and more.
The underlying case giving rise to all of this overreach is a copyright infringement lawsuit brought by The New York Times against OpenAI. It's a big case, to be sure, but no one saw this order coming except Jason Bramble and one other ChatGPT user, Aidan Hunt.

Hunt learned about the judge's order from a Reddit forum and decided it was worth fighting on principle. In his motion, he asked the court to vacate the order or at least modify it to exclude highly personal and private content. He politely suggested that Judge Wang was overstepping her bounds because the case "involves important, novel constitutional questions about the privacy rights incident to artificial intelligence usage – a rapidly developing area of law – and the ability of a magistrate to institute a nationwide mass surveillance program by means of a discovery order in a civil case."

Judge Wang's response was petulant. She noted that Hunt mistakenly used "incident" when he meant "incidental." Then she casually torpedoed two hundred years of judicial review by denying his request with this line: "The judiciary is not a law enforcement agency." Because, after all, when have judicial decisions ever had executive branch consequences? Judge Wang had earlier denied business owner Jason Bramble's request on the grounds that he hadn't hired a lawyer to draft the filing.

The magistrate is swatting at flies while asking ChatGPT users to swallow the herd of camels she's unleashed. Even if a properly narrowed legal hold, preserving evidence relevant to The New York Times' copyright infringement claim, might have been appropriate, the judge massively overstepped in ordering OpenAI to preserve chat histories worldwide. The complaints of Bramble and Hunt, as well as similar pleadings from OpenAI, aim true: the court's uninformed, overreaching order ignores the reality of pervasive surveillance imposed on users who accepted the promise that their conversations with ChatGPT were truly private.
Judge Wang wondered, Hamlet-like, whether the data could be anonymized in order to protect users' privacy. As we've written before, and as is now commonly understood, governments and hackers have the power to deanonymize anonymized data. As MSN points out, the more personal a conversation is, the easier it becomes to identify the user behind it. In declaring that her order is merely about preservation rather than disclosure, Judge Wang is naively declaring "privacy in our time." As in 1938, we stand at the edge of a gathering storm, this time not a storm of steel, but of data.

What can you do? At the least, you can start minding your Ps and Qs, your prompts and questions. And take to heart that "delete" doesn't mean what it used to.

For the full chronology, see Ashley Belanger's detailed reporting on this story for Ars Technica: June 4, June 6, June 23, and June 25.