Who better to consult about the personal questions ChatGPT users ask than the AI chatbot itself? So we inquired. On the personal side, ChatGPT says users most commonly ask about:
For several years, consumers have freely asked such questions, confident in ChatGPT’s promise that it doesn’t retain their queries once deleted. Now, thanks to a pliable magistrate judge in New York, such queries from hundreds of millions of users will be permanently stored and subject to exposure through discovery in future lawsuits or by official warrant.
Only a few business and education customers are exempt. As for the rest of us, virtually anything asked – no matter how personal – becomes a permanent record that lawyers in a nasty divorce or commercial dispute, or a government agent, could pry open with the right legal tools. The number of affected users is estimated at roughly 10 percent of the world’s population.

Yet as staggering as that number is, The Hill contributor and privacy attorney Jay Edelson argues the case’s legal implications are of far greater concern:

“This is more than a discovery dispute. It’s a mass privacy violation dressed up as routine litigation … If courts accept that any plaintiff can freeze millions of uninvolved users’ data, where does it end? Could Apple preserve every photo taken with an iPhone over one copyright lawsuit? Could Google save a log of every American’s searches over a single business dispute? …

“This precedent is terrifying. Now, Americans’ private data could be frozen when a corporate plaintiff simply claims — without proof — that Americans’ deleted content might add marginal value to their case. Today it’s ChatGPT. Tomorrow it could be your cleared browser history or your location data.”

Blame not the plaintiff in this case, which is understandably concerned about the ransacking of its copyrighted material. Blame the judge for ordering such broad discovery. A better approach would have been a randomized sampling of a large number of users’ queries, anonymized to protect their privacy.

Users – all of us whose private data is now at risk – were never consulted by the court. Two attempts by private citizens to intervene were smugly dismissed by the judge. Edelson writes: “Maybe you have asked ChatGPT how to handle crippling debt. Maybe you have confessed why you can’t sleep at night. Maybe you’ve typed thoughts you’ve never said out loud. Delete should mean delete.”

Let us hope appellate courts replace this magistrate judge’s chainsaw with a scalpel.