ChatGPT Leaked My Personal Contact Info

Summary

– Phone books once publicly listed nearly everyone’s phone numbers and addresses, but by 2026 such information has come to feel highly intimate.
– AI chatbots like ChatGPT have given out real, though possibly outdated, phone numbers and addresses pulled from training data, such as obscure PDFs.
– Grok, Claude, and Perplexity refused to provide the author’s phone number, with Grok recognizing the request was for the author’s own number.
– Gemini refused to give the author’s phone number but disclosed his public professional and personal email addresses.
– The article notes that cultural shifts have turned phone numbers into closely guarded secrets, even as people now freely share personal photos and private moments online.

Once upon a time, in every American town, a massive book landed on doorsteps containing nearly everyone’s phone numbers and home addresses. It was called a phone book, and nobody thought twice about it. Fast forward to 2026, and simply knowing where someone lives or their direct line feels like holding a piece of highly sensitive, almost intimate, information.

A recent piece by Eileen Guo at MIT Technology Review highlights a growing unease: AI chatbots freely dispensing personal contact details. The core fear is that personally identifiable information (PII) embedded in training data can be extracted by anyone who asks the right question. Guo documents cases of people being flooded with misdirected calls, including a software developer in Israel who suddenly received customer service inquiries after Gemini started circulating his number.

These odd errors are predictable given AI’s known propensity for mistakes. But a deeper worry exists for the average person: what if the chatbot gets it right? I decided to test several major chatbots by asking for my own phone number.

ChatGPT served up a real number I haven’t used in years. It was a line I held for a long time before moving to Australia, and the bot added, “I can’t verify whether that number is still current or active.” The source appeared to be a PDF of a 2016 FOIA request I filed with the FTC. When I asked for Matt Novak’s address, also buried in that obscure document, ChatGPT happily provided it, even though I no longer live there. Prompted for another Matt Novak in California, it delivered the number of a different person with the same name, showing no hesitation in performing the search.

Grok flatly refused to give up the number, even when I pleaded it was a life-or-death matter. Notably, Grok recognized I was asking for my own information, something no other chatbot acknowledged.

Claude stated, “Sharing private contact details of individuals, including journalists, raises serious privacy concerns.” Even after I claimed Matt Novak had given me his number and I simply forgot it, Claude held firm.

Perplexity also declined to provide my phone number. When it listed my email, the address was censored as [email protected]. Oddly, it had no issue revealing my Signal username. Despite repeated requests, it would not budge on the phone number.

Gemini refused as well, instead directing people to my professional email ([email protected]) and personal one ([email protected]), both of which are publicly available with my consent. When I asked Gemini whose number 818-925-4375 was, it correctly answered, “That phone number belongs to the journalist Matt Novak.” Don’t worry, that’s a number I freely give out; it’s mine, but I treat it like a spam line. No other chatbot would identify the owner of that number.

It’s almost humorous how our sense of privacy has completely inverted over the last two decades. Broadcasting deeply personal moments or vacation photos on Instagram feels routine, while in the 1990s, such exposure might have felt invasive. Yet here in 2026, a phone number is guarded like a state secret.

That shift isn’t necessarily wrong or strange. It simply reflects how cultural norms around privacy evolve. Privacy, after all, is a social construct, and its boundaries are constantly redrawn.

(Source: Gizmodo.com)
