Sears AI Chatbot Data Exposed in Major Privacy Breach

Summary

– Sears Home Services, a major appliance repair provider, used an AI chatbot named Samantha, but its customer chat logs and audio files were recently found publicly exposed online.
– The exposed databases contained 3.7 million chat logs and 1.4 million audio files from 2024 onward, revealing personal details like names, addresses, and appliance information.
– Security researcher Jeremiah Fowler discovered the unprotected data and alerted Transformco, Sears’ parent company, which then secured the databases, though the exposure duration is unknown.
– The data breach is particularly concerning for phishing and scams, as it includes specific details about customers’ homes and lives that could be exploited.
– Alarmingly, many audio recordings captured hours of ambient sound after calls seemingly ended, potentially recording private conversations without customers’ knowledge.

While the familiar sight of Sears department stores has faded from the American landscape, the company’s appliance repair service remains a major player, embracing modern technology with an AI chatbot named Samantha. New research, however, has revealed a significant privacy breach in which millions of conversations between customers and this AI assistant were left publicly exposed on the internet. The incident underscores the critical data security challenges that can accompany the adoption of new technologies.

Security researcher Jeremiah Fowler discovered three unsecured databases last month containing a vast collection of sensitive customer information from Sears Home Services. The division, which handles millions of appliance repairs annually, had its data openly accessible. The exposed information included an enormous cache of 3.7 million chat logs and 1.4 million associated audio files with text transcripts, all dated from 2024 onward.

The data was remarkably detailed. Chat logs showed the AI introducing itself as “Samantha, an AI virtual voice agent for Sears Home Services,” and even referenced the underlying “kAIros” technology. More alarmingly, these logs and transcripts contained extensive personal details about customers. This included full names, phone numbers, home addresses, specific appliance models owned, and precise details about delivery and repair appointments. The conversations were recorded in both English and Spanish.

Fowler, a researcher with Black Hills Information Security, stresses that this was not anonymized data but real information belonging to real people. He points out that while companies may seek efficiency through AI, they must not cut corners on security. “At the bare minimum, these files should have been password protected and encrypted,” he emphasized.

Upon discovering the open databases in early February, Fowler notified Transformco, the parent company of Sears. The databases were subsequently secured. The duration of the exposure and whether any other parties accessed the data remain unknown. Transformco did not respond to multiple requests for comment regarding the breach.

The exposed data presents serious risks, particularly for sophisticated phishing attacks. With intimate knowledge of a customer’s appliances, contact details, and service history, malicious actors could craft highly convincing warranty scams or other targeted schemes. The personal nature of the information makes it a potent tool for social engineering.

A second, deeply concerning discovery involved the audio recordings. Fowler found that a surprising number of captured calls contained hours of ambient audio recorded after the customer’s conversation with the AI agent had ostensibly ended. Some recordings stretched to four hours in length. It is unclear why customers left their lines open, but these extended sessions potentially captured private household conversations, television audio, and other sensitive background noise the customers never intended to share.

“You could hear the TV playing, you could hear people having conversations, and this recorded all of it,” Fowler noted. This aspect of the breach highlights an additional layer of privacy intrusion, where the collection of data may have far exceeded the reasonable expectations of the individuals involved.

(Source: Wired)