
Neon App Shut Down After Major Security Breach Exposes User Data

Summary

– The Neon app rapidly gained popularity by offering users payment for recording their phone calls and selling that audio data to AI companies for model training.
– A severe security flaw allowed any logged-in user to access other users’ phone numbers, call recordings, and transcripts, exposing private data.
– TechCrunch discovered the flaw through testing, which revealed the app’s servers did not properly restrict access to user data.
– After being alerted, the founder took the app offline and notified users of a shutdown for security improvements, but did not disclose the data breach.
– It remains unclear if the app underwent a security review before launch or if any user data was accessed maliciously prior to the fix.

The rapid ascent of the Neon app, which promised users payment for recording their phone calls to sell data to AI firms, has come to an abrupt halt following a major security breach. This incident exposed the private call recordings, transcripts, and phone numbers of its entire user base, leading to the app being taken offline. In just one week, Neon had skyrocketed to become one of the top-five free iPhone applications, amassing thousands of users and tens of thousands of downloads daily.

A critical vulnerability was discovered that allowed any individual logged into the app to access the sensitive data of every other user. The app’s backend servers failed to implement proper access controls, meaning personal information was not securely partitioned. During a technical review, it was found that the system could be manipulated to reveal not just a user’s own call history and earnings, but also the call metadata, audio file links, and text transcripts belonging to anyone else on the platform. This metadata included the phone numbers of both parties involved in a call, the call’s timing, its length, and the amount of money it generated.
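Neon's backend code is not public, but the class of flaw described here is commonly known as broken object-level authorization (or an insecure direct object reference): the server confirms that a user is logged in but never checks whether the requested record belongs to that user. The sketch below is a minimal, hypothetical illustration of that pattern and its fix; the Flask framework, the /calls/&lt;id&gt; route, the header-based "login," and the in-memory data are assumptions for illustration, not Neon's actual API.

```python
# Hypothetical sketch of a broken object-level authorization flaw and its fix.
# Not Neon's actual code: framework, routes, and data are illustrative only.
from flask import Flask, jsonify, abort, request

app = Flask(__name__)

# Toy data store standing in for the backend's call records.
CALLS = {
    1: {"owner": "alice", "peer_number": "+15550001111", "transcript": "..."},
    2: {"owner": "bob",   "peer_number": "+15550002222", "transcript": "..."},
}

def current_user():
    # Stand-in for real session/token validation; assumes the authenticated
    # username arrives in a header once the user has logged in.
    return request.headers.get("X-User")

# Vulnerable pattern: any logged-in user can fetch any record by ID,
# because the server never checks who owns it.
@app.get("/calls/<int:call_id>")
def get_call_vulnerable(call_id):
    call = CALLS.get(call_id)
    if call is None or current_user() is None:
        abort(404)
    return jsonify(call)  # leaks other users' numbers and transcripts

# Safer pattern: every lookup is scoped to the authenticated user, so a
# request for someone else's record is rejected.
@app.get("/v2/calls/<int:call_id>")
def get_call_checked(call_id):
    call = CALLS.get(call_id)
    if call is None or call["owner"] != current_user():
        abort(404)  # avoid confirming that the record even exists
    return jsonify(call)

if __name__ == "__main__":
    app.run()
```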

The security flaw meant that private conversations were accessible to anyone with a user account. Some of the exposed recordings indicated that users were engaging in lengthy calls, potentially recording conversations with unsuspecting individuals without their knowledge, purely to earn money through the app. Shortly after being notified of the vulnerability, the company’s founder, Alex Kiam, shut down the app’s servers. An email was sent to users announcing a temporary suspension to “add extra layers of security,” but this communication notably omitted any mention of the data exposure or the security lapse that caused it.

It remains uncertain when or if Neon will return to operation and whether app store operators like Apple and Google will take action. This situation echoes other recent incidents where apps with significant security shortcomings have been available for download. For instance, the dating app Tea recently suffered a breach exposing user identity documents, while major platforms like Bumble and Hinge were found to have leaked user location data earlier this year. When questioned, Kiam did not provide details on whether a pre-launch security review was conducted or if the company possesses logs to determine if others exploited the flaw before its discovery. Investors named by Kiam also did not respond to requests for comment.

An exposed transcript from one of TechCrunch’s own test calls demonstrated how the recording feature functioned. The underlying issue was that the servers did not verify that a requester was authorized to view the data being requested, freely serving any user’s records to any logged-in account. This incident serves as a stark reminder of the privacy risks associated with applications that monetize personal communications, especially those experiencing explosive growth without a corresponding investment in robust security infrastructure.
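The kind of testing TechCrunch describes can be sketched in a few lines: authenticate as your own account, request a record ID you do not own, and check whether the server returns data. The snippet below probes the hypothetical server from the earlier sketch; the base URL, header, and record IDs are assumptions for illustration, not Neon's real endpoints.

```python
# Minimal sketch of an authorization check against the toy server above.
# Endpoint, header, and IDs are illustrative assumptions, not Neon's real API.
import requests

BASE = "http://localhost:5000"
MY_SESSION = {"X-User": "alice"}  # credentials for our *own* test account

# Record 2 belongs to another user; a correctly scoped backend should refuse this.
resp = requests.get(f"{BASE}/calls/2", headers=MY_SESSION, timeout=5)
if resp.ok:
    print("VULNERABLE: another user's record was returned:", resp.json())
else:
    print("OK: request for someone else's record was rejected:", resp.status_code)
```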

(Source: TechCrunch)

Topics

app security, data privacy, server vulnerability, call recording, AI training, data exposure, user monetization, app shutdown, app popularity, founder response