Neon App Pays Users to Record Calls & Sells Data to AI

▼ Summary
– The Neon Mobile app pays users to record their phone calls and sells this audio data to AI companies for training machine learning models.
– Despite raising significant privacy and legal concerns, the app has rapidly climbed to become the No. 2 social networking app in Apple’s U.S. App Store.
– The app’s terms grant it a broad, irrevocable license to user recordings, allowing it to use and sell the data in ways that may exceed its marketing claims.
– Legal experts warn the app may use “one-sided transcripts” to circumvent wiretap laws and that the anonymized voice data could still be used for fraud or impersonation.
– The app’s popularity suggests a growing market segment is willing to trade privacy for small financial gain, potentially desensitized by the widespread collection of personal data.

An application that pays people to record their phone calls and then sells that audio to artificial intelligence firms has surprisingly climbed to the number two spot in the U.S. Apple App Store’s Social Networking category. Neon Mobile presents itself as a financial opportunity, promising users they can earn “hundreds or even thousands of dollars per year” simply by granting access to their conversations.
The company pays 30 cents for every minute spent on a call with another Neon user; for calls to anyone else, earnings are capped at $30 per day. The platform also compensates users for referrals. Data from Appfigures reveals a dramatic ascent: the app first appeared at rank 476 in its category on September 18 before climbing to No. 10 and then securing the No. 2 spot among top free social apps on iPhone. At one point on Wednesday, it even broke into the top ten apps overall.
According to its terms of service, the Neon Mobile app is capable of capturing both inbound and outbound phone calls. Its marketing, however, insists that only the user’s side of the conversation is recorded unless the call is between two Neon users. The collected data is explicitly sold to “AI companies” to aid in “developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies.”
The very existence and high ranking of such an application signals a significant shift in how deeply AI technology has penetrated spaces traditionally considered private. Its popularity suggests a segment of the market is now willing to trade personal privacy for modest financial gain, potentially without full consideration of the broader consequences.
Neon’s privacy policy includes an exceptionally broad license for user data. The company grants itself a “worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license” to sell, use, modify, and distribute user recordings through any media channels, now known or developed in the future. This legal language provides Neon with considerable latitude to use the data in ways that may extend beyond its public claims.
The terms also feature a lengthy section on beta features, which come with no warranties and may contain numerous bugs. While the app raises immediate concerns, its operational model might be technically legal. Jennifer Daniels, a partner at Blank Rome’s Privacy, Security & Data Protection Group, explains that recording only one side of a call is a strategy to circumvent wiretap laws, which in many states require consent from all parties involved.
Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, concurred, noting that the phrasing around “one-sided transcripts” could be interpreted as a loophole. It might imply that the entire call is recorded, with the other party’s audio simply removed from the final transcript provided to the user.
Legal experts also expressed skepticism about the true anonymity of the data. Neon states it removes identifiable information like names, emails, and phone numbers before selling data. However, it does not specify how its AI partners might ultimately use the voice data. This voice data could be exploited to create convincing fake calls or to develop AI voices that sound identical to the user. Jackson warns, “Once your voice is over there, it can be used for fraud… they have recordings of your voice, which could be used to create an impersonation of you.”
A significant concern is that Neon does not disclose the identity of its “trusted partners” or the long-term permissions granted to them. Furthermore, as a company holding valuable voice data, Neon is inherently vulnerable to data breaches, putting all user information at risk.
In a basic test conducted by TechCrunch, the Neon app functioned like a standard VoIP application without providing any audible indication that a call was being recorded, and it did not warn the recipient on the other end of the line. Caller ID displayed normally. Neon’s founder, Alex Kiam, who is listed in business filings as operating the company from a New York apartment, did not respond to a request for comment. A LinkedIn post suggested Kiam recently secured funding from Upfront Ventures, but the investor also did not reply to an inquiry.
This situation prompts a difficult question: has the proliferation of AI desensitized people to privacy risks? In the past, companies that profited from covert data collection faced public outrage. Today, with AI assistants regularly joining meetings and always-listening devices on the market, the boundaries of consent are shifting. While those scenarios typically involve informed consent, Neon’s model capitalizes on a growing cynicism: if personal data is going to be sold anyway, why not get paid for it?
The danger is that individuals may be sharing far more than they comprehend, and in the process, they are also jeopardizing the privacy of everyone they communicate with. Jackson observes a powerful drive for convenience, especially among knowledge workers, noting that many productivity tools achieve ease of use “at the expense of, obviously, your privacy, but also, increasingly, the privacy of those with whom you are interacting.”
(Source: TechCrunch)