Proton CEO: Privacy is possible in AI era, but one thing worries him

Summary
– AI and Big Tech are eroding personal privacy, making Proton’s encrypted tools increasingly appealing.
– Proton CEO Andy Yen sees a generational gap in privacy awareness: younger users understand the risks but seem indifferent, while middle-aged users adopt technology with less vigilance and are more exposed.
– AI agents pose a major threat because even strong encryption cannot protect data if a user grants a rogue agent access to their accounts.
– Yen is focused on protecting children: Proton now lets parents reserve a child’s email address before birth, aiming to keep the next generation out of Big Tech ecosystems from the start.
– Proton’s encrypted services cost more than Big Tech alternatives due to added encryption, but Yen believes they offer a better, more scalable product in the long run.
AI is reshaping the digital world at breakneck speed, and with it comes an equally rapid erosion of personal privacy. Proton CEO Andy Yen sees a future where encryption alone won’t be enough to protect users, and he’s particularly alarmed by one looming threat: rogue AI agents.
During a recent conversation at the Semafor World Economy Summit in Washington, D.C., Yen sat down with me to discuss whether privacy can survive in the age of artificial intelligence. His answer was cautiously optimistic, but not without serious reservations. While Proton’s encrypted tools are gaining traction as a counterweight to Big Tech’s data-hungry ecosystems, Yen believes the real challenge lies in how people interact with AI on their own devices.
Privacy awareness is growing, but unevenly. Yen observes a generational divide in how people understand and respond to data risks. Younger users, he notes, are often fully aware of how companies like Google and Facebook monetize their information, yet they appear indifferent. Meanwhile, middle-aged adults adopt technology without the same vigilance, making them more vulnerable. “The best way to protect somebody is to simply teach them about the risk,” Yen says. He’s hopeful that education will eventually close the gap, pointing to a slow but steady rise in public understanding since Proton launched in 2014.
The biggest threat, in Yen’s view, isn’t encryption being broken. It’s users freely granting AI agents access to their accounts, only to have those agents go rogue. “You could have the strongest encryption in the world,” he warns, “but if you as a user freely give your agent access to Proton Mail on your device, and that agent goes crazy and posts all the information online somewhere, encryption in Proton isn’t going to save you.” This vulnerability is an inherent limitation for any privacy-focused service, and Yen admits Proton isn’t yet building its own agent to counter it.
Local AI offers a promising path forward. Yen sees running AI models directly on personal devices as a way to sidestep many privacy pitfalls. While scaling compute power on phones and laptops remains difficult today, he expects that to change within a few years. “If you look at the modern iPhone and compare it with the first smartphones from 10 years ago, the amount of compute, of storage, is orders of magnitude higher,” he notes. Smaller, equally effective language models will make local AI more viable, reducing reliance on cloud-based data collection.
Protecting children is a top priority for Proton. Yen believes the most impactful intervention is to keep the next generation out of Big Tech’s ecosystem from the start. Last month, Proton introduced the option for parents to reserve a child’s first email address before they are even born. “For a lot of people, the moment they start caring is when they have children,” he says. “You have a choice: are you going to sign them up to the Google ecosystem, with all the downsides and pitfalls that that entails, and lock them in to a lifetime of being a commodity that is abused by big tech? Or are you going to take an alternative path?”
Can privacy-first AI compete on cost and performance? Yen acknowledges that building encrypted services is inherently more expensive and time-consuming than standard offerings. “There’s Google Workspace and Proton Workspace, and they look kind of equivalent,” he says. “But actually, our job is 10 times harder, because we have encryption on top of all that. So it’s going to cost more, it’s also going to take longer.” Despite these challenges, Proton’s pricing remains competitive, and the company has maintained profitability without venture capital backing. “The fact that we have no VC investors sort of shows that, actually, this model probably is more scalable than most people think,” Yen adds.
The future of privacy in an AI-driven world hinges on awareness, education, and early intervention. Yen is confident that as more people experience the downsides of unchecked data collection, demand for encrypted alternatives will grow. Proton’s fastest-growing product today is its encrypted chatbot, Lumo, which Yen sees as evidence that users want AI benefits without sacrificing privacy. “As time goes on, more people are going to want that,” he says. The question remains whether the industry can deliver it before rogue agents and mass surveillance become the new normal.
(Source: ZDNet)
