Google Chrome Quietly Installed an AI Model on Your Device

Summary
– Google Chrome has been installing its Gemini Nano AI model onto devices without notifying or asking users, as reported by computer scientist Alexander Hanff.
– Gemini Nano performs tasks like scam call detection and text summarization on-device, distinct from Google’s cloud-based AI Mode.
– The model automatically installs only on devices meeting hardware requirements, but will uninstall if resources become insufficient.
– Users can remove Gemini Nano by disabling “Enables optimization guide on device” in Chrome’s flags settings or by uninstalling Chrome entirely.
– Hanff suggests the installation may violate EU data protection laws and could be a cost-saving move by Google to run AI on user hardware rather than its servers.
You may not have signed up for an AI model living on your computer, but there’s a good chance one was installed anyway. Without asking or even notifying users, Google Chrome has been quietly placing a 4GB AI model on certain devices.
According to Alexander Hanff, a Swedish computer scientist and lawyer known as That Privacy Guy, Google has been installing Gemini Nano, an AI model designed to run locally on smartphones and laptops rather than in the cloud, onto some Chrome browsers without obtaining permission. Even after the installation is complete, Google does not inform users that the model is present.
Hanff noted that Gemini Nano only installs on devices meeting specific hardware requirements. It remains unclear how many users have received the unsolicited download.
This on-device AI handles tasks like detecting scam calls, assisting with text message composition, summarizing recordings, and analyzing screenshots on Pixel phones. It should not be confused with the AI Mode feature in the Chrome address bar, which sends queries to Google Gemini servers rather than relying on the local model.
A Google spokesperson told CNET that Gemini Nano will automatically remove itself if the device lacks sufficient resources, including processing power, RAM, storage, or network bandwidth.
“In February, we began rolling out the ability for users to easily turn off and remove the model directly in Chrome settings,” the spokesperson said. “Once disabled, the model will no longer download or update.”
Google also provides additional details about on-device generative AI models in Chrome on a dedicated support page.
If you are using Chrome, you may already have Gemini Nano. To check, open your file manager (File Explorer on Windows, Files on Chromebooks, or Finder on Macs) and search for a folder named OptGuideOnDeviceModel. Inside that folder, you will find a file called weights.bin, which is where the model resides.
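For those who prefer a terminal, the same check can be sketched with a short shell script. The profile directories below are assumptions based on Chrome's usual data locations and vary by OS and Chrome channel; this is a sketch, not an official Google-supported procedure.

```shell
#!/bin/sh
# Sketch: look for the OptGuideOnDeviceModel folder in common Chrome
# data locations. Paths are assumptions; adjust for your setup.
for dir in \
  "$HOME/Library/Application Support/Google/Chrome" \
  "$HOME/.config/google-chrome" \
  "$LOCALAPPDATA/Google/Chrome/User Data"
do
  if [ -d "$dir" ]; then
    # Print the model folder and its on-disk size if present.
    find "$dir" -maxdepth 3 -type d -name OptGuideOnDeviceModel \
      -exec du -sh {} \; 2>/dev/null
  fi
done
```

If the script prints nothing, the folder is not present in any of those locations (or Chrome stores its data elsewhere on your system).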
Hanff emphasized that Chrome users will not know about Gemini Nano unless they actively search for it, because “Chrome did not ask” and “Chrome does not surface it.”
There are two ways to remove Gemini Nano. You can type chrome://flags into your browser’s address bar, locate the setting labeled “Enables optimization guide on device,” and turn it off. Alternatively, you can uninstall Chrome entirely.
Why does this matter? Hanff believes the move may be designed to help Google cut costs by shifting AI processing from its own servers to users’ personal computers.
“Running inference on users’ own hardware allows them to push ‘AI features’ without the compute costs,” Hanff told CNET.
Beyond cost savings, Hanff warned of potential legal consequences, particularly in Europe. He suggested that installing Gemini Nano without consent could violate the European Union’s General Data Protection Regulation (GDPR), specifically its principles of lawfulness, fairness, and transparency. He also argued that, given the environmental implications, Google should have disclosed the installation under the Corporate Sustainability Reporting Directive.
“Google has given us every reason not to trust them with a history spanning two decades of global privacy violations at massive scale,” Hanff said. “So, I suspect they figured asking permission (what the law requires) would hinder their ability to push this model and, of course, whatever comes after it.”
(Source: CNET)

