Chrome Caught Silently Installing 4GB Gemini Nano on Users' Devices
A privacy researcher revealed that Chrome has been quietly downloading a 4GB Gemini Nano model to billions of devices without explicit consent, reigniting debate over default-on AI features at browser scale.
Google Chrome has been silently downloading a 4GB Gemini Nano model to user devices without explicit consent, a practice highlighted this week by a privacy researcher writing under the handle "The Privacy Guy" and quickly amplified by 9to5Google, TechSpot, and Malwarebytes. The model is delivered as a weights.bin file inside a folder named OptGuideOnDeviceModel in Chrome's user data directory, where it powers on-device features such as scam detection, the "Help me write" composition tool, and a Summarizer API exposed to web developers.
The download is gated by hardware requirements but is not surfaced in any first-run prompt. Chrome decides whether the user's machine has enough storage and the right CPU/GPU profile, then fetches and updates the model in the background. Manually deleting the weights.bin file does nothing — Chrome simply re-downloads it at the next opportunity unless the user navigates to Settings > System and toggles off the on-device AI option, an opt-out Google quietly rolled out across platforms in February 2026.
Google's response to the wave of coverage was muted. The company said the practice is not new (Gemini Nano has shipped with Chrome since 2024), emphasized that the model is "automatically deleted if the device's free disk space drops below a certain threshold," and pointed to the February opt-out toggle as evidence that user control exists. What Google did not address is the consent question: whether shipping a 4GB ML model to roughly three billion Chrome users without an install-time prompt constitutes a meaningful disclosure.
The episode also reignited debate about the climate footprint of browser-default AI. Critics pointed out that pushing a 4GB blob to a billion-device fleet — one full copy downloaded and stored per device, with no deduplication — sums to multiple exabytes of network transfer and storage commitment, resources that, unlike cloud-side model serving, the operator never directly accounts for. Defenders countered that on-device inference is the more privacy-preserving and energy-efficient option per query, since features like scam detection avoid round-trips to Google's servers entirely.
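The critics' arithmetic is easy to reproduce. A minimal sketch, using the article's two round figures (4 GB per device, one billion devices) and decimal units throughout:

```typescript
// Back-of-envelope: fleet-wide cost of one full model push,
// assuming one complete copy per device and no deduplication.
// Both inputs are the article's round figures, not measured values.
const bytesPerDevice = 4e9; // 4 GB in decimal bytes
const fleetSize = 1e9;      // one billion devices

const totalBytes = bytesPerDevice * fleetSize; // 4e18 bytes
const exabytes = totalBytes / 1e18;            // 1 EB = 1e18 bytes

console.log(`${exabytes} EB of transfer per model push`); // prints "4 EB of transfer per model push"
```

At the roughly three billion users cited earlier in the piece, the same sum lands closer to 12 EB — either way, "multiple exabytes" holds, and every model update repeats the bill.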
For developers, the story is a reminder that on-device AI APIs in Chrome — including the Prompt API and Summarizer API behind origin trials — rely on a model that may or may not be present on a given user's device, and that the toggle to disable it now exists as a real product surface. For users who want their disk space back, the path is: open Chrome settings, search "on-device AI," and switch the feature off. For Google, the takeaway is harder to dismiss: features described as "automatic" by the vendor are now being described as "silent" by the press, and the gap between those framings is widening.
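For developers building on these origin trials, the practical consequence is that availability must be checked at runtime rather than assumed. A hedged sketch of that check — the `Summarizer` global, its static `availability()` method, and the returned state strings reflect the origin-trial shape of the API and should be verified against current Chrome documentation:

```typescript
// Hedged sketch: feature-detect Chrome's built-in Summarizer API.
// "Summarizer" as a global with a static availability() method is an
// assumption based on the origin-trial API shape; it may change.
async function summarizerStatus(): Promise<string> {
  const g = globalThis as Record<string, any>;
  if (!("Summarizer" in g)) {
    // Browser (or non-browser runtime) does not expose the API at all.
    return "unsupported";
  }
  // Expected states include e.g. "available", "downloadable",
  // "downloading", "unavailable" — the model may not be on disk yet,
  // or the user may have switched the on-device AI toggle off.
  return await g.Summarizer.availability();
}
```

A page that calls the API without this guard will break both on hardware that never qualified for the download and on machines where the user has flipped the Settings > System opt-out.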