TechloMedia

Researchers Flag Chrome for Secretly Installing On-Device AI Model


A new report has raised serious concerns about how Google Chrome is rolling out on-device AI features. According to findings shared by Alexander Hanff, Chrome’s latest update, version 147, has started silently downloading a large AI model file called weights.bin to power Gemini Nano, without clearly informing users or asking for consent.

The file is around 4GB and is installed automatically on supported Windows and macOS systems. What makes this more concerning is that users see no notification, no consent prompt, and no clear setting to control it.

Hanff found that Chrome creates a folder named OptGuideOnDeviceModel inside the user profile directory and downloads the model in the background. In his testing, the entire download completed in about 14 minutes, with no visible indication to the user.
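For readers who want to see whether the model is already on their machine, a minimal sketch is below. The path assumes a default Google Chrome profile on macOS; the Windows location differs, and the exact directory layout is not officially documented.

```shell
#!/bin/sh
# Sketch: check for Chrome's on-device model directory.
# The path below is an assumption based on a default macOS Chrome profile;
# adjust it for your own OS and profile location.
MODEL_DIR="$HOME/Library/Application Support/Google/Chrome/OptGuideOnDeviceModel"

if [ -d "$MODEL_DIR" ]; then
  # Report how much disk space the model files occupy.
  du -sh "$MODEL_DIR"
else
  echo "OptGuideOnDeviceModel not found"
fi
```

Note that finding and deleting the folder is not a lasting fix, since, as described below, Chrome simply re-downloads it.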

Even if users manually delete the file, Chrome downloads it again. The only way to stop this behaviour right now is through advanced flags or enterprise-level policies, which most regular users will not even know exist.
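For managed deployments, Google's Chrome enterprise policy list includes a policy named GenAILocalFoundationalModelSettings, where a value of 1 tells Chrome not to download the foundational model. As a sketch, on Linux this would go in a JSON file under /etc/opt/chrome/policies/managed/ (the filename is arbitrary); Windows and macOS use the registry and managed preferences instead.

```json
{
  "GenAILocalFoundationalModelSettings": 1
}
```

This is an administrator-level control, which underscores Hanff's point: ordinary users have no comparable switch in Chrome's settings UI.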

This approach raises questions. A 4GB download is not small, especially for users on limited or metered data plans, and many may not even realise why their storage and bandwidth are suddenly being consumed.

Interestingly, this local AI model is not used for Chrome’s main AI features like the new AI Mode in the address bar. That feature still relies on cloud processing. Instead, Gemini Nano powers smaller on-device features like writing assistance. This means Chrome is pre-loading AI capabilities that many users may never use.

Hanff points out that this is not an isolated case. He compares it to how Anthropic handled its Claude Desktop rollout, where similar patterns were observed.

He highlights several design choices that raise concerns. These include silent installation, no opt-in, difficulty in removal, and automatic reinstallation after deletion. The naming of the folder and files also does not clearly indicate what is being installed.

This behaviour may also raise regulatory issues, especially in regions with strict privacy laws.

Hanff argues that it could conflict with the GDPR and the ePrivacy Directive, which require companies to inform users and obtain consent before storing data on, or making changes to, their devices.

If regulators take a closer look, this could become another case where big tech faces scrutiny over transparency and user control.

This move shows where browsers are heading. Companies want AI features to be ready on-device before users even ask for them. It reduces latency and improves performance, but the way it is being done raises questions. Right now, this feels like a push-first, explain-later approach.

Google may argue that on-device AI improves privacy since data does not leave the device. That is valid to some extent. But silently installing a large model without user awareness creates a trust issue.

A simple opt-in during setup, a clear setting to disable it, or even a visible download notification could have avoided most of this backlash.
