Does it violate EU law? Google Chrome secretly installs a 4GB AI model on users’ machines, and reinstalls it even after deletion

Research indicates that Google Chrome covertly downloads a roughly 4GB AI model to users’ machines and re-downloads it even after users delete it. The move may violate EU privacy law, shifts massive bandwidth and environmental costs onto the public, and has been slammed as a “dark pattern” that strips users of control.

Cybersecurity researcher finds that Google Chrome secretly downloads AI models

Renowned cybersecurity researcher Alexander Hanff’s latest report says that the Google Chrome browser, without prior notice and without obtaining consent, silently downloads a roughly 4GB on-device AI model onto users’ computers.

To verify this, Hanff ran comparative tests on macOS with a fresh Chrome profile. Using file system event logs captured outside the application itself, he recorded a precise trail of the file activity.
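The before-and-after comparison he describes can be approximated with a short script. The polling approach, paths, and size threshold below are illustrative assumptions of mine, not Hanff’s actual tooling; his report relied on event-level logs (on macOS, tools such as `fs_usage` provide these with timestamps):

```python
import os

def find_large_files(root, min_bytes=1_000_000_000):
    """Walk `root` and return (path, size) for files at or above `min_bytes`.

    A crude stand-in for event-level file system tracing: scanning a
    browser profile directory before and after an idle browsing session
    reveals any multi-gigabyte artifacts that appeared in between.
    """
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if size >= min_bytes:
                hits.append((path, size))
    return hits
```

Run against Chrome’s profile directory, a scan like this would surface a newly written multi-gigabyte file such as weights.bin, though without the timing detail that real event logs provide.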

Automatic installation with no interaction; it still forces a reinstall after deletion

Hanff’s analysis shows that, in a zero-interaction background process, Chrome creates a model directory on its own and downloads the full 4GB of data, writing a file named weights.bin to disk. The file is part of Google’s on-device AI system built on the lightweight Gemini Nano model.

The analysis indicates that as long as the computer meets specific hardware requirements, the download starts automatically. The entire process completed in just over 14 minutes of seemingly idle browsing time.
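Those two figures imply a sustained transfer rate that is easy to sanity-check. The 4GB size and the 14-minute window come from the report; treating the gigabytes as decimal is my assumption:

```python
# Back-of-the-envelope transfer rate implied by the report: 4 GB in ~14 minutes.
size_bytes = 4 * 10**9          # assuming decimal gigabytes
duration_s = 14 * 60            # "just over 14 minutes"
mbps = size_bytes * 8 / duration_s / 10**6
print(f"~{mbps:.0f} Mbit/s sustained")   # ≈ 38 Mbit/s
```

A sustained rate of roughly 38 Mbit/s is unremarkable on fiber, which is why the download can finish unnoticed during an ordinary browsing session.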

Image source: Alexander Hanff report. Chrome secretly downloads about 4GB of on-device AI model data onto users’ computers.

However, Chrome never shows a prompt explaining that several gigabytes of AI model data will be stored locally, nor does it offer an intuitive setting to prevent the download. Even if users discover and delete the file themselves, the browser re-downloads it later, unless they dig into experimental feature flags to disable it or remove Chrome entirely.

He also points to internal Chrome state files as strong supporting evidence. These files show that, before downloading, the browser had already assessed the system’s hardware and marked the device as eligible for the on-device model. In other words, Chrome unilaterally decides which devices receive the model.

Researcher accuses Google Chrome of potentially violating EU law

In addition to disclosing technical details, Hanff also raises legal concerns.

He previously criticized Anthropic’s Claude desktop app as “spyware,” noting that it quietly installs browser-integration components across multiple Chromium-based browsers, including five browsers he had never even installed. Now he has found that Chrome secretly installs AI model files too. In both cases this happens without any user prompt or substantive disclosure, and the software reinstalls the components after removal.

He argues that the actions of these two companies very likely violate EU regulations, including the EU ePrivacy Directive’s rules on storing data on users’ devices, as well as the General Data Protection Regulation’s requirements for transparency and lawful processing.

Although the researcher’s claims have not yet been ruled on by a court, they already reflect intensifying tension between tech giants pushing new features and the expectations of regulators, especially in Europe.

  • **Related report:** Claude desktop app questioned as “spyware”! Access settings changed without consent, suspected violation of EU law

Google shifts energy use and bandwidth costs onto global users?

Hanff also estimated the environmental cost of Chrome’s covert 4GB downloads. Deployed across millions or even billions of devices, he calculates, simply distributing the files could emit tens of thousands of tonnes of CO₂ equivalent, nearly the annual total emissions of tens of thousands of cars.
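The order of magnitude is straightforward to reproduce. The per-gigabyte emissions intensity and the device count below are illustrative assumptions of mine, not Hanff’s exact inputs, and published per-GB network figures vary widely:

```python
# Rough reconstruction of the scale of the emissions estimate.
gb_per_device = 4            # from the report
kg_co2e_per_gb = 0.05        # assumed network-transfer intensity; real figures vary widely
devices = 100_000_000        # assumed deployment base

total_tonnes = gb_per_device * kg_co2e_per_gb * devices / 1000
print(f"~{total_tonnes:,.0f} tonnes CO2e")   # ≈ 20,000 tonnes for these inputs
```

Even with conservative inputs, the total lands in the tens of thousands of tonnes once the deployment base reaches hundreds of millions of devices, which is the scale Chrome operates at.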

Image source: Alexander Hanff report. Hanff’s estimate of the environmental impact of Chrome’s covert downloads.

Although the estimate depends on assumptions about scale and energy mix, he clearly states that pushing large binary files to users’ devices comes at extremely high costs, and those costs are externalized onto the environment and the public.

For many users, the hidden download also eats into their data allowance. On an unlimited fiber connection, 4GB may seem trivial, but for users on limited or metered data plans, covertly transferring several gigabytes causes real financial loss. Even in developed markets, users who rely on mobile hotspots or live in remote areas are affected.

Tech giants move first and sacrifice user rights—dark patterns as the cost

From Hanff’s perspective, both Anthropic and Google chose to act first and leave users to bear the consequences.

Whether it’s covertly registering deep system integrations or downloading several gigabytes of models in the background, the pattern is the same. Users’ devices are treated as deployment targets, stripping away active control—highly consistent with the long-criticized “dark patterns” in software design.

Dark patterns—also called “deceptive design”—are carefully crafted user interfaces intended to mislead or trick users into doing things they would not otherwise choose, benefiting the vendor at the expense of users’ rights.

In the cases Hanff alleges, these features are not only enabled by default; they are hidden behind obscure settings or implemented in ways that are hard to remove. His research suggests that the shift toward on-device AI has not fixed these dark-pattern flaws; if anything, it is accelerating them.

Further reading:
  • Is China’s drone giant exposing users’ cybersecurity “naked”?
  • He reverse-engineered Claude to gain worldwide device control—are we still buying AI toys?
  • Bondu leaked 50,000 children’s personal data, while Miiloo instilled this message: Taiwan is part of China.
