GenAI browser tools found collecting private data during web use, new study warns

Researchers at UC Davis, UCL, and Mediterranea University of Reggio Calabria warn that AI browser extensions are transmitting personal details, including health information and social security numbers.

A new study by researchers at the University of California, Davis (UC Davis), University College London (UCL), and Mediterranea University of Reggio Calabria finds that popular generative AI (GenAI) browser assistants pose significant risks to user privacy.

The work, presented at the 2025 USENIX Security Symposium, highlights cases where personal data, including social security numbers and health records, was transmitted to external servers without adequate safeguards.

GenAI browser assistants are installed as extensions and marketed as tools to enhance online activity by summarizing webpages, translating text, and offering personalized search support. The researchers tested ten widely used assistants, including Merlin, Sider, Perplexity, Monica, ChatGPT for Google, TinaMind, MaxAI, Copilot, and HARPA.AI. While designed to provide convenience, the study reveals that several extensions capture far more information than users expect, with Merlin observed collecting a social security number entered on an IRS form.

Testing everyday browsing

To understand how these assistants handle data, the researchers simulated real browsing activity using both public and private sites. Scenarios included shopping, reading news, logging into health portals, and accessing university records. They also created a specific persona, a wealthy millennial male from Southern California with equestrian interests, to test whether assistants retained and acted upon leaked attributes.

Yash Vekaria, a Ph.D. candidate in computer science at UC Davis and lead author of the study, explains: “Users should understand that when they are using assistants in a private space, their information is being collected.”

The results showed notable differences across tools. Perplexity relied on server-side requests that prevented collection of private data, making it the only assistant in the study not to profile users. Others, including Monica, Sider, ChatGPT for Google, and Copilot, demonstrated the ability to remember browsing patterns and generate personalized outputs that reflected inferred details such as wealth or lifestyle.

Dr. Anna Maria Mandalari, senior author at UCL, comments: “Though many people are aware that search engines and social media platforms collect information about them for targeted advertising, these AI browser assistants operate with unprecedented access to users’ online behavior in areas of their online life that should remain private.”

Third-party sharing and legal implications

The study also tracks how collected data is shared. Merlin and TinaMind transmitted queries to Google Analytics, opening the door to cross-site tracking. HARPA.AI and MaxAI sent information to Mixpanel, an analytics service whose session-replay feature can reproduce on-screen activity such as cursor movement.

These practices risk breaching privacy protections in the United States, including the Health Insurance Portability and Accountability Act (HIPAA) and the Family Educational Rights and Privacy Act (FERPA). Although the research did not test compliance with UK or EU frameworks, the authors suggest that similar activity would likely fall foul of GDPR.

The researchers recommend improvements ranging from labeling in extension stores to on-device processing that reduces the need for data transfers. They also call on regulators, including the Federal Trade Commission, to enforce stronger privacy requirements.
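The on-device processing the researchers recommend could, in principle, mean scrubbing sensitive identifiers from page text before anything leaves the browser. A minimal, hypothetical sketch (illustrative only, not code from the study or any of the extensions tested) of local redaction of US social security numbers:

```python
import re

# Hypothetical example: pattern for US social security numbers
# in the common NNN-NN-NNNN format.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_locally(page_text: str) -> str:
    """Replace anything that looks like an SSN before the text
    is transmitted to a remote AI service."""
    return SSN_PATTERN.sub("[REDACTED-SSN]", page_text)

print(redact_locally("Taxpayer SSN: 123-45-6789, filing status: single"))
```

Redacting on the device, rather than filtering on the server after transmission, means the sensitive value never crosses the network at all, which is the property the authors argue current assistants lack.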

Dr. Aurelio Canino, a co-author from UCL and Mediterranea University, says: “As generative AI becomes more embedded in our digital lives, we must ensure that privacy is not sacrificed for convenience. Our work lays the foundation for future regulation and transparency in this rapidly evolving space.”
