Google markets its Chrome browser by citing its superior safety features, but according to privacy consultant Alexander Hanff, Chrome does not protect against browser fingerprinting – a method of tracking people online by capturing technical details about their browser.
"There are at least thirty distinct fingerprinting techniques that work in Chrome right now, today, as you read this," wrote Hanff, an occasional contributor to The Register, in a recently published critique of Google's browser.
"Not theoretical attacks from academic papers that might work under laboratory conditions – real, production techniques deployed on millions of websites to identify and track you without your knowledge or consent."
Visitors to websites can leave behind a browser fingerprint that records details such as the operating system they are running, their screen resolution, and their installed fonts. Some of that information is carried in ordinary traffic from the browser to the web server; the rest can be gathered by the server or by third parties through page scripts and tracking elements. Combined, these attributes can form a unique identifier.
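The mechanics can be sketched in a few lines: each attribute on its own is common, but hashed together they yield a stable identifier that survives cookie clearing. The attribute names and values below are invented for illustration, not taken from any real tracker.

```python
import hashlib
import json

def fingerprint_hash(attributes: dict) -> str:
    """Derive a stable identifier by hashing a set of browser attributes."""
    # Serialize deterministically so identical attributes always
    # produce the same identifier.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attribute sets for two visitors; one differing
# font list is enough to separate them.
visitor_a = {
    "os": "Windows 11",
    "screen": "2560x1440",
    "fonts": ["Arial", "Calibri", "Segoe UI"],
    "timezone": "Europe/London",
}
visitor_b = dict(visitor_a, fonts=["Arial", "Calibri", "Comic Sans MS"])

print(fingerprint_hash(visitor_a) == fingerprint_hash(visitor_a))  # True
print(fingerprint_hash(visitor_a) == fingerprint_hash(visitor_b))  # False
```

Real fingerprinting scripts fold in dozens of such signals, which is why two devices rarely collide.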
About a decade ago, Apple, Mozilla, and other privacy-oriented browser makers began implementing more effective defenses against cookie-based tracking. That led advertisers toward browser fingerprinting, which is more difficult to block than cookie-based tracking. The technique is also used in fraud detection.
A 2021 research paper, "Fingerprinting the Fingerprinters: Learning to Detect Browser Fingerprinting Behaviors," found browser fingerprinting "on more than 10 percent of the top-100K websites and over a quarter of the top-10K websites."
While browser fingerprinting may have valid uses, it poses a significant privacy risk. And fingerprinting of this sort need not include much technical information at all. A study published in Nature last October found that just knowing the four websites an individual visits the most – a behavioral fingerprint as opposed to a browser fingerprint – is enough to identify 95 percent of people.
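The arithmetic behind that result is easy to check: even drawn from a modest pool of candidate sites, four-site combinations vastly outnumber people on Earth. The pool size below is an assumed round number for illustration, not a figure from the Nature study.

```python
import math

# Assume a person's four most-visited sites come from a pool of
# 10,000 popular domains (an illustrative round number).
POOL = 10_000
combos = math.comb(POOL, 4)  # distinct four-site sets

print(f"{combos:,}")           # roughly 4.2e14 combinations
print(combos > 8_100_000_000)  # True: far more sets than people alive
```

With that much headroom, a handful of coarse behavioral signals is enough to single most people out.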
In 2019, Google announced its Privacy Sandbox initiative "to develop a set of open standards to fundamentally enhance privacy on the web," including by smudging browser fingerprints.
The company blamed the rise of fingerprinting on efforts to block third-party cookies and proposed its own privacy-preserving technology as an alternative to Apple's Intelligent Tracking Prevention and similar anti-cookie defenses.
"First, large scale blocking of cookies undermines people's privacy by encouraging opaque techniques such as fingerprinting," the company wrote at the time. "With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong."
But after six years of stumbles, industry suspicion, and lobbying, Google gave up on its Privacy Sandbox. That came just months after the company softened its position in December 2024 from "digital fingerprinting is wrong" to "digital fingerprinting is okay if it's disclosed."
Such concerns may seem quaint at a time when people allow AI agents to rifle through their files and share sensitive details with chatbots and third-party AI applications. But they're of real consequence.
A recently published report by Citizen Lab details how ad-based surveillance data is sold to government and law enforcement organizations around the world. One of the surveillance products described "'automatically extract[s] available information from target connections' including IP address, browser type, language, version and plugins, operating system and version, device type, CPU and GPU information, screen resolution, ISP information, estimated geolocation, user inputs, timezone, battery level and charging status." It conducts device fingerprinting, among other types of surveillance.
"Chrome ships almost no built-in anti-fingerprinting defenses," said Hanff. "Let me say that again because it matters – Google's browser, the most popular browser in the world, does essentially nothing to prevent websites from building a unique profile of your device."
He points out that other browsers do have fingerprinting protections.
"Brave has farbling. Firefox has privacy.resistFingerprinting. Chrome has nothing. Google's Privacy Sandbox was discontinued in April 2025 without shipping a single fingerprinting-specific mitigation."
In the remainder of his post, Hanff enumerates the gaps in Chrome's fingerprinting defenses, covering techniques that exploit Canvas, WebGL, WebGPU, AudioContext, fonts, navigator and screen properties, WebRTC IP leakage, TLS, emoji rendering, speech synthesis, keyboard layout, and more.
He subsequently discusses 23 storage and tracking mechanisms that can be used to follow people online, such as cookies, bounce tracking, and CNAME cloaking.
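CNAME cloaking, one of the mechanisms Hanff covers, works at the DNS layer: a publisher delegates a subdomain of its own site to a third-party tracker, so the tracker's scripts and cookies appear first-party to the browser. A hypothetical zone-file entry (all names invented) might look like:

```
; news-site.example's DNS zone (hypothetical names throughout)
; Requests to metrics.news-site.example resolve to the tracker's
; servers, but the browser treats them as first-party traffic.
metrics.news-site.example.   IN  CNAME   collect.tracker-corp.example.
```

Because any cookies are set under news-site.example, browser defenses aimed at third-party cookies do not block them.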
"The technologies described in this document are not theoretical – they are deployed at scale against billions of people every single day," Hanff concludes. "Understanding them is the first step. Building the tools to detect and expose them is the next."
Google did not respond to a request for comment. ®
Source: The Register