When Safari started restricting third-party cookies in 2017 and Firefox followed in 2019, the advertising industry faced a practical question: if the primary mechanism for tracking users across websites becomes unreliable, what else is available?
The answer was already waiting. In 2010, Peter Eckersley's Panopticlick study at EFF had demonstrated that among browsers with Flash or Java installed, 94% had unique configurations. Screen resolution, installed fonts, timezone, language settings, plugin lists. Mundane on their own, but taken together, distinctive enough to single out a browser. By 2012, researchers at UC San Diego showed that the HTML5 canvas element could serve as an identifier: instruct the browser to render specific text, read back the pixel data, and hash it. Different GPUs, operating systems, font libraries, and antialiasing implementations all produce subtly different renderings. A fingerprint, extracted from the act of drawing.
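The mechanic is simple enough to sketch. In a real tracker the pixel data comes from the browser itself (for instance via canvas `toDataURL()` in JavaScript); the Python below is a minimal stand-in, using a hypothetical byte buffer in place of a rendered bitmap, and every name in it is illustrative rather than taken from any real fingerprinting library:

```python
import hashlib

def fingerprint(attributes: dict, canvas_pixels: bytes) -> str:
    """Combine passive browser attributes with canvas pixel data into one hash."""
    # Serialize attributes in a stable order so the same configuration
    # always produces the same digest.
    parts = [f"{k}={attributes[k]}" for k in sorted(attributes)]
    digest = hashlib.sha256()
    digest.update("|".join(parts).encode())
    digest.update(canvas_pixels)  # the GPU / font / antialiasing signal
    return digest.hexdigest()

# Two machines rendering the same text with identical settings, except one
# antialiased pixel differs because their GPUs round subpixel coverage
# differently. The fingerprints diverge.
attrs = {"screen": "2560x1440", "tz": "Europe/Berlin", "lang": "de-DE"}
pixels_gpu_a = bytes([255, 254, 250, 255] * 4)
pixels_gpu_b = bytes([255, 254, 251, 255] * 4)  # one channel off by one
print(fingerprint(attrs, pixels_gpu_a) != fingerprint(attrs, pixels_gpu_b))  # True
```

The hash has the property that makes the technique work: a single off-by-one pixel, far below the threshold of human perception, yields an entirely different identifier.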
The whole thing is structurally strange. The characteristics being used as identity markers were never designed to be stable. Install a font, your fingerprint changes. Update a graphics driver, the canvas renders differently. Plug in an external monitor or adjust your zoom level, and the reported screen metrics shift. Eckersley's study found that 37.4% of returning visitors showed at least one fingerprint change. Researchers at INRIA confirmed in 2016 that while 89.4% of fingerprints in their dataset were unique at any given moment, the signals shifted whenever the underlying system shifted. The industry was building an identification layer on characteristics that were already unreliable for identification. And it started doing so before it even needed to. By 2014, the social bookmarking company AddThis had deployed canvas fingerprinting across 5% of the top 100,000 websites. Three years before Safari's Intelligent Tracking Prevention. The fiction was attractive enough to build on before the old fiction was even threatened.
Browser vendors pushed back. The most interesting response came from Brave, which introduced fingerprint randomization, adding subtle noise to canvas output so the fingerprint changes every session. The logic was striking: rather than trying to make all browsers look identical (an impossible goal, given the diversity of modern hardware), make every browser look different from itself across time. Firefox's Resist Fingerprinting mode, borrowed from the Tor project, took the opposite approach, reporting fake uniform values for timezone, hardware, and locale. Both defenses worked in controlled settings. Both proved brittle against statistical re-linking and adaptive commercial fingerprinters. By 2025, 12.7% of the top 20,000 sites were running canvas fingerprinting.
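Brave's session randomization can be illustrated with the same kind of toy pipeline. This is a deliberately simplified sketch, not Brave's actual implementation (which perturbs canvas reads inside the browser engine, keyed per session and per site): derive a pseudorandom mask from a per-session key and flip only the lowest bit of pixel bytes, invisible to the eye but fatal to the hash.

```python
import hashlib

def farble(pixels: bytes, session_key: bytes) -> bytes:
    """Flip low bits of the pixel buffer, deterministically per session."""
    # Derive a pseudorandom mask from the per-session key. Using only the
    # lowest bit of each byte keeps the rendered image visually identical.
    mask = hashlib.sha256(session_key).digest()
    return bytes(b ^ (mask[i % len(mask)] & 1) for i, b in enumerate(pixels))

pixels = bytes(range(256))  # stand-in for a rendered canvas bitmap
h1 = hashlib.sha256(farble(pixels, b"session-1")).hexdigest()
h2 = hashlib.sha256(farble(pixels, b"session-2")).hexdigest()
print(h1 != h2)  # same machine, different fingerprint each session
```

The design choice worth noticing is determinism within a session: the same key always produces the same mask, so a site cannot detect the defense by reading the canvas twice and comparing. Only across sessions does the identity dissolve.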
Then came the reversal that made the arms race feel almost beside the point. In February 2025, Google lifted its ban on device fingerprinting across its advertising platforms. This was the same company that had called fingerprinting a practice that "subverts user choice and is wrong" in 2019. The UK's Information Commissioner's Office called the reversal "irresponsible" within 24 hours. While browser vendors spent years engineering defenses against fingerprinting, the largest advertising platform quietly legitimized the practice those defenses were meant to prevent.
Cookies had one saving grace: visibility. You could find them in your browser settings, clear them, block them. Fingerprinting leaves nothing on your device. The identification happens in the rendering itself, in the gap between what your browser was asked to draw and how your specific hardware drew it.
The web's first identity workaround was a small text file pretending HTTP had memory. The second is a rendering artifact pretending your GPU is your name. Fiction layered on fiction, each one less visible than the last.

