Browser Fingerprinting in 2026: How It Works, Why Regulators Are Cracking Down, and How to Defend Against It


In the summer of last year, thousands of Dutch women received a letter. A health provider cooperating with the Dutch public health services — a body they had no direct relationship with, and no meaningful choice about — had suffered a data breach. Their full record was exposed. Name, address, medical context, and their BSN — the Dutch social security number that stays with you for life. You don't rotate a BSN. You can't revoke one. Whatever ends up in a breach file with that number attached is there forever, and nothing they or anybody else can do changes that.

That's the feeling I want you to hold onto while reading this post, because it is the same property that makes browser fingerprinting a serious privacy problem: some identifiers cannot be taken back. A leaked password you reset. A leaked cookie you delete. A leaked BSN, or a canvas hash derived from your GPU's rendering quirks, you live with. Fingerprinting is not a cookie-banner problem. It is the quiet construction of a permanent identifier you never agreed to and cannot remove.

Most people still think privacy online is a fight about cookies. It isn't, not anymore. Cookies are legible — you can see them, delete them, refuse them in a banner. The tracking that actually worries regulators in 2026 is the kind you cannot see at all: browser fingerprinting. No file is stored on your device. No banner asks permission. A few hundred milliseconds of invisible JavaScript is enough to identify your specific laptop across every site you visit, often with greater than 99% accuracy.

This post walks through what fingerprinting is, how the techniques actually work at the code level, how European regulators treat it, and what defenses actually hold up. It's aimed at engineers, site operators, and anyone who keeps running into "but we don't use cookies" as a compliance argument.

The Short Version#

What Browser Fingerprinting Actually Is#

A fingerprint is a hash derived from a set of attributes your browser exposes to JavaScript. The attributes on their own look harmless: how your GPU renders a single line of text, how many CPU cores you have, which fonts are installed, the exact audio processing characteristics of your sound card. Individually, they have low entropy. Combined, they form a signature distinctive enough to pick your specific device out of a crowd of hundreds of millions.

The critical difference from cookies: nothing is stored on your device. The fingerprint is regenerated on every visit from attributes the browser freely provides. There is no file to delete, no banner to dismiss, no setting to toggle. You cannot opt out by clearing your browser data because there is nothing to clear.

That property — statelessness — is exactly what makes fingerprinting a serious privacy problem.

Why Fingerprinting Is a Serious Privacy Risk#

1. It Survives Every Cookie Defense#

Cookie banners, third-party cookie deprecation, "Reject All" buttons, private browsing, clearing site data — none of it affects fingerprinting. A site that cannot set a cookie can still compute your canvas hash on page load and identify you as the same returning visitor.

2. It Works Across Sites#

The same canvas hash is the same canvas hash on every site using the same fingerprinting service. Commercial vendors like Fingerprint Pro, SEON, and Sift operate Device Reputation Networks — shared fingerprint databases spanning thousands of customer sites. A fingerprint seen on Site A and Site B belongs to the same device, even if you've never logged in and never accepted a cookie.

3. Fingerprints Can Be Linked to Real Identities#

This is the part that escalates fingerprinting from "tracking" to "surveillance infrastructure." Credit bureaus — TransUnion, Experian, and their counterparts — accept fingerprint hashes from customer companies and match them against their own identity databases. The pipeline looks like this:

  1. A website embeds a fingerprinting SDK.
  2. The SDK produces a hash on your browser.
  3. The hash is sent server-side to the site's backend.
  4. The site (or its fraud vendor) forwards the hash to a credit bureau.
  5. The bureau returns a real name, address, and credit profile.

No real-time fraud needs to occur. Hashes can be stored and linked retroactively, at any time, with no user interaction. No consent is asked for, and the matching happens entirely server-to-server where it is effectively undetectable from the browser.
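
The retroactive property in step 5 is worth making concrete. A toy sketch, with every name and value invented: the join between logged hashes and an identity table needs no user interaction and can run at any later date.

```javascript
// A site's hash log: fingerprint hashes recorded as visitors arrive
const hashLog = [];
function record(hash, site, ts) {
  hashLog.push({ hash, site, ts });
}

// Later — days, months, years — join the log against an identity table
// (e.g. a bureau's hash-to-person database). No browser is involved.
function linkRetroactively(identityTable) {
  return hashLog
    .filter(entry => identityTable.has(entry.hash))
    .map(entry => ({ ...entry, identity: identityTable.get(entry.hash) }));
}

record('h1', 'shop.example', '2026-01-02');
record('h2', 'news.example', '2026-01-03');
const bureau = new Map([['h1', 'J. de Vries, Amsterdam']]);
console.log(linkRetroactively(bureau)); // 'h1' now carries a name and city
```

The hashes were "anonymous" when collected; the linkage happens entirely server-to-server, which is why nothing observable changes in the browser.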

4. Detection Is Hard and Getting Harder#

Modern fingerprinting uses OffscreenCanvas inside Web Workers, moving the rendering operation off the main thread where most privacy scanners hook. Commercial services run signal aggregation server-side so the actual visitor ID never appears in client JavaScript. And ML-based detection systems — Cloudflare, PerimeterX, DataDome — can identify anti-fingerprinting tools by recognising their randomization signatures, which can make a privacy-conscious user more identifiable than an average one.

How Fingerprinting Works: The Techniques#

Active, JavaScript-based fingerprinting relies on a small set of high-entropy vectors. Here's each one with the actual mechanism.

Canvas Fingerprinting — The Workhorse#

Canvas fingerprinting is the single most powerful conventional vector. The trick: draw text and shapes to an invisible <canvas> element, read back the pixel buffer, hash it. The pixel output varies by GPU model, driver version, antialiasing algorithm, font rasteriser, and OS graphics stack — enough variance to uniquely identify ~80–90% of browsers from the canvas hash alone.

The classic minimal example:

const canvas = document.createElement('canvas');
canvas.width = 280;
canvas.height = 40;
const ctx = canvas.getContext('2d');
ctx.textBaseline = 'top';
ctx.font = '14px Arial';
ctx.fillStyle = '#f60';
ctx.fillRect(125, 1, 62, 20);
ctx.fillStyle = '#069';
ctx.fillText('Cwm fjordbank glyphs vext quiz, 😀', 2, 15);
ctx.fillStyle = 'rgba(102, 204, 0, 0.7)';
ctx.fillText('Cwm fjordbank glyphs vext quiz, 😀', 4, 17);
// Read back the rendered pixels and hash them (SHA-256 via Web Crypto)
const bytes = new TextEncoder().encode(canvas.toDataURL());
const digest = await crypto.subtle.digest('SHA-256', bytes);
const hash = [...new Uint8Array(digest)].map(b => b.toString(16).padStart(2, '0')).join('');

Semi-transparent fills and emoji are deliberate. They force the browser through hardware-dependent rendering paths — alpha compositing, emoji glyph substitution — which maximise the entropy extracted per pixel.

Modern evasion uses OffscreenCanvas inside a Worker, so the canvas read happens off the main thread and bypasses scanners that only hook HTMLCanvasElement.prototype.toDataURL on window.

WebGL Fingerprinting — The GPU Itself#

WebGL renders 3D graphics via the GPU, and it exposes dozens of parameters about that GPU: vendor, renderer string, driver version, shader precision ranges, texture size limits. A single call to getParameter(UNMASKED_RENDERER_WEBGL) can return something like "NVIDIA GeForce GTX 1080/PCIe/SSE2" — a string with enormous identifying power.

const gl = canvas.getContext('webgl');
const debug = gl.getExtension('WEBGL_debug_renderer_info');
const vendor = gl.getParameter(debug.UNMASKED_VENDOR_WEBGL);
const renderer = gl.getParameter(debug.UNMASKED_RENDERER_WEBGL);

Recent research puts WebGL-based identification at well above 90% on desktop (one 2025 study reports 99% desktop / 94% mobile). Combined with canvas and AudioContext, identification routinely pushes past 99%.

AudioContext — The Stealth Vector#

The Web Audio API exposes a full audio processing graph to JavaScript. Generate an inaudible oscillator tone, run it through a dynamics compressor and analyser, read the output samples. The nanosecond-level floating-point differences in how your OS, driver, and sound hardware process that signal are stable and distinctive.

const ctx = new OfflineAudioContext(1, 44100, 44100);
const oscillator = ctx.createOscillator();
const compressor = ctx.createDynamicsCompressor();
oscillator.type = 'triangle';
oscillator.frequency.value = 10000;
oscillator.connect(compressor);
compressor.connect(ctx.destination);
oscillator.start();
ctx.startRendering().then(buffer => {
  const samples = buffer.getChannelData(0);
  // Sum or hash the samples: the tiny float differences are device-specific
});

The audio fingerprint is stable across sessions, adds meaningful entropy of its own, and leaves no visual trace. The user sees and hears nothing.

Navigator and Screen Signals#

Beyond the heavy vectors, dozens of low-to-medium entropy signals are trivially accessible:

| API | Signal | Entropy |
|---|---|---|
| navigator.hardwareConcurrency | CPU core count | Low |
| navigator.deviceMemory | RAM bucket (GB) | Low |
| navigator.userAgentData.getHighEntropyValues() | Device model, OS version | Medium |
| navigator.maxTouchPoints | Touch points supported | Medium |
| screen.width/height/colorDepth | Display geometry | Low–Medium |
| matchMedia('(color-gamut: p3)') | HDR support | Medium |
| document.fonts | Installed fonts | High |

Each individually looks harmless. The point of fingerprinting is that it combines all of them.

WebGPU — The New High-Entropy Vector#

WebGPU is the successor to WebGL and exposes significantly more detailed hardware information: GPU architecture, driver model, D3D shader model, Vulkan driver version, adapter limits, memory heap sizes. It's currently enabled by default in Chrome and is already being used for fingerprinting in the wild.

if (navigator.gpu) {
  const adapter = await navigator.gpu.requestAdapter();
  const info = adapter.info;
  // vendor, architecture, device — extremely high entropy
}

Because WebGPU reports hardware capabilities directly, it is difficult to spoof consistently. It's also a powerful cross-check — services compare WebGPU adapter info with WebGL renderer strings, and any inconsistency is treated as evidence of a privacy tool in use.
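
A sketch of how such a cross-check might look on the detection side (the function names and vendor heuristics are illustrative, not any vendor's actual code): the vendor implied by the WebGL renderer string has to agree with the WebGPU adapter's reported vendor.

```javascript
// Guess the GPU vendor from a WebGL renderer string (simplified heuristic)
function inferVendor(webglRenderer) {
  const s = webglRenderer.toLowerCase();
  if (s.includes('nvidia') || s.includes('geforce')) return 'nvidia';
  if (s.includes('amd') || s.includes('radeon')) return 'amd';
  if (s.includes('intel') || s.includes('iris')) return 'intel';
  if (s.includes('apple')) return 'apple';
  return 'unknown';
}

// Flag the session if WebGL and WebGPU disagree about the hardware
function isConsistent(webglRenderer, webgpuVendor) {
  const inferred = inferVendor(webglRenderer);
  return inferred === 'unknown' || inferred === webgpuVendor.toLowerCase();
}

isConsistent('NVIDIA GeForce GTX 1080/PCIe/SSE2', 'nvidia'); // true — plausible
isConsistent('Intel Iris OpenGL Engine', 'nvidia');          // false — spoof suspected
```

A privacy tool that spoofs one API but not the other fails this check, which is why consistent spoofing across every surface is so hard.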

TLS and HTTP/3 Fingerprinting — Below JavaScript#

Passive network-layer fingerprinting (JA3/JA4 for TLS, QUIC connection ID analysis for HTTP/3) works without any JavaScript at all. The TLS Client Hello your browser sends — cipher suites, extensions, supported elliptic curves, and the order they appear in — hashes to a distinctive value characteristic of your browser and OS.

This matters because TLS fingerprinting cannot be blocked by ad blockers, extensions, or browser privacy settings. It can be performed by any network observer (ISP, CDN, government) and is consistent across every site you visit over the same connection.

The Commercial Fingerprinting Ecosystem#

The techniques above are the raw material. The commercial infrastructure built on top of them is what turns fingerprinting into a privacy problem at scale.

| Service | What It Does | Risk Profile |
|---|---|---|
| Fingerprint Pro | 42+ client signals + server-side ML; cross-customer Device Reputation Network | Cross-site persistent tracking |
| SEON | Fingerprint + built-in email/phone/social lookup pipeline | Direct device-to-identity linkage |
| Sift | ML fraud platform; thousands of signals across customer network | Fraud scoring + profiling |
| Arkose Labs | Bot detection + interactive challenges; 225+ risk signals | Persistent device scoring |
| Accertify | AmEx-owned; direct PII linkage via card network | Fingerprint → real identity |
| TransUnion / Experian | Hash-to-identity matching against credit databases | Fingerprint → real name, address, credit profile |

These services are useful for fraud prevention — that is the legitimate use case. The problem is that the same infrastructure, once deployed, can be used for cross-site tracking, retroactive identity linkage, and profiling, with the user having no visibility into any of it.

How European Regulators See Fingerprinting#

There is no ambiguity in the law. Multiple data protection authorities and the EDPB have stated clearly that fingerprinting falls under the same consent regime as tracking cookies.

ePrivacy Directive Art. 5(3)#

"Storage of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent."

The EDPB's 2023 guidelines on tracking confirm that fingerprinting constitutes "accessing information stored in the terminal equipment" — the GPU rendering state, the audio hardware characteristics, the installed fonts. Prior, specific, informed, unambiguous consent is required before any of it is collected.

The "strictly necessary" exemption does not apply. Under the CJEU's Planet49 ruling (C-673/17), exemptions are narrowly construed. A website can be served without fingerprinting; therefore fingerprinting is not strictly necessary.

GDPR Art. 4(11), Art. 6, Art. 32#

A fingerprint used to single out a device is personal data, so the GDPR applies on top of ePrivacy. Art. 4(11) sets the bar for valid consent: freely given, specific, informed, and unambiguous. Art. 6 requires a lawful basis for each processing purpose, not just the initial collection. And Art. 32 cuts both ways: a stored database of fingerprint hashes that can be linked to identities is itself a high-risk asset that must be secured accordingly.

Active Enforcement#

The pattern is consistent: fingerprinting without valid consent is a recognised, actively-enforced violation. "We don't use cookies" is not a defence.

How to Defend Against Fingerprinting#

Defenses fall into three layers: browser, extension, and behaviour. None of them is perfect, and some well-intentioned approaches actually backfire.

Browser Choice Matters Most#

| Browser | Protection Level | Notes |
|---|---|---|
| Tor Browser | Strongest | Uniform canvas; WebGL, WebGPU, AudioContext disabled or normalised; all users look identical |
| Brave | Strong | Shields randomise canvas, block WebGL in strict mode, block AudioContext in fingerprinting mode |
| Safari (with ITP) | Strong in third-party contexts | Blocks third-party canvas/font/storage; Safari 17+ injects noise into AudioContext in Private Browsing |
| Firefox (ETP Strict + RFP) | Medium–Strong | privacy.resistFingerprinting normalises many APIs; ETP blocks cross-site canvas |
| Chrome (default) | Weak | WEBGL_debug_renderer_info restricted cross-origin; most APIs otherwise fully accessible |

Tor Browser is the only option that gives you meaningful k-anonymity — every Tor user looks the same, so your fingerprint identifies you as "a Tor user," not as a specific individual. For daily browsing, Brave and Safari are the strongest practical defaults. Firefox with privacy.resistFingerprinting enabled is competitive but occasionally breaks sites.

Why Randomization Backfires#

The instinct is: if my canvas hash gives me away, why not just randomise it on every read? Turns out ML-based detection systems in 2025–2026 are very good at spotting this. A canvas hash that changes on every call is highly unusual — it is a fingerprint, just a different kind. Commercial services now flag randomization as an "anti-fingerprinting tool" signal, which in some contexts (fraud scoring, bot detection) treats you as more suspicious than a default-configured browser.
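
The cheapest version of that detection can be sketched in a few lines (illustrative logic, not any vendor's code): read a value that should be hardware-determined twice in the same session and compare.

```javascript
// A hardware-determined value must be identical on every read within a
// session; fresh noise on each read is itself a strong signal.
function looksRandomized(readFingerprint) {
  return readFingerprint() !== readFingerprint();
}

// Stable device: both reads identical, nothing suspicious
const stable = () => 'a41f09c2';
// Naive noise extension: a different value on every read
let n = 0;
const noisy = () => 'a41f09c' + (n++);

console.log(looksRandomized(stable)); // false
console.log(looksRandomized(noisy));  // true
```

Real systems go further, comparing value distributions across many visits, but even this two-read check defeats per-call noise.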

The current wisdom from defensive research: consistent noise beats randomization. Stable, realistic, hardware-plausible fingerprints are harder to detect than obviously random ones. This is why Tor Browser's approach — make everyone identical — works while naive randomization doesn't.

Extensions: Help With Caveats#

Blocklist-based blocking (for example uBlock Origin with its privacy filter lists) is the most reliable extension-level defence: a fingerprinting script that never loads extracts nothing. Noise-injecting extensions, by contrast, run into the randomization problem above: spoofed or per-read randomised values are themselves a detectable signal. If you rely on extensions, prefer blocking over spoofing.

What Site Operators Must Do#

If you operate a site in the EU or serving EU users:

  1. Inventory every fingerprinting-adjacent SDK. Fraud tools, analytics, session replay, A/B testing, anti-bot services — many of them fingerprint by default.
  2. Gate them behind consent. The fingerprinting call cannot fire before the user has given specific, informed, unambiguous consent to that purpose. Pre-consent fingerprinting is the most severe violation in this space.
  3. Document your Art. 6(1)(f) balancing test if you're relying on legitimate interest for fraud-specific fingerprinting. It must be narrowly scoped, transparent in your privacy notice, and limited to the minimum necessary signals.
  4. Be honest in your cookie banner. If you are fingerprinting, the banner must disclose it — not bury it under "essential cookies." The EDPB is explicit on this point.
  5. Sign DPAs with your vendors. Fingerprint Pro, SEON, Sift, Arkose, Accertify are all processors under Art. 28. No DPA, no lawful processing.
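
Point 2 can be enforced mechanically. A minimal sketch, where the purpose names and the loader callback are placeholders rather than a real consent-platform API:

```javascript
// The callback that would inject the vendor SDK never fires unless the
// user granted that specific purpose — the code path is unreachable.
function gateSdk(consentedPurposes, purpose, loadSdk) {
  if (!consentedPurposes.includes(purpose)) {
    return { loaded: false, reason: `no consent for "${purpose}"` };
  }
  loadSdk(); // e.g. inject the vendor's <script> tag here, post-consent only
  return { loaded: true };
}

// "Accept analytics" does not authorise fingerprinting for fraud scoring:
gateSdk(['analytics'], 'fraud_prevention', () => {});                      // not loaded
gateSdk(['analytics', 'fraud_prevention'], 'fraud_prevention', () => {}); // loaded
```

The important property is that the fingerprinting code is unreachable before consent, not merely configured to discard its output.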

Where This Is Going#

Three trends define the 2026 landscape:

  1. New high-entropy surfaces keep shipping. WebGPU is on by default in Chrome and already fingerprinted in the wild, and each hardware-adjacent API widens the surface faster than defences normalise it.
  2. Detection versus anti-detection is now an ML arms race. Randomising tools are flagged as signals in their own right, pushing defence toward uniformity (the Tor model) rather than noise.
  3. Regulators have moved from guidance to enforcement. Fingerprinting without consent is treated exactly like cookie tracking without consent.

Fingerprinting is not a theoretical risk or an edge case. It is how tracking actually works on the modern web, and it is specifically what regulators are now enforcing against. If your privacy programme ends at cookie consent, it ends in the wrong place.


Part of an ongoing series on web privacy, built on a structured knowledge base of GDPR and ePrivacy jurisprudence. Related reading: the privacy audit series.
