What happens when detection systems stop targeting what automation does and start targeting how it does it—at the protocol level, below where any patch can reach?
In June 2024, bot detection crossed that threshold. Systems began checking whether browsers had received a specific Chrome DevTools Protocol command: Runtime.enable. This command sits below the JavaScript layer where traditional evasion operates. You can't patch a browser API to hide a protocol-level signal. They're different architectural layers entirely.
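To make the layer difference concrete, here is a minimal sketch of the detection trick as it has been described in public write-ups; the function name and timing are illustrative, not any vendor's actual implementation. The idea: once Runtime.enable has been sent, the browser serializes console arguments for the attached client, and that serialization reads properties (such as an Error's stack) that an ordinary page load never touches. A visitor with DevTools open trips the same wire, which is why this signal gets combined with others.

```typescript
// Illustrative sketch of the commonly described Runtime.enable probe
// (not any vendor's actual code). If a CDP client has enabled the Runtime
// domain, console arguments are serialized so they can be forwarded to it,
// and serializing an Error reads its `stack` property. A getter planted
// there acts as a tripwire.
function probeRuntimeEnable(): Promise<boolean> {
  return new Promise<boolean>((resolve) => {
    let tripped = false;
    const bait = new Error('bait');
    Object.defineProperty(bait, 'stack', {
      get(): string {
        tripped = true; // only read while CDP serializes the console argument
        return '';
      },
    });
    // Invisible to ordinary visitors; the getter fires only if an attached
    // client has the Runtime domain enabled.
    console.debug(bait);
    setTimeout(() => resolve(tripped), 50);
  });
}

// Hypothetical usage: feed the result into a risk score alongside other signals.
probeRuntimeEnable().then((flagged) => {
  if (flagged) {
    // e.g. report to an anti-bot backend
  }
});
```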
Frameworks like Puppeteer Extra Stealth—built on years of patching individual detection vectors—suddenly hit a wall. The entire approach of hiding automation signals stopped working because detection had moved to something more fundamental than any patch could address.
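For contrast, here is the kind of surface-level patch that older stealth tooling relied on, sketched with stock Puppeteer calls rather than puppeteer-extra-plugin-stealth's actual internals. The patch rewrites what page JavaScript can see, but it cannot touch the CDP traffic the framework itself emits.

```typescript
// Illustrative sketch of the surface-level patching style (not the stealth
// plugin's real code): rewrite a JavaScript-visible property before any page
// script runs.
import puppeteer from 'puppeteer';

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Hide the classic JS-layer tell...
  await page.evaluateOnNewDocument(() => {
    Object.defineProperty(Navigator.prototype, 'webdriver', { get: () => false });
  });

  // ...but the CDP commands Puppeteer sends to drive the page, including
  // Runtime.enable, sit below this layer and are untouched by the patch.
  await page.goto('https://example.com');
  await browser.close();
})();
```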
This wasn't just another detection technique—it was the moment when web automation moved from surface-level patches to architectural foundations.
Where does this trajectory lead? Anyone building on the web needs to understand what just changed.
Enterprise Scale Changes the Stakes
Building enterprise web agents at TinyFish, we see these thresholds differently than smaller operations would. Running a handful of sessions, you might patch the next detection vector and keep going. At thousands of concurrent sessions with production SLAs, the architectural difference becomes existential.
Protocol-level detection means every session in your fleet exposes the same signal through a fundamental choice you made picking your automation framework. There's no incremental fix. The infrastructure becomes detectable through its foundation.
For enterprises, this meant something specific: the web automation infrastructure they'd invested in—frameworks, expertise, operational runbooks—suddenly had a shelf life. Not because of poor execution, but because the fundamental rules changed. Building on an adversarial web means facing not just technical challenges but also strategic questions about infrastructure that may need architectural rethinking on a 12-month cycle.
The Framework Rethink
The new generation of frameworks—nodriver, selenium-driverless—took a different path. Instead of patching browser APIs while using traditional automation protocols, they implemented automation functions using low-level CDP commands that don't require Runtime.enable.
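Here is a minimal sketch of that approach, kept in TypeScript for consistency with the snippets above (nodriver and selenium-driverless do the equivalent in Python). It assumes Chrome was launched with --remote-debugging-port=9222 and that the ws package is available; the session is driven entirely through low-level CDP domains and never sends Runtime.enable, so the serialization side effect probed for earlier never occurs.

```typescript
// Sketch: drive Chrome over the raw DevTools websocket using only low-level
// domains (Page, Input). Assumes Chrome is running with
// --remote-debugging-port=9222. No Runtime.enable is ever sent.
import WebSocket, { RawData } from 'ws';

async function run(): Promise<void> {
  // Discover an open page target via Chrome's HTTP endpoint.
  const targets = (await (await fetch('http://127.0.0.1:9222/json')).json()) as Array<{
    type: string;
    webSocketDebuggerUrl: string;
  }>;
  const target = targets.find((t) => t.type === 'page');
  if (!target) throw new Error('no page target found');

  const ws = new WebSocket(target.webSocketDebuggerUrl);
  await new Promise<void>((resolve) => ws.once('open', () => resolve()));

  // Tiny JSON-RPC helper: send a CDP command and wait for its response.
  let nextId = 0;
  const send = (method: string, params: Record<string, unknown> = {}) =>
    new Promise<unknown>((resolve) => {
      const id = ++nextId;
      const onMessage = (raw: RawData) => {
        const msg = JSON.parse(raw.toString());
        if (msg.id === id) {
          ws.off('message', onMessage);
          resolve(msg.result);
        }
      };
      ws.on('message', onMessage);
      ws.send(JSON.stringify({ id, method, params }));
    });

  // Navigate and click through low-level domains only.
  await send('Page.navigate', { url: 'https://example.com' });
  await send('Input.dispatchMouseEvent', {
    type: 'mousePressed', x: 200, y: 300, button: 'left', clickCount: 1,
  });
  await send('Input.dispatchMouseEvent', {
    type: 'mouseReleased', x: 200, y: 300, button: 'left', clickCount: 1,
  });

  ws.close();
}

run().catch(console.error);
```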
When researchers tested this approach on sites protected by Cloudflare and DataDome, disabling Runtime.enable made CAPTCHAs disappear. Same browser, same IP. The difference was protocol-level architecture.
Call it a rethink rather than a patch. And it reveals something about building infrastructure when the ground keeps shifting: the old approach is no longer viable, but the new approach hasn't fully matured yet. Nodriver emerged in early 2024 with incomplete proxy support and known headless issues. You're building production systems during the liminal period when the future architecture is clear but operational details are still being worked out.
By August 2025, the specific Runtime.enable signal had stopped working due to Chrome engine changes. But detection had already moved to other protocol-level techniques. Adversarial systems evolve through continuous adaptation at every layer, not single battles. The threshold we crossed was recognizing that protocol-level detection is now the baseline, and everything built on old assumptions needs rethinking.
Where the Adversarial Web Goes From Here
This threshold revealed something fundamental about building on the live web: the adversarial layer keeps moving down the stack. Today it's protocol-level detection. Tomorrow it might be network-level fingerprinting or hardware attestation.
The next threshold is coming. The real challenge is building systems that can cross it without starting over.
So the question becomes: how do you build infrastructure that adapts to an adversarial web without requiring constant architectural rewrites? The web won't stop being adversarial. The teams that understand this—that build for resilience and adaptability rather than evasion tactics—are the ones who'll still be operating when the next threshold is crossed.
When you're building web agent infrastructure at scale, you learn to recognize these moments: the point where the rules of the game shift, and you either rethink your foundation or accept that your old approach is now a liability.
That threshold was crossed in 2024. Less than a year from "this is working" to "this requires architectural rethinking." Most people missed it. But if you were running thousands of sessions against the live web with production reliability requirements, you saw it clearly: the moment when the protocol layer became the battlefield, and the nature of building on the web changed permanently.
Things to follow up on...
- Puppeteer Extra Stealth's decline: The framework had 6,200 GitHub stars but experienced no significant code changes for a year before June 2024, with the main maintainer launching a commercial product instead.
- DataDome's detection methodology: In July 2024, DataDome published research showing how they could access and reveal Puppeteer Extra Stealth's internal code due to flaws in the evasion implementation.
- Chrome's headless mode evolution: The latest Headless Chrome update made automated browsers harder to detect by default, reducing fingerprint inconsistencies with just basic modifications like changing user-agent strings.
- Behavioral detection layers: Beyond protocol-level signals, modern anti-bot systems now use behavioral analysis combining client-side interactions and server-side browsing patterns to identify automation through multiple detection vectors simultaneously.

