The web forked in 2010. One version optimized for humans with browsers. Another for everything else—search engines, accessibility tools, automation systems, anything that needed to access web data programmatically. The web became delightful to browse and fundamentally harder to automate in the same architectural moment.
In September 2010, Twitter rebuilt its platform around "a new architecture almost entirely in JavaScript." The goal was user experience: cache visited pages in memory, switch between them instantly, eliminate the jarring full-page reloads that made websites feel clunky. For humans browsing Twitter, navigation became delightfully smooth. For programmatic systems trying to access that same content, the visible HTML no longer represented the actual data.
Twitter's engineering team knew they were creating a dual web. Their announcement acknowledged upfront:
"In order to support crawlers and users without JavaScript, we needed a rendering system that runs on both server and client."
They built it using Mustache templates. Interactive user experience came first, programmatic access second.
The frameworks followed. Backbone.js arrived in 2010, AngularJS in October 2010, React in May 2013, Vue.js in early 2014. Each made building single-page applications easier. The consumer web optimized relentlessly for the human browsing experience Twitter had validated.
Google had already recognized the problem. Their AJAX crawling scheme from October 2009 required websites to maintain parallel HTML snapshots for crawlers through escaped fragment URLs. By 2015, they deprecated the approach, claiming they could "generally" render JavaScript. That qualifier carried years of operational complexity.
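Concretely, the scheme told crawlers to rewrite hashbang URLs into "escaped fragment" URLs and request a pre-rendered HTML snapshot from the server at that address. A rough sketch of the mapping in TypeScript (simplified, and ignoring the spec's exact escaping rules):

```typescript
// Sketch of the crawler-side URL rewrite from Google's 2009 AJAX crawling scheme:
// a "#!" URL is mapped to an "_escaped_fragment_" query so the server can return
// a pre-rendered HTML snapshot.
function toEscapedFragmentUrl(hashbangUrl: string): string {
  const url = new URL(hashbangUrl);
  if (!url.hash.startsWith("#!")) return hashbangUrl; // page hasn't opted in
  const fragment = url.hash.slice(2); // drop the "#!" token
  url.hash = "";
  url.searchParams.set("_escaped_fragment_", fragment);
  return url.toString();
}

// "https://example.com/#!page=timeline"
//   -> "https://example.com/?_escaped_fragment_=page%3Dtimeline"
console.log(toEscapedFragmentUrl("https://example.com/#!page=timeline"));
```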
Extracting data from a website used to mean parsing HTML—reliable, deterministic, fast. After 2010, it meant automating a browser, waiting for JavaScript to execute, handling race conditions where content might or might not have loaded. What had been a simple HTTP request became a timing puzzle with dozens of failure modes. The visible HTML no longer contained the dataset.
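The contrast is easy to see in code. Below is a minimal sketch in TypeScript, using cheerio for the old fetch-and-parse model and Puppeteer for the post-2010 browser-driven model; the URL handling and the `.item-title` selector are placeholders, not drawn from any particular site or system.

```typescript
import * as cheerio from "cheerio";
import puppeteer from "puppeteer";

// Pre-2010 model: one HTTP request, then parse the markup you were handed.
// Deterministic: the data is either in the HTML or it is not.
async function extractStatic(url: string): Promise<string[]> {
  const html = await (await fetch(url)).text();
  const $ = cheerio.load(html);
  return $(".item-title").map((_, el) => $(el).text()).get();
}

// Post-2010 model: drive a real browser and wait for scripts to build the DOM.
// The same request now has a timing dimension and new ways to fail.
async function extractRendered(url: string): Promise<string[]> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle0" });
    // The element may show up in 50ms, in several seconds, or never.
    await page.waitForSelector(".item-title", { timeout: 10_000 });
    return await page.$$eval(".item-title", (els) =>
      els.map((el) => el.textContent ?? "")
    );
  } finally {
    await browser.close();
  }
}
```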
Twitter reversed course in 2012, moving rendering back to the server and reducing page load time by 80%. They acknowledged the single-page approach "lacked support for various optimizations available only on the server." But the frameworks had already made client-side rendering the path of least resistance for developers. Twitter's reversal became a cautionary tale that the rest of the web largely ignored.
We encounter this fork daily building web agent infrastructure at scale. Every "wait for element to appear" in our system traces back to that 2010 decision. We're not waiting for network latency—we're waiting for JavaScript execution that might take 50ms or 5 seconds depending on client-side conditions we can't predict. Reliability at scale means accounting for timing variance that didn't exist when the web was HTML all the way down.
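What accounting for that variance looks like, in its simplest form, is a bounded poll rather than a blind wait. The helper below is a hypothetical sketch assuming Puppeteer, not our production logic; it turns a slow or missing render into an explicit timeout result instead of an open-ended hang.

```typescript
import type { Page } from "puppeteer";

// Hypothetical helper: poll for a selector under a hard deadline so that slow
// client-side rendering becomes a classified "timeout", not an open-ended hang.
async function waitForContent(
  page: Page,
  selector: string,
  deadlineMs: number
): Promise<"found" | "timeout"> {
  const start = Date.now();
  while (Date.now() - start < deadlineMs) {
    if ((await page.$(selector)) !== null) return "found";
    // Scripts may still be executing; back off briefly and check again.
    await new Promise((resolve) => setTimeout(resolve, 250));
  }
  return "timeout";
}
```

The deadline, the poll interval, and what counts as "loaded" all become tuning knobs, which is exactly the variance that did not exist when a page was finished the moment the HTML arrived.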
This history explains why reliable web automation requires infrastructure depth that seems disproportionate to the task. We're not over-engineering. We're accounting for fifteen years of architectural decisions that prioritized one access pattern over another. The web optimized for human delight and made programmatic access an accommodation problem. We're still navigating that trade-off.
Things to follow up on...
- Analytics tracking broke: Single-page applications created immediate problems where only the first pageview was tracked because the page never reloaded during user sessions, forcing analytics platforms to redesign their tracking approaches.
- Puppeteer's rapid adoption: Google released Puppeteer in 2017 as a Node.js library for controlling Chrome browsers, and usage among developers rose from 27 percent in 2019 to 37 percent in 2021 as JavaScript-heavy sites became the norm.
- The hashbang URL scheme: Google's AJAX crawling proposal required websites to mark dynamic pages with "#!" in their URLs (the "hashbang" format) and provide HTML snapshots through escaped fragment URLs, creating a parallel rendering system that lasted six years before deprecation.
- Twitter's four-month build: The entire single-page application architecture was engineered in four months by seven core engineers, demonstrating how quickly a small team could reshape a major platform's architecture and inadvertently influence the entire web's evolution.

