A study of nearly 300,000 domains found that just 10% had adopted llms.txt, the proposal that asks website owners to place a plain-text summary at their domain root for AI systems to read. Google's John Mueller compared it to the discredited keywords meta tag. Statistical analysis showed no measurable effect on how often a domain is cited by AI systems.
And yet the file found a use case sideways. An agent trying to understand an API's webhook documentation benefits from a clean, structured entry point rather than having to parse thousands of HTML pages. The file works as a context primer. A way to orient, more than a way to rank.
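To make the shape concrete: the proposal calls for a markdown file at /llms.txt with a title, a blockquote summary, and sections of annotated links. A minimal sketch, with the site name and URLs invented for illustration:

```markdown
# ExampleAPI

> ExampleAPI is a payments platform. Agents should read the webhook
> guide before the endpoint reference.

## Docs

- [Webhook guide](https://example.com/docs/webhooks.md): event types, retry policy, signature verification
- [API reference](https://example.com/docs/api.md): every REST endpoint with parameters

## Optional

- [Changelog](https://example.com/changelog.md): release history, safe to skip
```

One file, a few dozen lines, and an agent knows where to start without crawling the whole site.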
That accidental usefulness pointed toward something larger. The web had no vocabulary for talking to non-human visitors. llms.txt was the first attempt at one, and its limitations made the next step legible. A static file describes what a site contains, and that's where its relationship with the visitor ends. Here is what we are. Take it or leave it.
What followed was MCP, Anthropic's Model Context Protocol. By early 2026, the ecosystem had reached 97 million monthly SDK downloads and over 10,000 public server implementations. When an independent audit tracked only servers that were actually running and remotely accessible, the count was closer to 1,400. That gap is worth sitting with. Thousands of organizations built an MCP server. Far fewer left the lights on.
But the ones that did changed something fundamental about what their web presence offers. An MCP endpoint exposes what a site can do. The protocol distinguishes between Resources (read-only data, like a catalog) and Tools (executable functions, with side effects). A code hosting platform's MCP server lets agents create issues, open pull requests, and trigger CI/CD workflows. A site that previously said "here is our documentation" now says "tell me what you need done." The posture becomes one of service.
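What that looks like in practice, as a minimal sketch using the official TypeScript SDK (@modelcontextprotocol/sdk); the catalog resource and create_issue tool are invented stand-ins for the kinds of capabilities described above:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "example-site", version: "1.0.0" });

// A Resource: read-only data an agent can fetch, like a product catalog.
server.resource("catalog", "catalog://products", async (uri) => ({
  contents: [{ uri: uri.href, text: "Product list would be served here" }],
}));

// A Tool: an executable function with side effects. Invented for
// illustration; a real server would call its issue tracker's API here.
server.tool(
  "create_issue",
  { title: z.string(), body: z.string() },
  async ({ title, body }) => ({
    content: [{ type: "text", text: `Created issue: ${title}` }],
  })
);

await server.connect(new StdioServerTransport());
```

The distinction matters because an agent can read a Resource speculatively but should invoke a Tool only deliberately; the protocol makes the side-effect boundary explicit.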
Look at who's making that shift. Eighty-one percent of companies running MCP servers have fewer than 200 employees; a Romanian grocery store built one. These aren't organizations with dedicated AI strategy teams. They're businesses that looked at where their next customers were coming from and started building a door for them. The median server exposes just five tools, and many are clearly in "we should probably have one of these" territory. But even a tentative MCP endpoint represents a different relationship than a static file ever could. The site is listening.
The progression is now reaching the standards layer. The W3C AI Agent Protocol Community Group, formed in mid-2025, is drafting specifications for how agents discover, identify, and collaborate across the open web. It remains a community group, not a formal working group with binding authority. But activity is accelerating around it. Google and Microsoft jointly developed WebMCP, shipping an early preview in Chrome 146 that lets web pages declare structured capabilities directly to agents through the browser. A separate community group proposed cryptographic verification of agent actions, addressing the audit trail gap that regulators are beginning to require. When standards bodies start drafting specifications for how to welcome a class of visitor, the relationship shifts again. A file broadcasts. A protocol endpoint converses. A web standard encodes an expectation into how browsers and servers behave by default.
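WebMCP is still an early preview and its surface may change; what follows is a purely illustrative sketch, assuming the navigator.modelContext entry point and a registerTool-style call described in the proposal's explainer, with an invented add_to_cart tool:

```typescript
// Illustrative only: WebMCP is in early preview, and the
// navigator.modelContext entry point and registerTool shape here are
// assumptions based on the proposal's explainer, not a stable API.
const modelContext = (navigator as any).modelContext;

modelContext?.registerTool({
  name: "add_to_cart",
  description: "Add a product to the visitor's shopping cart",
  inputSchema: {
    type: "object",
    properties: { productId: { type: "string" } },
    required: ["productId"],
  },
  // The agent invokes this through the browser, so the page can reuse
  // its existing cart logic instead of standing up a separate server.
  async execute({ productId }: { productId: string }) {
    return { content: [{ type: "text", text: `Added ${productId} to cart` }] };
  },
});
```

The design point is the delivery surface: an MCP server is separate infrastructure, while WebMCP lets the page itself declare capabilities to whatever agent is driving the browser.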
Two years ago, most websites treated non-human traffic as something to block or ignore. Now a grocery store in Romania is exposing executable functions to agents, and the W3C is debating how browsers should formally surface site capabilities to non-human visitors. The posture shift happened faster than the infrastructure to govern it.

