Martijn Koster wanted to call it RobotsNotWanted.txt. His server at Nexor had been crashing under a crawler called Websnarf, and his proposed fix was modest: a text file that webmasters could place at the root of their servers, listing which paths bots should avoid. The name was too long for DOS. Roy Fielding weighed in. By July 1994, when Koster announced "A Standard for Robot Exclusion" on the www-talk mailing list, it was robots.txt.
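The format Koster proposed is still recognizable today: plain text, one record per user agent, one rule per line. A minimal sketch (the bot name and paths here are illustrative, not from Koster's original announcement):

```text
# /robots.txt — placed at the server root
User-agent: Websnarf
Disallow: /

User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10
```

That is nearly the entire grammar. There is no signature, no handshake, no enforcement mechanism; the file only works if the crawler chooses to fetch it and obey it.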
The file itself was almost beside the point. The web in 1994 was small enough that the people writing crawlers were reading the same mailing list where Koster posted his proposal. He noted that most robots in operation already supported it or had promised to soon. The promise held the system together. The file was where you wrote it down.
Thirty-one years later, in December 2025, the RSL Collective published version 1.0 of Really Simple Licensing. XML-based, hosted at /license.xml, it can specify licensing types, payment terms, attribution requirements, and granular permissions distinguishing AI training from search indexing. Reddit, Yahoo, Ziff Davis, Medium, Quora, and O'Reilly Media signed on. The co-founder, Eckart Walther, co-created RSS. The lineage is deliberate.
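Where robots.txt offers a handful of directives, RSL expresses terms in XML. The fragment below is a hypothetical sketch of what a /license.xml expressing the capabilities listed above might look like; the element and attribute names are assumptions for illustration, not a verbatim copy of the published schema:

```xml
<!-- Hypothetical RSL-style license file served at /license.xml.
     Element and attribute names are illustrative only. -->
<rsl xmlns="https://rslstandard.org/rsl">
  <content url="/articles/">
    <license>
      <permits type="usage">search</permits>      <!-- indexing allowed -->
      <prohibits type="usage">train-ai</prohibits><!-- AI training refused -->
      <payment type="attribution"/>               <!-- terms: credit required -->
    </license>
  </content>
</rsl>
```

The structural difference from robots.txt is the vocabulary, not the mechanism: it is still a file at a well-known path, waiting to be read.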
RSL's founders point to ASCAP and BMI as models: collective licensing bodies that let musicians get paid when their work is performed. ASCAP's leverage came from federal consent decrees and decades of litigation that built the enforcement regime. The format was the last layer, added once the legal infrastructure already existed.
Robots.txt was eventually formalized as RFC 9309 in 2022, twenty-eight years after Koster's proposal. The formalization changed nothing about compliance. A Duke University study, analyzing 130 bots over 40 days, found AI search crawlers had the lowest compliance of any category, with fewer than 40% even checking the file within a week. Compliance dropped further as the rules got stricter. A robots.txt that disallowed a bot was less likely to be followed than one that merely set rate limits.
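What "checking the file" means in practice is a few lines of code; Python has shipped a parser in its standard library for decades. A minimal sketch of a cooperative crawler's check, using illustrative rules rather than any real site's policy:

```python
from urllib import robotparser

# Illustrative robots.txt rules; a real crawler would fetch them
# from https://example.com/robots.txt via set_url() and read().
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A compliant bot asks before every fetch and honors the delay.
print(rp.can_fetch("ExampleBot", "/private/data.html"))  # False
print(rp.can_fetch("ExampleBot", "/public/page.html"))   # True
print(rp.crawl_delay("ExampleBot"))                      # 10
```

The Duke numbers describe bots that skip even this step. Nothing in the protocol can make them run it.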
The most vivid evidence came from the courtroom. Ziff Davis modified its robots.txt according to OpenAI's own published instructions for opting out of crawling. Scraping activity increased. A federal court, ruling on the resulting claims, called robots.txt "more akin to a sign than a barrier."
The sign exists. The forest has changed around it.
RSL is a more detailed sign, with a richer vocabulary. No major AI company has publicly committed to reading it. The crawl-to-referral ratio for some AI companies now exceeds 73,000 to 1. The actors who would need to honor the file gain more from ignoring it than from reading it.
The instinct persists anyway. A new format, a new file placed on a server, a new hope that communicating terms clearly enough will cause them to be respected. Formats are the thing technologists know how to build. When the actual problem is incentive alignment, a file is the legible action available to people who can't compel compliance and don't yet have the legal infrastructure to demand it. Maybe RSL's real function will turn out to be the same as robots.txt's: a document that proves you tried to ask. A paper trail for the lawsuit that follows.