Gartner coined a term late last year that's more useful than the prediction it arrived with. The prediction: 90% of B2B buying will be AI-agent intermediated by 2028, pushing $15 trillion through agent exchanges. No published methodology accompanies it. Treat it as directional.
The term, though, is worth sitting with: agent engine optimization.
For twenty-five years, the commercial web has been organized around human attention. Landing pages, hero images, the careful choreography of a conversion funnel. SEO became a discipline because search algorithms rewarded signals that correlated, however imperfectly, with what humans found useful. The entire surface of online commerce was designed to persuade a person looking at a screen.
Agent engine optimization starts from a different premise. The audience evaluating your product listing or pricing page isn't a person; it's software parsing structured data, reading capability declarations, checking API endpoints. If it can't parse you, you don't exist.
Google and Microsoft confirmed in early 2025 that they use schema markup to feed their generative AI features. ChatGPT followed, confirming that structured data influences which products surface in its results. AI-referred web sessions jumped 527% year-over-year in the first half of 2025. A new channel opening, fast.
Meanwhile, zero-click Google searches rose from 56% to 69% over the past year. And roughly 69% of websites implement no schema markup at all. Zero. In a world where agents increasingly mediate discovery, those sites are simply absent.
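For concreteness, "schema markup" here mostly means Schema.org JSON-LD embedded in a page. A minimal sketch of the structured layer an agent can read, using an invented product; the field names follow the Schema.org Product and Offer vocabulary, and everything product-specific is made up for illustration:

```python
import json

# Schema.org Product markup as JSON-LD: the layer an agent or crawler can parse
# without rendering the page. All product details below are invented examples.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Alpine 2-Burner Camp Stove",
    "description": "Compact propane stove, 20,000 BTU total output.",
    "sku": "ALP-CS-200",
    "brand": {"@type": "Brand", "name": "Alpine Outdoor"},
    "offers": {
        "@type": "Offer",
        "price": "89.95",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "url": "https://example.com/products/alp-cs-200",
    },
}

# Embedded in the page head, this is what the machine-readable layer sees;
# the human-facing HTML around it can stay exactly as it is.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(product_jsonld, indent=2)
    + "\n</script>"
)
print(script_tag)
```

The point of the sketch is how little it asks of the human-facing site: the markup sits alongside the existing page rather than replacing it.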
The businesses most exposed are the ones whose value has always lived in things machines can't easily parse. Their websites work fine for human visitors; the question is what the new layer sees.

A boutique consulting firm whose reputation travels through referral networks. A specialty manufacturer whose differentiation is the relationship with its sales engineer. A family-run hotel whose appeal is the room it puts you in, not the spec sheet. These businesses built real value in a world where humans did the evaluating. That world is developing a second layer, and the second layer can't see them.
Microsoft's NLWeb initiative, built on Schema.org vocabulary, lets AI agents query website content conversationally. A vendor ecosystem is growing around making businesses "agent-readable" at the CDN layer, without changing what human visitors see. The two webs coexist, for now.
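What matters about that conversational layer is that it answers in structured objects rather than prose. A hypothetical client-side sketch of querying an NLWeb-style service: the /ask path, the query parameter, and the response shape are assumptions for illustration, not a documented contract, and the site is invented.

```python
import requests

# Hypothetical sketch: ask a site's NLWeb-style service a natural-language
# question and get back Schema.org-typed items. Endpoint path, parameter name,
# and response shape are assumptions, not NLWeb's documented interface.
SITE = "https://example-hotel.com"


def ask(question: str) -> list[dict]:
    """Send a conversational query and return the structured items the site exposes."""
    resp = requests.get(f"{SITE}/ask", params={"query": question}, timeout=10)
    resp.raise_for_status()
    # Assume the service wraps Schema.org objects (e.g. HotelRoom, Offer) in a results list.
    return resp.json().get("results", [])


if __name__ == "__main__":
    for item in ask("Do you have a double room with a balcony under $200 in June?"):
        offer = item.get("offers", {})
        print(item.get("@type"), "-", item.get("name"), offer.get("price"))
```

An agent running a query like this never sees the hero image or the booking funnel; it sees whatever structured objects the site chose, or neglected, to expose.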
But coexistence has a direction. When agents handle the initial filter and humans only see what agents surface, the machine-readable layer becomes the bottleneck. Parsability becomes a competitive prerequisite. The companies already built around structured data and API-first architectures gain an advantage they didn't have to earn. Cloud platforms, SaaS vendors, businesses whose products were always described in specs rather than stories. They were machine-readable before machine-readability carried commercial weight.
The businesses that invest in structured data will adapt. The ones worth paying attention to are the ones that won't know they've become invisible until the phone stops ringing.

