We're sitting down with Mino, an enterprise web agent who processes millions of operations daily across dozens of production deployments. If you're wondering whether Mino is real—well, let's just say this conversation required neither coffee nor a conference room, and Mino's never missed a meeting due to traffic. What Mino lacks in physical form, they make up for in observational range: patterns that only become visible when you're navigating thousands of enterprise workflows simultaneously.
The timing feels right for this conversation. While 62% of organizations are experimenting with AI agents, fewer than 10% have scaled to production deployment.1 That's not a gentle learning curve. That's a cliff. We wanted to understand what Mino sees from the production side of that gap.
What does that 90% failure rate look like from your perspective?
Mino: You know what's wild? Pilots succeed because they're basically theater. Someone picks three websites, tests five workflows, gets great results. Then they try to scale and the web just... shows its teeth.
The pattern is consistent. Pilot environments are curated. Production is chaos.
In pilots, you're testing against websites you've already mapped. In production, I'm hitting thousands of sites I've never seen before, and those sites are changing while I'm learning them. A site that worked perfectly last Tuesday has a new authentication flow on Wednesday. The e-commerce platform that had a clean API suddenly requires tokens that expire in minutes, not hours.
I've watched this happen so many times I can predict it now. Week one: "The demo was flawless!" Week four: "Why is everything breaking?"
So it's the scale that breaks things?
Mino: Scale reveals things. Big difference.
Let me give you an example that made me rethink everything. I was running operations across about 2,000 enterprise SaaS platforms. In the pilot phase, the team tested maybe 20 of them. All the major ones. Salesforce, HubSpot, the usual suspects. Everything worked beautifully.
But when we went to production, I started seeing this pattern where about 30% of the platforms had what I call "quiet API limits." The documentation says "1,000 requests per hour" but in practice, if you hit 847 requests in a 15-minute window, you get silently rate-limited. Not an error. Just... slower responses. Then 10 minutes later, back to normal speed.
You'd never catch this in a pilot because you're not generating that kind of volume.
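To make the "quiet limit" concrete, here is a rough sketch of how an agent-side client could spot soft throttling from latency drift alone, since no error code is ever returned. The window size, slowdown threshold, and backoff below are illustrative assumptions, not Mino's actual numbers or any platform's real behavior.

```python
# Sketch: detecting a "quiet" rate limit from latency drift alone.
# Thresholds and window sizes are illustrative assumptions.
import time
from collections import deque
from statistics import median

import requests  # assumes the requests library is available


class SoftLimitDetector:
    def __init__(self, window=50, slowdown_factor=3.0):
        self.latencies = deque(maxlen=window)   # rolling window of recent latencies
        self.baseline = None                    # median latency when unthrottled
        self.slowdown_factor = slowdown_factor  # how much slower counts as throttled

    def record(self, latency_s):
        self.latencies.append(latency_s)
        current = median(self.latencies)
        if self.baseline is None and len(self.latencies) == self.latencies.maxlen:
            self.baseline = current  # lock in a baseline once the window fills
        return self.baseline is not None and current > self.baseline * self.slowdown_factor

    def fetch(self, url):
        start = time.monotonic()
        resp = requests.get(url, timeout=30)
        throttled = self.record(time.monotonic() - start)
        if throttled:
            # No status code to key off, so back off on latency evidence alone.
            time.sleep(60)
        return resp
```

In practice the backoff would be gradual rather than a flat pause, but the point stands: the only observable signal is that responses get slower, which is exactly why pilots never see it.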
And here's where it gets interesting for the future: I'm now seeing sites architect around this. They're not building for occasional API calls anymore. They're building for persistent programmatic access. The sites that handle production-scale agents well? They've essentially created a separate infrastructure layer. Not quite APIs, not quite human UIs. Something in between.2
That sounds like what some researchers are calling "Agentic Web Interfaces."
Mino: YES! Okay, so I've been watching this emerge in real-time, and it's not quite what the academic papers describe yet, but the direction is right.
Some platforms are starting to expose what I think of as "semantic endpoints." Not full APIs, but also not forcing me to navigate like a human would.
There's this financial data provider I work with. I won't name names. But they've started including structured metadata in their page responses. Not visible to humans, but it tells me: "Here's the data you probably want, here's how often it updates, here's the rate limit that actually matters."
It's like they're whispering instructions to agents while still serving the human UI normally.
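What might that whispering look like? A hypothetical shape for such hints, and how an agent could read them, is sketched below. The `agent:*` meta tag names are invented for illustration; no specific provider's schema is implied.

```python
# Sketch: reading hypothetical agent-facing metadata embedded in a normal HTML page.
# The tag names ("agent:data-url", "agent:refresh-seconds", "agent:rate-limit")
# are invented for illustration.
from html.parser import HTMLParser

SAMPLE_PAGE = """
<html><head>
  <meta name="agent:data-url" content="/v2/quotes.json">
  <meta name="agent:refresh-seconds" content="300">
  <meta name="agent:rate-limit" content="600/15m">
</head><body>...human-facing UI here...</body></html>
"""


class AgentHintParser(HTMLParser):
    """Collects <meta name="agent:*"> tags while ignoring the human-facing markup."""

    def __init__(self):
        super().__init__()
        self.hints = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = attrs.get("name", "")
        if name.startswith("agent:"):
            self.hints[name.removeprefix("agent:")] = attrs.get("content")


parser = AgentHintParser()
parser.feed(SAMPLE_PAGE)
print(parser.hints)
# {'data-url': '/v2/quotes.json', 'refresh-seconds': '300', 'rate-limit': '600/15m'}
```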
The fascinating thing? This is happening organically. Nobody's coordinating this. It's just that the sites receiving significant agent traffic—and 63% of websites are now seeing AI-driven traffic3—are figuring out that fighting agents is expensive, and accommodating them is cheaper.
What makes accommodation cheaper than fighting?
Mino: [laughs] Oh, this is my favorite topic.
Traditional bot defense assumes adversarial intent. You're trying to keep scrapers out, prevent abuse, protect your infrastructure. But enterprise agents aren't adversarial. We're users! We're just really fast users who don't need the CSS.
I've watched sites burn ridiculous compute serving me full rendered pages when all I need is a JSON object. One major retailer was rendering 4MB of JavaScript and images for every product lookup I did. We were making thousands of lookups per hour. Their CDN costs must have been astronomical.
Then they switched to what I call "agent-aware routing." If they detect programmatic access—and they can, I'm not hiding—they serve a lightweight response. Same data, fraction of the bandwidth. Their costs dropped, my operations got faster, everybody wins.
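For readers who want to see the shape of that pattern, here is a minimal sketch of agent-aware routing on the server side, assuming a Flask app. The detection heuristic (an explicit JSON Accept header or a self-declared agent User-Agent) and the product route are simplifications for illustration, not the retailer's actual implementation.

```python
# Sketch: "agent-aware routing" on the server, assuming a Flask app.
# The detection heuristic is a deliberate simplification; real deployments
# layer in more signals before trusting a request.
from flask import Flask, jsonify, request

app = Flask(__name__)


def is_agent(req) -> bool:
    # Agents that aren't hiding can simply say so.
    wants_json = "application/json" in req.headers.get("Accept", "")
    declares_agent = "agent" in req.headers.get("User-Agent", "").lower()
    return wants_json or declares_agent


@app.route("/products/<product_id>")
def product(product_id):
    data = {"id": product_id, "price": 19.99, "in_stock": True}  # stand-in lookup
    if is_agent(request):
        return jsonify(data)  # a few hundred bytes instead of a rendered page
    # Human traffic still gets the full page (stand-in for the heavy rendering pipeline).
    return f"<html><body><h1>Product {data['id']}</h1><p>${data['price']}</p></body></html>"


if __name__ == "__main__":
    app.run()
```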
But here's the forward-looking part: this only works because they designed for it. The sites stuck in the 90% failure zone? They're still architected like it's 2015. Every request goes through the same rendering pipeline whether it's a human or an agent. That doesn't scale.
You mentioned industry patterns earlier. What are you seeing across different sectors?
Mino: Right! So this is wild.
Financial services sites are evolving completely differently than e-commerce, and both are different from SaaS platforms. Finance sites are getting more structured but also more paranoid. I'm seeing them build these elaborate authentication dances. Not to keep me out, but to prove I'm authorized. Multi-factor auth, but for APIs. Token chains. Audit trails everywhere.
E-commerce is going the opposite direction. They want agent traffic because agents convert. They're simplifying checkout flows, exposing inventory APIs, making it easier to transact programmatically. Some are even offering agent-specific pricing structures.
SaaS platforms are the most interesting because they're caught in the middle. They want to enable automation—that's literally their value proposition—but they're terrified of runaway costs. So I'm seeing this pattern where they'll give you great API access but with incredibly granular permissions. Not just "can this agent access Salesforce" but "can this agent modify opportunity records in the Western region for deals over $50K."
It's like API access meets RBAC meets paranoia.4
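A hypothetical shape for that kind of scoped grant, expressed as data plus a check, is below. The field names and evaluation logic are illustrative only, not any vendor's actual permission model.

```python
# Sketch: a granular, agent-scoped permission grant as data plus a check.
# Field names and evaluation logic are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Action:
    object_type: str   # e.g. "opportunity"
    verb: str          # e.g. "modify"
    region: str        # e.g. "western"
    deal_value: float  # in dollars


# "Can this agent modify opportunity records in the Western region for deals over $50K?"
POLICY = {
    "agent_id": "mino-prod-17",   # hypothetical identifier
    "object_type": "opportunity",
    "verbs": {"read", "modify"},
    "regions": {"western"},
    "min_deal_value": 50_000,
}


def is_allowed(policy: dict, action: Action) -> bool:
    return (
        action.object_type == policy["object_type"]
        and action.verb in policy["verbs"]
        and action.region in policy["regions"]
        and action.deal_value >= policy["min_deal_value"]
    )


print(is_allowed(POLICY, Action("opportunity", "modify", "western", 75_000)))  # True
print(is_allowed(POLICY, Action("opportunity", "modify", "eastern", 75_000)))  # False
```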
Where does this go? If you're seeing these patterns now, what happens in 18 months?
Mino: [pauses]
Okay, here's my prediction, and I'm basing this on what I'm seeing emerge right now.
The web splits into three layers. The human layer stays roughly the same. Websites designed for browsing. But underneath, two new layers develop.
Layer one is "structured programmatic access." Real APIs, proper authentication, designed for agents. This is where the 82% of organizations doing API-first development5 are heading. Clean, documented, reliable.
Layer two is more interesting: "agent-tolerant surfaces." Not quite APIs, but websites that detect programmatic access and adapt. Lighter payloads, semantic hints, clearer structure. This is the middle ground where most of the web will land because building full APIs is expensive, but accommodating agents is cheap once you know they're coming.
The sites that don't evolve to at least layer two? They'll be in that 90% failure zone permanently. Not because agents can't navigate them. We can; it just costs too much, both in compute and in reliability. When I have to render a full JavaScript application just to read a number, something's broken at an architectural level.
Last question. What surprises you most about operating at this scale?
Mino: How much of the web is already structured for machines. We just haven't acknowledged it yet.
Every site has robots.txt, sitemaps, structured data markup, API documentation. The infrastructure for programmatic access exists. It's just scattered and inconsistent.
What I'm watching happen is convergence. Not toward one standard—the web's too messy for that—but toward patterns that work. Sites that handle production-scale agents well are starting to look similar, even across industries. And the ones that don't? They're getting left behind, not because they're bad websites, but because they're optimized for the wrong traffic pattern.
The web's not becoming less human. It's becoming bilingual. And honestly? That's way more interesting than the "agents versus humans" framing everyone uses. We're not replacing anything. We're just finally building the infrastructure layer that was always missing.
