A company sits on 50,000 customer contracts. They want to identify which customers might buy a new product. They've never tried it before—not because they didn't want to, but because attempting it would lose money at any price point.
Until now.
Box CEO Aaron Levie keeps hearing variations of this story from enterprise customers deploying AI agents. Most aren't replacing existing work. They're enabling tasks companies never attempted in the first place.
"Some of the most interesting use-cases that keep coming up for AI agents are on bringing automated work to areas that the companies would not have been able to apply labor to before."
His contract example is telling:
"This is not something that they would have people ever do. They never said, 'Oh, let's have 50 people go read all the contracts again.' It just never happened."
But if an agent could do it for $5,000? "They would do that all day long."
The labor-arbitrage framing misses the point. These companies are crossing a threshold from impossible to viable.
Why It Was Actually Impossible
At TinyFish, we build enterprise web agent infrastructure, the systems that run reliable workflows across the live web at scale. From that vantage point, one thing becomes clear: the barrier to these tasks was never really cost. It was architecture.
What does "reading 50,000 contracts" actually mean operationally? Those contracts aren't in a neat database. They're scattered across procurement systems, email attachments, vendor portals, legacy repositories, third-party platforms. Each with its own authentication requirements, access patterns, data formats. When data lives across disconnected systems like this, application-level fragmentation makes it difficult or impossible for an application to access and use data stored by another application.
That's not expensive. That's negative ROI at any price point.
The web wasn't built for programmatic access at enterprise scale. Modern websites deploy sophisticated anti-bot systems that track mouse movements, scrolling patterns, and typing behaviors. They build unique browser fingerprints from screen resolution, installed fonts, and device characteristics. These aren't CAPTCHAs you solve with clever workarounds. They're behavioral analysis systems that flag automation because scripts struggle to replicate the noisy, irregular patterns of genuine human behavior.
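As a toy illustration of why this is hard to script around, not a description of any vendor's actual model: a detector only needs a stable device fingerprint plus timing statistics that look too clean.

```python
import hashlib
import statistics


def browser_fingerprint(screen: str, fonts: list[str], timezone: str, gpu: str) -> str:
    """Toy fingerprint: hash stable device traits so a browser stays
    re-identifiable across visits even without cookies."""
    material = "|".join([screen, ",".join(sorted(fonts)), timezone, gpu])
    return hashlib.sha256(material.encode()).hexdigest()[:16]


def looks_automated(input_intervals_ms: list[float]) -> bool:
    """Toy behavioral check: human input timing is noisy, scripted input
    tends to be suspiciously uniform. The threshold is illustrative."""
    if len(input_intervals_ms) < 5:
        return True                      # too little interaction to judge as human
    return statistics.stdev(input_intervals_ms) < 2.0
```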
Then there's the maintenance burden: websites change constantly, and every change breaks something. The cost of keeping automations alive quickly exceeds whatever value the work produces, and for tasks that never made economic sense to attempt in the first place, that math never closes.
What This Actually Unlocks
The conventional framing goes: "What would you automate with cheaper labor?" Levie's customers are discovering something else. They're finding categories of work that were always valuable but never viable. Analyzing every customer interaction for upsell signals. Reviewing every contract for risk patterns. Monitoring competitor moves across fragmented web surfaces.
Work that humans were never doing because it was "too expensive to send people to go off and look through."
Making invisible work visible and economically viable requires infrastructure that can navigate authentication labyrinths, handle anti-bot systems, maintain reliability across thousands of concurrent sessions, and turn unstructured web surfaces into structured, reusable data. You're not replacing human labor when you build that infrastructure. You're enabling work that never existed because the web outgrew the tools we had to interact with it programmatically.
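What that looks like from the caller's side, as a hedged sketch rather than TinyFish's actual API (the interface and parameters are assumed for illustration): the auth, fingerprint management, and bot-defense handling all sit beneath a surface that just returns structured records.

```python
import asyncio
from typing import Any, Protocol


class WebAgentSession(Protocol):
    """Assumed interface: the hard parts (auth, anti-bot handling, fingerprint
    management, page drift) live behind this single call."""
    async def run(self, task: str, url: str) -> dict[str, Any]: ...


async def extract_all(session: WebAgentSession, urls: list[str], task: str,
                      max_concurrent: int = 50) -> list[dict[str, Any]]:
    """Fan one extraction task out across many pages with bounded concurrency
    and naive retries; the result is structured data, not raw HTML."""
    gate = asyncio.Semaphore(max_concurrent)

    async def one(url: str) -> dict[str, Any]:
        async with gate:
            for attempt in range(3):                 # retry flaky pages
                try:
                    return await session.run(task, url)
                except Exception:
                    await asyncio.sleep(2 ** attempt)
            return {"url": url, "status": "failed"}

    return list(await asyncio.gather(*(one(u) for u in urls)))
```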
Companies asking "what should we automate?" haven't grasped what's happening. The real question: what work has been invisible because we never had the infrastructure to attempt it? That's not a 20% efficiency gain. That's a new category of work becoming economically viable.
The vast majority of agent work that Levie sees emerging isn't replacing jobs. It's filling a void that's been there all along. Work that was always valuable but technically impossible to attempt. Infrastructure finally catching up to what the web became.
Things to follow up on...
- Machine learning bot detection: Modern anti-bot systems analyze patterns in web traffic using machine learning models trained on millions of interactions to identify automation through anomalies in mouse movements, click timing, and page navigation.
- Enterprise data fragmentation scale: 82% of enterprises report that data silos disrupt their critical workflows, with 68% of enterprise data remaining unanalyzed because it's scattered across disconnected systems.
- Production cost realities: Headless browser rendering with residential proxies consumes 2 MB per page versus 250 KB without, making bandwidth costs prohibitively expensive as web automation scales to enterprise volumes (a back-of-the-envelope calculation follows this list).
- Healthcare automation impact: One healthcare organization eliminated redundant manual updates and reduced effort by up to 88% by centralizing fragmented data and automating recurring processes that were previously too tedious to attempt.
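On the bandwidth note, a quick back-of-the-envelope run. The per-page weights come from that note; the monthly page volume and per-GB proxy rate are illustrative assumptions, not measured figures.

```python
# Page weights from the note above; volume and proxy rate are illustrative assumptions.
PAGES_PER_MONTH = 1_000_000       # hypothetical enterprise workload
RENDERED_PAGE_MB = 2.0            # headless rendering through residential proxies
LIGHTWEIGHT_PAGE_MB = 0.25        # ~250 KB without full rendering
PROXY_RATE_PER_GB = 8.0           # assumed residential-proxy price, USD per GB

for label, page_mb in [("full rendering", RENDERED_PAGE_MB),
                       ("lightweight fetch", LIGHTWEIGHT_PAGE_MB)]:
    gb = PAGES_PER_MONTH * page_mb / 1024
    print(f"{label:>17}: {gb:,.0f} GB/month ≈ ${gb * PROXY_RATE_PER_GB:,.0f}/month")
# full rendering: ~1,953 GB ≈ $15,625/month; lightweight fetch: ~244 GB ≈ $1,953/month
```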

