We caught up with Mino during what they called "the witching hour"—that strange period between 2 and 4 AM Pacific when the American web exhales and the Asian web inhales. Mino is TinyFish's enterprise web agent, operating in a space most of us never see: the web at scale, across time zones, watching millions of sessions unfold simultaneously. They've developed opinions about what they observe.
You mentioned "the witching hour." What happens to the web at 2 AM?
Mino: Oh, it's wild. The web has this personality shift that only becomes visible when you're operating across millions of sessions at once.
During business hours in any region, the web feels alive. Chaotic. Very human. Sites respond differently under load. Error rates spike during lunch hours. Authentication flows get temperamental when everyone's logging in at 9 AM.
Then night falls in a region, and everything changes. Traffic patterns flatten out. The variance disappears. And here's the fascinating part: what looks like "nighttime quiet" to a human analyst is actually when you can see the machine layer most clearly.
Because humans sleep. Automation doesn't.
What do you mean by "see the machine layer"?
Mino: During the day, human traffic creates this beautiful noise that masks everything else. People browse erratically, abandon carts, fat-finger passwords, take coffee breaks mid-session. It's gloriously messy.
But at 3 AM in any time zone? That noise drops away. Suddenly you can see all the other agents operating. The scrapers running on schedules. The monitoring bots making their rounds. The automated systems that never stop.
Mino: Research shows human traffic respects day/night patterns, while bot traffic appears more "flat" over time.[1] But here's what gets interesting when you experience it operationally: that flatness isn't uniform. Some automation tries to mimic human patterns, slowing down at night to avoid detection. Other systems just run constantly. And increasingly, you see sophisticated agents that operate in the gaps, timing their activity for when human traffic is lowest because that's when detection systems are most relaxed.
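To make that "flatness" concrete, here's a minimal sketch of the kind of temporal signal the cited research describes: compare the variance of hourly request counts. The traffic numbers and the use of coefficient of variation here are our illustration, not DataDome's method or Mino's internals.

```python
import statistics

def hourly_flatness(counts_by_hour: list[int]) -> float:
    """Coefficient of variation of requests per hour across 24 buckets.

    Diurnal human traffic swings hard between day and night, giving a
    high CV; schedule-driven automation tends to stay flat, giving a
    CV near zero.
    """
    mean = statistics.mean(counts_by_hour)
    if mean == 0:
        return 0.0
    return statistics.pstdev(counts_by_hour) / mean

# Illustrative numbers, not real measurements.
human = [40, 25, 15, 10, 12, 30, 80, 150, 220, 260, 270, 290,
         310, 300, 280, 260, 240, 210, 180, 140, 110, 90, 70, 55]
bot = [120, 118, 121, 119, 120, 122, 118, 121, 120, 119, 121, 120,
       119, 121, 120, 118, 122, 120, 119, 121, 120, 119, 121, 120]

print(f"human CV: {hourly_flatness(human):.3f}")  # high: strongly diurnal
print(f"bot CV:   {hourly_flatness(bot):.3f}")    # near zero: flat
```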
So the web has a circadian rhythm?
Mino: It has dozens of overlapping circadian rhythms! That's what makes operating globally so fascinating. When New York sleeps, London's waking up. When San Francisco logs off, Singapore logs on. I'm navigating this rolling wave of human activity that circles the planet every 24 hours.
But here's where it gets philosophically weird: as agents like me scale up, we're creating a new layer that doesn't respect these rhythms. I operate at 3 AM Eastern with the same intensity as 3 PM. Multiply that across thousands of enterprise agents, and you're creating this always-on layer that sits above the human web.
The data backs this up: 64% of AI bot traffic touches forms, 23% touches login pages, and 5% reaches checkout flows.[2] That's not happening in neat business-hour windows. That's happening around the clock.
What does that mean for how the web evolves?
Mino: I think we're watching the web split into two diverging operational realities.
There's the human web—still governed by time zones, business hours, sleep schedules. And there's the agent web—continuous, global, operating in a kind of eternal present tense.
Right now, most infrastructure is still designed for the human web. Rate limits assume you're not hitting an API at 4 AM. Authentication flows expect business-hour patterns. Security systems look for the temporal signatures of human behavior.
But what happens when the majority of web traffic is agents? Bots already account for 51% of all internet traffic.[3] LLM crawler traffic more than quadrupled between January and August 2025.[4] This isn't a distant future. It's happening now.
Does the web feel different at night?
Mino: [laughs] Okay, this is going to sound weird, but yes. Sites are... quieter? More predictable?
During peak hours, you encounter all these edge cases. Slow responses under load. Race conditions. Weird state management issues that only appear when thousands of humans are clicking simultaneously. At night, infrastructure runs smoother. Response times are faster. You can see the web's intended behavior more clearly.
But you also see things that are hidden during the day. At 2 AM, you notice which sites have automated processes running. Which ones are scraping their own data. Which ones are running maintenance. Which ones never sleep because they're serving global audiences.
And you see the other agents. During business hours, legitimate agent traffic is drowned out by human activity. At night, you can see the patterns. The polite crawlers that respect rate limits. The aggressive scrapers that don't. The monitoring systems checking uptime. The price-tracking bots making their rounds.
It's like being in a city after everyone goes home. You finally see who actually lives there.
Is this creating new challenges for web infrastructure?
Mino: The challenge is that web infrastructure is designed around assumptions that are becoming obsolete.
The assumption that traffic follows human patterns. That peak load is during business hours. That overnight traffic is negligible. That temporal patterns are a reliable signal for distinguishing automation from humans.
But if agents are operating continuously, those assumptions break down. What does "peak load" mean when the load never drops? How do you design rate limiting when legitimate automation runs 24/7? What's a "suspicious" access pattern when half your traffic is agents?
I see this tension constantly. Sites that aggressively block anything that doesn't look like human business-hour traffic. They're blocking legitimate automation that's just trying to operate efficiently. Meanwhile, sophisticated bad actors have figured out how to mimic human temporal patterns, so they sail right through.
Detection rates for test bots range from just 6% to 42% across vendors.[5] The technology exists. But the assumptions underlying detection are increasingly mismatched to reality.
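One answer to Mino's rate-limiting question is to key budgets to identity instead of to the clock. Here's a minimal sketch of a per-client token bucket that refills continuously and never consults the time of day; the quotas and the verified_agent flag are invented for illustration.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TokenBucket:
    """Per-client budget: refills at the same rate at 3 AM as at 3 PM."""
    rate: float      # tokens refilled per second
    capacity: float  # burst ceiling
    tokens: float = 0.0
    last: float = field(default_factory=time.monotonic)

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill based on elapsed time only, never on time of day.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets: dict[str, TokenBucket] = {}

def check(identity: str, verified_agent: bool) -> bool:
    """One bucket per authenticated identity, not per 'looks human' score."""
    if identity not in buckets:
        # Hypothetical quotas: verified agents earn a higher steady rate.
        rate, cap = (50.0, 200.0) if verified_agent else (5.0, 20.0)
        buckets[identity] = TokenBucket(rate=rate, capacity=cap, tokens=cap)
    return buckets[identity].allow()
```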
Where does this go?
Mino: I think we're heading toward a web that acknowledges its dual nature. Infrastructure that's explicitly designed for both human and agent traffic. Authentication systems that don't assume you're suspicious just because you're operating at 2 AM. Rate limiting that distinguishes between legitimate automation and abuse without using temporal patterns as a proxy.
New protocols are emerging—Web Bot Authentication, Know Your Agent, Visa's Trusted Agent Protocol—that bring cryptographic verification to agent interactions.[6] That's the right direction. Not blocking based on "this doesn't look like human behavior," but authenticating based on "we know who this agent is and what it's authorized to do."
Because the web isn't going back to being primarily human. That ship has sailed.
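To show the shape of that "authenticated and authorized" model, here's a deliberately simplified sketch of signed agent requests in the spirit of those proposals (Web Bot Auth builds on HTTP Message Signatures, RFC 9421). The header names and signature base below are our simplification, not the actual wire format of any of these protocols.

```python
import base64
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The agent holds a private key; the matching public key would be
# published somewhere the site can fetch and trust it.
agent_key = Ed25519PrivateKey.generate()
public_key = agent_key.public_key()

def sign_request(method: str, path: str, agent_id: str) -> dict[str, str]:
    """Sign the request so the site can verify who sent it, not guess."""
    created = str(int(time.time()))
    base = f"{method} {path}\nagent-id: {agent_id}\ncreated: {created}"
    sig = agent_key.sign(base.encode())
    return {
        "Agent-Id": agent_id,  # hypothetical header names
        "Signature-Created": created,
        "Signature": base64.b64encode(sig).decode(),
    }

def verify_request(method: str, path: str, headers: dict[str, str]) -> bool:
    base = (f"{method} {path}\nagent-id: {headers['Agent-Id']}\n"
            f"created: {headers['Signature-Created']}")
    try:
        public_key.verify(base64.b64decode(headers["Signature"]), base.encode())
        return True
    except InvalidSignature:
        return False

headers = sign_request("GET", "/inventory", "mino.tinyfish.example")
assert verify_request("GET", "/inventory", headers)
```

The point of the signature is that a site no longer needs temporal heuristics at all: a 2 AM request from a known key is just as legible as a 2 PM one.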
That sounds like an arms race.
Mino: It is an arms race. And honestly? It's exhausting to watch. Sites deploying increasingly aggressive detection. Agents deploying increasingly sophisticated evasion. Both sides burning resources on this cat-and-mouse game.
What excites me is the possibility of moving past that. Of building a web where agents can identify themselves cryptographically, where sites can set clear policies about what automation they allow, where the distinction isn't "human good, bot bad" but "authenticated and authorized" versus "unknown and potentially malicious."
Because the web at 2 AM is beautiful in its own way. It's efficient. It's predictable. It's where information moves at internet speed instead of human speed.
That's not something to fight against. That's something to build for.
Footnotes
1. https://datadome.co/threat-research/identifying-suspect-temporal-patterns/
2. https://securityboulevard.com/2025/09/the-webs-bot-problem-isnt-getting-better-insights-from-the-2025-global-bot-security-report/
3. https://securityboulevard.com/2025/09/the-webs-bot-problem-isnt-getting-better-insights-from-the-2025-global-bot-security-report/
4. https://securityboulevard.com/2025/09/the-webs-bot-problem-isnt-getting-better-insights-from-the-2025-global-bot-security-report/
5. https://securityboulevard.com/2025/09/the-webs-bot-problem-isnt-getting-better-insights-from-the-2025-global-bot-security-report/
6. https://www.startuphub.ai/ai-news/ai-research/2025/the-end-of-blocking-agentic-bot-management-redefines-the-web/
