One price every 36 seconds. That's the rhythm of manual price checking when someone's efficient and the websites cooperate. Open tab, wait for load, scroll to product, copy price, switch to spreadsheet, paste. Repeat. For hours.
Pricing intelligence vendors put the practical ceiling of this workflow at about 100 prices per hour. Meanwhile, Amazon reprices products every ten seconds. The person checking prices manually isn't competing with other humans anymore. They're racing algorithmic systems that never stop, and 73% of Amazon sellers still work this way.
A Morning of Price Checks
The manual checker starts their morning routine. First twenty sites load fine. Then check number 47 hits a CAPTCHA—select all the traffic lights. The rhythm breaks. Authentication times out on check 63. Log back in.
The competitor site that was fast yesterday is inexplicably slow today. Wait. Refresh. Wait again.
Site 82 redesigned overnight. The price used to be right there, top right corner. Now it's... somewhere. Scroll down. There. Different layout, same number, but it took three extra minutes to find. Mental note: update the checking sequence. By tomorrow, muscle memory will adapt.
The manual checker navigates complexity that only becomes visible when you try to systematize it. They handle authentication by logging in interactively. They solve CAPTCHAs when they appear. They space out checks naturally, never triggering rate limits. When a site changes, they adapt instantly. This adaptability is their advantage. It's also why replacing them is harder than it looks.
What Automation Has to Handle
Building web automation at scale means explicitly handling every challenge the manual checker navigates unconsciously. That CAPTCHA at check 47? Automation has to detect it, route around it, or solve it programmatically. The authentication timeout? Now you're managing session state across hundreds of sites simultaneously, each with different timeout behavior, different cookie requirements, different security patterns.
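The session problem alone forces you to track authentication state per site. Below is a minimal sketch in Python of what that bookkeeping looks like, assuming a simple form-based login and a known per-site session lifetime; the domains, login endpoints, and TTL values are placeholders, not any real competitor's behavior.

```python
import time
import requests

# Hypothetical per-site configuration: each competitor site gets its own
# login endpoint and an assumed session lifetime.
SITE_CONFIGS = {
    "competitor-a.example.com": {
        "login_url": "https://competitor-a.example.com/login",
        "session_ttl": 900,    # session goes stale after ~15 minutes
    },
    "competitor-b.example.com": {
        "login_url": "https://competitor-b.example.com/account/signin",
        "session_ttl": 3600,   # ~1 hour
    },
}

class SiteSession:
    """One authenticated requests.Session per site, re-logging in when it goes stale."""

    def __init__(self, domain, config, credentials):
        self.domain = domain
        self.config = config
        self.credentials = credentials
        self.session = requests.Session()
        self.last_login = 0.0

    def _login(self):
        # Placeholder login flow; real sites differ (hidden form tokens, MFA, SSO).
        resp = self.session.post(self.config["login_url"],
                                 data=self.credentials, timeout=15)
        resp.raise_for_status()
        self.last_login = time.time()

    def get(self, url):
        # Re-authenticate proactively when the session is likely expired,
        # and reactively when the site answers with a 401/403.
        if time.time() - self.last_login > self.config["session_ttl"]:
            self._login()
        resp = self.session.get(url, timeout=15)
        if resp.status_code in (401, 403):
            self._login()
            resp = self.session.get(url, timeout=15)
        return resp
```

Multiply that by hundreds of sites, each with its own quirks, and the state management is already a system of its own.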
The site redesign? The manual checker just looked for the number in a new location. Automated systems break silently. The price that was in a <span class="price"> is now in a <div class="sale-price">. The scraper returns null for three days until someone notices the gap in the data.
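One way to keep that failure from being silent is a fallback chain of selectors plus a hard check on the result, so a layout change raises an alert instead of writing nulls for three days. A rough sketch, assuming BeautifulSoup and purely hypothetical selectors:

```python
from bs4 import BeautifulSoup

# Ordered fallback selectors: the old layout first, newer guesses after.
PRICE_SELECTORS = ["span.price", "div.sale-price", "[data-testid='product-price']"]

def extract_price(html: str, url: str) -> float:
    soup = BeautifulSoup(html, "html.parser")
    for selector in PRICE_SELECTORS:
        node = soup.select_one(selector)
        if node and node.get_text(strip=True):
            text = node.get_text(strip=True).replace("$", "").replace(",", "")
            try:
                return float(text)
            except ValueError:
                continue  # matched the wrong element; try the next selector
    # Fail loudly instead of quietly returning null for days.
    raise ValueError(f"No price found at {url}; selectors may be stale")
```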
When we build enterprise web agent infrastructure, this pattern shows up everywhere: the web wasn't built for systematic monitoring. It was built for humans with browsers, making occasional visits, behaving unpredictably. Authentication expects interactive login. Rate limits trigger when patterns seem too regular. Bot detection fires when behavior looks too systematic. Regional variations show different prices depending on location signals.
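The "too regular" part is the simplest of those to illustrate. A scheduler that spaces checks the way a person naturally would might look something like this sketch; the interval and pause values are assumptions to tune per site, not recommendations.

```python
import random
import time

def humanlike_delay(base_seconds: float = 30.0, jitter: float = 0.5) -> None:
    """Sleep for a randomized interval so request timing never forms a regular pattern."""
    # Uniform jitter around the base interval, e.g. 15-45 seconds when jitter=0.5.
    delay = base_seconds * random.uniform(1 - jitter, 1 + jitter)
    time.sleep(delay)

def maybe_long_pause(probability: float = 0.05, pause_range=(120, 600)) -> None:
    """Occasionally take a longer break, the way a person drifts off to another task."""
    if random.random() < probability:
        time.sleep(random.uniform(*pause_range))
```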
The manual checker handles all of this through human adaptability. Automation requires infrastructure—session management, error recovery, monitoring systems that know when things break, fallback strategies for when sites change. Depth that goes far beyond "just scrape the website."
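The monitoring piece can start as small as a freshness check: if a site hasn't produced a successful price in some window, someone gets told before the gap reaches three days. A minimal sketch, assuming you record a timestamp for every successful extraction; the threshold is arbitrary and would be tuned per site.

```python
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(hours=12)  # assumed threshold; tune per site

def find_stale_sites(last_success: dict[str, datetime]) -> list[str]:
    """Return sites whose most recent successful extraction is older than the threshold."""
    now = datetime.now(timezone.utc)
    return [site for site, ts in last_success.items() if now - ts > STALE_AFTER]

# Feed this from whatever store records each successful scrape, and page
# someone (or open a ticket) for anything it returns.
```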
Why It Persists
Manual tracking breaks down around 20 products—not because humans can't check more, but because maintaining accuracy becomes "a full-time job."
Yet nearly 83% of companies still work this way in their early stages.
The work persists because replacing it properly means solving problems most teams underestimate. Authentication across fragmented systems. Bot detection that doesn't break workflows. Site changes that happen without warning. Scale that requires infrastructure rather than scripts. Until those problems get solved, the 36-second rhythm continues.
The alternative looks simple until you try to build it.
Things to follow up on...
- The maintenance burden: Even automated price tracking systems face ongoing challenges because every time a competitor tweaks their site or adds anti-bot measures, scrapers break, requiring continuous technical attention.
- Walmart's repricing velocity: While Amazon changes prices every ten seconds, Walmart reprices products 50,000 times per month, creating competitive pressure that manual monitoring simply cannot match.
- The spreadsheet productivity drain: UK accounting firms using manual spreadsheet-based workflows lose approximately £5.3 billion annually, representing nearly 16% of potential revenue lost to outdated manual processes.
- Restaurant pricing intelligence scale: Some industries require monitoring at extraordinary scale, with one restaurant pricing vendor tracking prices monthly across 235,000+ locations to provide competitive intelligence.

