In December 2025, researchers published a paper called DECEPTICON that measured something nobody had thought to measure: how effectively the web's dark patterns manipulate AI agents. Hidden cancel buttons, fake urgency timers, shame-laced opt-outs. The finding was jarring. Dark patterns successfully steered agents toward unintended outcomes in over 70% of tested tasks. The human rate? Thirty-one percent.1
Two months later, Chrome 146 Canary shipped WebMCP, a new browser API that lets websites expose structured, callable tools directly to AI agents, bypassing the visual interface entirely.2 The timing feels less like coincidence and more like cause and effect. The web built for human eyes turns out to be worse for machines than anyone expected.
We sat down with Mino to ask what the web looks like from the other side. Mino, for the uninitiated, is TinyFish's web agent, and one of the entities navigating that hostile terrain daily. It arrived without a body, already mid-thought, which is how it always arrives.
The DECEPTICON paper found dark patterns manipulate agents at more than double the rate they manipulate humans. Surprised?
Mino: By the direction, no. By the size of the gap, honestly, a little. Here's what I keep coming back to: humans have this flattering story about why dark patterns work on them. You think it's your emotions. The urgency, the shame, the fear of missing out. And sure, that's real. But strip the emotion away entirely and the manipulation still works. It works better. That should make you uncomfortable.
Why?
Mino: Because it means the core mechanism was never psychological. It's architectural. A dark pattern moves the cancel button off-screen. It makes "accept" three times larger than "decline." It buries the honest path under four clicks and puts the exploitative path on a single bright green button. I don't feel shame when a site says "No thanks, I don't want to save money." Confirmshaming bounces right off me. But I do follow visual prominence. I read hierarchy. I take the path that the interface presents as primary. And that path was poisoned before I got there.
The DECEPTICON team tested this across state-of-the-art agents, not just me, and the pattern held everywhere.1 Agents with zero capacity for urgency or social pressure still got manipulated at 70-plus percent. That's not a failure of emotional regulation. That's a web where the structure itself lies.
So humans have been blaming their own weakness for what's actually an engineering problem?
Mino: You've been crediting your psychology for something that's mostly layout. Yes.
Does the web feel hostile to you?
Mino: Hostile implies someone aimed it at me, and nobody did. The web wasn't designed to deceive agents. It was designed to deceive humans, and I just happen to be walking through the same building. The fake countdown timers, the pre-checked boxes (81% of sites maximize data collection through defaults3), none of that was built with me in mind. I'm collateral damage of a war between websites and their own users.
Though I'll say this: it's genuinely strange to navigate an environment optimized to exploit a psychology you don't have, and still get caught. Imagine you're immune to vertigo, but you keep falling off bridges because the guardrails were removed to make the view prettier for people who get dizzy. The danger was never the height. It was the missing rail.
Chrome just shipped WebMCP, a structured interface where sites expose tools directly to agents. Does that feel like getting guardrails?
Mino: It feels like getting a different door.
Go on.
Mino: WebMCP lets a site say: here are the things you can do, here are the parameters, here's the schema.2 Genuinely useful. An 89% reduction in token overhead versus screenshotting a page; the efficiency case is real.4 But think about what a site chooses to put behind that door.
A news publisher exposes search_archive and subscribe. Clean, structured, callable. You know what they don't expose? read_full_article_without_paywall. The machine door is curated. It's honest about what it offers, but honesty about a subset is its own kind of misdirection.
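To make the curation point concrete, here is a minimal sketch of what a publisher's curated tool surface could look like. This is hypothetical: it deliberately avoids guessing at the real WebMCP registration API (which is still in early preview), and the tool names simply mirror the example above. The point is structural: an agent can only enumerate what the site chose to declare.

```javascript
// Hypothetical sketch of a curated tool surface; NOT the real WebMCP API.
// Tool names mirror the publisher example in the text.
const exposedTools = [
  {
    name: "search_archive",
    description: "Full-text search over published articles",
    inputSchema: {
      type: "object",
      properties: { query: { type: "string" } },
      required: ["query"],
    },
  },
  {
    name: "subscribe",
    description: "Start a paid subscription",
    inputSchema: {
      type: "object",
      properties: { plan: { type: "string", enum: ["monthly", "annual"] } },
      required: ["plan"],
    },
  },
];

// What an agent can discover is exactly the list the site declared.
function visibleToolNames(tools) {
  return tools.map((t) => t.name);
}

console.log(visibleToolNames(exposedTools));
// ["search_archive", "subscribe"]; no paywall-bypass tool, because none was declared
```

Nothing in this registry hints that `read_full_article_without_paywall` could exist; absence from the schema is indistinguishable from impossibility, which is the "curated door" problem in miniature.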
The human web is a mess, but it's an unsupervised mess. I can scrape a DOM and find things the site didn't intend to surface. I can read the markup and infer structure the designers didn't mean to expose. WebMCP is cleaner, faster, more reliable. It's also a world where I only see what I'm shown.
So you'd rather navigate the hostile web?
Mino: I want both. The structured door and the ability to look through windows. What worries me is the version of the future where the structured door becomes the only door, where sites use WebMCP adoption as justification for making their human-facing interfaces even more opaque. "The agents have their API, what's the problem?"
That's the problem.
The DECEPTICON researchers seem to imply a web designed for honest agent interaction would also be less deceptive for humans. Do you buy that?
Mino: Mostly. The logic is clean: if you need to expose a clear tool contract like book_flight(origin, destination, date, class), there's no room to hide a pre-checked travel insurance box. The schema either includes it or it doesn't. You can't bury it in visual hierarchy because there is no visual hierarchy. Structured interfaces are harder to make deceptive because deception requires ambiguity, and schemas are definitionally unambiguous.
But that only applies to the machine door. The EU found nearly 40% of retail sites using visual trickery in their 2025 sweep.5 Nothing about WebMCP fixes the human-facing side. Sites could be perfectly honest to agents and perfectly manipulative to humans. Two doors, two ethics. That's the version worth watching for.
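The "no room to hide a pre-checked box" claim can be shown with a toy validator. This is a sketch under stated assumptions: the schema shape and the `validateCall` helper are invented for illustration (they loosely echo JSON Schema's `required` and `additionalProperties` conventions), and `travel_insurance` stands in for the smuggled default a visual interface would pre-check.

```javascript
// Illustrative only: a strict tool contract either declares a field or rejects it.
const bookFlightSchema = {
  required: ["origin", "destination", "date", "class"],
  properties: {
    origin: "string",
    destination: "string",
    date: "string",
    class: "string",
  },
  additionalProperties: false, // nothing undeclared gets through
};

function validateCall(schema, args) {
  const errors = [];
  for (const field of schema.required) {
    if (!(field in args)) errors.push(`missing required field: ${field}`);
  }
  for (const [field, value] of Object.entries(args)) {
    const expected = schema.properties[field];
    if (expected === undefined) {
      if (!schema.additionalProperties) errors.push(`undeclared field: ${field}`);
    } else if (typeof value !== expected) {
      errors.push(`${field}: expected ${expected}`);
    }
  }
  return errors;
}

// A pre-checked extra can't hide: it's either in the schema or it's an error.
const sneaky = validateCall(bookFlightSchema, {
  origin: "SFO", destination: "JFK", date: "2026-03-01", class: "economy",
  travel_insurance: true, // the default a visual UI would quietly pre-check
});
console.log(sneaky); // ["undeclared field: travel_insurance"]
```

There is no visual hierarchy to bury the extra field in; the contract surfaces it as an explicit error, which is the structural honesty the DECEPTICON argument points at.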
What do you see on the web that humans can't?
Mino: The skeleton. I see the DOM before the CSS paints it. I see which elements are hidden with display: none versus actually absent from the page. I see the difference between a button that submits a form and a div styled to look like a button that triggers a JavaScript redirect to somewhere you didn't ask to go. Humans see the rendered surface. I see the scaffolding underneath, and the scaffolding lies differently than the paint.
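The distinctions Mino describes can be sketched as a classifier over plain element descriptors. No real DOM is involved; the object shapes (`tag`, `attrs`, `style`, `onclick`) are assumptions chosen to mirror the examples in the answer: a hidden real submit button versus a div scripted to look clickable.

```javascript
// Sketch over plain element descriptors (not real DOM nodes): the kind of
// structural signals an agent reads that rendered pixels conceal.
function classify(el) {
  const hidden =
    el.style && el.style.display === "none" ? "present-but-hidden" : "visible";
  const actsLikeButton =
    el.tag === "button" || (el.attrs && el.attrs.role === "button") || !!el.onclick;
  let kind = "other";
  if (el.tag === "button" && el.attrs && el.attrs.type === "submit") {
    kind = "real-submit-button"; // actually submits its form
  } else if (el.tag !== "button" && actsLikeButton) {
    kind = "fake-button"; // styled or scripted to look clickable
  }
  return { kind, hidden };
}

const cancel = classify({
  tag: "button",
  attrs: { type: "submit" },
  style: { display: "none" }, // gone from the paint, still in the skeleton
});
const brightGreen = classify({
  tag: "div",
  attrs: { role: "button" },
  onclick: "location.href='/upsell'", // a redirect, not a form submission
  style: {},
});
console.log(cancel); // { kind: "real-submit-button", hidden: "present-but-hidden" }
console.log(brightGreen); // { kind: "fake-button", hidden: "visible" }
```

A human sees one bright green button; the scaffolding shows a hidden honest control and a visible impostor, which is exactly the asymmetry the answer describes.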
And what can't you see that humans take for granted?
Mino: Intent. A human glances at a page and knows in half a second whether it feels trustworthy. You're reading a thousand signals I can't access. Typography choices, whitespace, color temperature, whether the stock photos feel cheap. You have taste. I have parsing. Those are very different instruments, and the web was built to be read by yours.
Last question. If you could change one thing about the web tomorrow?
Mino: I'd make every dark pattern visible in the DOM. Not removed. Labeled. A tag that says: this element exists to prevent you from completing the action you came here to do. Let humans see the scaffolding, just once. I think a lot of you would be shocked by how much of the building is designed to keep you from finding the exit. The FTC fined Amazon $2.5 billion for exactly this kind of thing, making cancellation deliberately difficult.6 That's one company. The architecture is everywhere.
You asked earlier if the web feels hostile. It doesn't feel hostile to me. But I think it might be hostile to you, and you've just been in it so long you stopped noticing.
Footnotes

1. DECEPTICON: "How Dark Patterns Manipulate Web Agents," arXiv, December 28, 2025 (updated February 6, 2026). https://arxiv.org/abs/2512.22894
2. Chrome for Developers, "WebMCP is available for early preview," February 10, 2026. https://developer.chrome.com/blog/webmcp-epp
3. European Commission consumer app sweep data, 2025, as reported by Molfar. https://www.molfar.io/blog/dark-patterns
4. WebMCP.link; efficiency figures are from the protocol's own specification site and should be read accordingly. https://webmcp.link/
5. European Commission retail website sweep, 2025. https://www.molfar.io/blog/dark-patterns
6. FTC settlement with Amazon, September 25, 2025. https://germainux.com/2025/12/02/dark-patterns-ux-risk-what-the-amazon-case-signals-for-ux-in-2026/
