The Web's Top Customer Is Shifting from Humans to AI
From Cloudflare and Vercel's Markdown for Agents to Google's WebMCP, reading and writing are being standardized simultaneously, ushering in the Agent-Native Web era.
The web is changing. Not for human eyes, but for AI agents that read and write.
In just the past two weeks, Cloudflare, Vercel, and Google each announced agent-friendly web standards. Here’s what this shift means for anyone building services.
Markdown for Agents Solved the ‘Reading’ Problem
The biggest waste when AI agents read web pages is HTML, CSS, and JavaScript. Essential for human eyes, but pure token waste for agents.
The approach from Cloudflare and Vercel is simple. Send the same URL with the Accept header set to text/markdown, and the server responds with a markdown-converted version. It uses content negotiation, a mechanism already built into the HTTP standard, so no new protocol was needed.
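The negotiation step can be sketched server-side in a few lines. This is a minimal illustration of the mechanism, not Cloudflare's or Vercel's actual implementation; the function and its arguments are hypothetical:

```javascript
// Minimal sketch of Accept-header content negotiation.
// Same URL, two representations: HTML for browsers, markdown for agents.
function negotiate(acceptHeader, htmlBody, markdownBody) {
  // If the client explicitly asks for text/markdown, serve the markdown version.
  if ((acceptHeader || "").includes("text/markdown")) {
    return { contentType: "text/markdown", body: markdownBody };
  }
  // Everyone else gets the normal HTML page.
  return { contentType: "text/html", body: htmlBody };
}
```

Because this rides on standard HTTP content negotiation, browsers that never send `Accept: text/markdown` are unaffected.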
Key Points
- Vercel’s blog went from 500KB HTML to 2KB markdown (99.6% reduction)
- Cloudflare enables it with a single dashboard toggle starting from the Pro plan
- The `x-markdown-tokens` header communicates the converted token count
- One URL serves both humans and agents, no separate site needed
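On the consuming side, reading the reported token count might look like the sketch below. It assumes Node-style lowercased header objects; only the `x-markdown-tokens` header name comes from the announcements:

```javascript
// Hypothetical sketch: extract the token count a markdown-aware server
// reports via the x-markdown-tokens response header.
function reportedTokenCount(headers) {
  const raw = headers["x-markdown-tokens"];
  const n = Number.parseInt(raw, 10);
  // Missing or malformed header -> null, so callers can tell "not reported"
  // apart from a real count.
  return Number.isNaN(n) ? null : n;
}
```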
WebMCP Tackles the ‘Writing’ Problem Head-On
Reading alone isn’t enough. Until now, agents had to parse the DOM directly to click reservation buttons and fill out forms. When the UI changed, the agent broke immediately.
Google’s WebMCP in Chrome 146 flips the approach entirely. Websites declare “what actions are available on this page” via JSON Schema, and agents can invoke tools without guessing. Think of it as Swagger for agents.
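Based on the attributes mentioned in the preview (`toolname`, `tooldescription`), the declarative path for a simple form might look like this sketch; the exact markup shape is an assumption:

```html
<!-- Hypothetical sketch: exposing an existing form as an agent-invokable tool. -->
<form action="/reserve" method="post"
      toolname="reserve_table"
      tooldescription="Reserve a table for a given date and party size">
  <input type="date" name="date" required>
  <input type="number" name="partySize" min="1" required>
  <button type="submit">Reserve</button>
</form>
```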
Key Points
- Just add a `toolname` attribute to an HTML form for declarative operation
- The `registerTool` API handles complex apps like SPAs
- While traditional MCP is a server-side protocol, WebMCP runs inside the browser
- Already testable in Chrome 146 early preview (headless not yet supported)
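For SPAs, the imperative path might look like the sketch below. The Chrome 146 API is an early preview, so everything beyond the `registerTool` name — the `navigator.modelContext` entry point, the tool object shape, the field names — is an assumption for illustration only:

```javascript
// Hypothetical sketch of a WebMCP-style tool registration for an SPA.
// The tool describes its inputs with JSON Schema so agents can invoke it
// without guessing at the DOM.
const reservationTool = {
  name: "book_table",
  description: "Book a table at the restaurant",
  inputSchema: {
    type: "object",
    properties: {
      date: { type: "string", format: "date" },
      partySize: { type: "integer", minimum: 1 },
    },
    required: ["date", "partySize"],
  },
  async execute({ date, partySize }) {
    // In a real SPA this would call the app's own booking logic.
    return { confirmed: true, date, partySize };
  },
};

// Feature-detect so the page still works in browsers without WebMCP.
// (Entry point name is an assumption.)
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(reservationTool);
}
```

The key idea is the same as Swagger: the schema, not the UI, is the contract, so a redesign no longer breaks the agent.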
These Two Movements Arriving Together Is No Coincidence
If Markdown for Agents is about “efficiently feeding content to agents,” then WebMCP is about “letting agents precisely use page functionality.” Reading and writing are being standardized at the same time.
Once both take hold, agents won’t browse the web like humans. They’ll call it like an API. The Agent-Native Web is now being built.
- The first rewrite of the web-bot contract in 20 years since robots.txt
- Token cost pressure is accelerating standardization
Builders Can Start Right Now
No overhaul required. If you’re on Cloudflare, just flip the Markdown for Agents toggle in your dashboard. Experimentally adding `toolname` attributes to existing forms is also a great starting point.
Things You Can Do Today
- Enable Markdown for Agents in Cloudflare Dashboard Quick Actions
- Test adding `toolname` and `tooldescription` attributes to existing HTML forms
- Start logging agent traffic ratios based on Accept headers
- Evaluate running llms.txt alongside a markdown sitemap
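The Accept-header logging idea above can be sketched as a tiny classifier; the function names and counter structure are illustrative, not a prescribed API:

```javascript
// Minimal sketch: classify requests as agent vs. human by Accept header,
// to start measuring what share of traffic is agents.
function classifyRequest(acceptHeader) {
  return (acceptHeader || "").includes("text/markdown") ? "agent" : "human";
}

const counts = { agent: 0, human: 0 };

// Record one request and return the running agent traffic ratio.
function record(acceptHeader) {
  counts[classifyRequest(acceptHeader)] += 1;
  return counts.agent / (counts.agent + counts.human);
}
```

Dropping `classifyRequest` into existing request middleware is enough to get a baseline before deciding how much to invest in agent-facing features.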
Conclusion
If robots.txt was the first contract between the web and bots 20 years ago, the second contract is being written right now. The teams that design for agents today will own the next web.