The Fully Non-Human Web: Built by Bots, Visited by Bots

Summary
– Google’s patent US12536233B1 describes a system that scores landing pages and, if they underperform, generates AI replacements personalized to the searcher without advertiser approval.
– The patent debate is misdirected; the key question is what happens when AI-generated pages are combined with AI agents that browse and transact for humans.
– In a complete non-human flow, a user states intent, and AI handles discovery, page generation, product evaluation, and transaction, with the human only approving the purchase.
– Google is positioned across five of six layers of the non-human web, including page generation, content-as-API, agent infrastructure, agent browsers, and commerce.
– The web is splitting into a transactional web, which goes non-human first, and an experiential web, which remains human-focused for brand storytelling and emotional connection.
In January 2026, Google received a patent that could fundamentally alter how the internet operates. Patent US12536233B1, developed by six engineers, outlines a system that evaluates landing pages based on conversion rate, bounce rate, and design quality. When a page falls short of a preset threshold, the system generates an AI-powered replacement personalized to the individual searcher. The advertiser never sees this new page, never approves it, and may not even know it exists.
Much of the discussion around this patent has focused on its scope. Is it limited to shopping ads, or does it hint at something far more expansive? That line of inquiry misses the point entirely.
The real question is this: What happens when we combine AI-generated web pages with AI agents that browse, shop, and complete transactions on behalf of humans?
For the first time, we have the technological foundation for a web where no human creates the content and no human visits it. Both sides of the equation can be entirely non-human. That changes everything.
The Supply Side: AI-Generated Pages
Historically, the supply side of the web has been inherently human. Someone designs a page, writes the copy, and publishes it. Three key developments are reshaping that reality.
Google’s patent US12536233B1 is the most direct example. It scores a landing page on conversion rate, bounce rate, and design quality, then replaces underperforming pages with AI-generated versions. These replacement pages draw on the searcher’s complete search history, previous queries, click behavior, location, and device data. Google builds personalized landing pages that no advertiser can replicate, because no advertiser has access to cross-query behavioral data at that scale. Barry Schwartz covered the patent on Search Engine Land, describing a system where Google could automatically create custom landing pages, replacing organic results. Glenn Gabe called Google’s AI landing page patent potentially more controversial than AI Overviews. Roger Montti at Search Engine Journal argued the patent’s scope is limited to shopping and ads. Both camps agree on one point: the technology to score and replace landing pages with AI exists and works.
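The scoring-and-replace decision the patent describes can be sketched as a simple threshold check. This is a hypothetical illustration, assuming normalized metrics; the metric weights, threshold, and function names are invented and do not come from the patent text.

```python
# Hypothetical sketch of the score-then-replace logic the patent describes.
# Weights and threshold are illustrative assumptions, not patent values.

def score_landing_page(conversion_rate, bounce_rate, design_quality):
    """Combine the three signals the patent names into one quality score.

    All inputs are assumed normalized to 0..1.
    """
    return (0.5 * conversion_rate
            + 0.3 * (1.0 - bounce_rate)   # lower bounce rate is better
            + 0.2 * design_quality)

def serve_page(page, searcher_profile, threshold=0.5):
    """Serve the advertiser's page, or an AI replacement if it underperforms."""
    score = score_landing_page(page["conversion_rate"],
                               page["bounce_rate"],
                               page["design_quality"])
    if score >= threshold:
        return {"source": "advertiser", "page": page["url"]}
    # Below threshold: generate a replacement personalized to this searcher,
    # without the advertiser's involvement.
    return {"source": "generated",
            "personalized_for": searcher_profile["user_id"]}

weak_page = {"url": "https://example.com/shoes",
             "conversion_rate": 0.01, "bounce_rate": 0.9,
             "design_quality": 0.2}
result = serve_page(weak_page, {"user_id": "u123"})
```

The key structural point the sketch captures is that the advertiser's page is just one input to a decision the platform makes unilaterally.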
NLWeb, Microsoft’s open project, takes a different approach. It transforms any website into a natural language interface using existing Schema.org markup and RSS feeds. An AI agent querying an NLWeb-enabled site doesn’t load a page at all. The agent asks a structured question, and NLWeb returns a structured answer. The rendered page becomes optional.
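The exchange looks roughly like this: a structured question goes in, Schema.org-shaped items come out, and nothing is rendered. This is a toy illustration of the pattern only; the request and response shapes here are invented, and the real NLWeb project defines its own API.

```python
# Toy illustration of the NLWeb pattern: structured question in,
# structured Schema.org-style answer out, with no page rendered.
# Shapes are invented for illustration, not the actual NLWeb API.

CATALOG = [  # Schema.org-style Product records, as a site might expose them
    {"@type": "Product", "name": "Trail Runner X", "offers": {"price": 89.0}},
    {"@type": "Product", "name": "Road Racer 2", "offers": {"price": 129.0}},
    {"@type": "Product", "name": "Canvas Sneaker", "offers": {"price": 49.0}},
]

def ask(question: dict) -> dict:
    """Answer a structured query like {'contains': ..., 'max_price': ...}."""
    results = [
        item for item in CATALOG
        if question.get("contains", "").lower() in item["name"].lower()
        and item["offers"]["price"] <= question.get("max_price", float("inf"))
    ]
    return {"@type": "ItemList", "itemListElement": results}

answer = ask({"contains": "runner", "max_price": 100})
```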
WebMCP pushes this further. With WebMCP, a website registers tools with defined input and output schemas that AI agents discover and call as functions. A product search becomes a function call. A checkout becomes an API request. WebMCP eliminates the concept of a “page” entirely, dissolving the web page as a unit of content into a set of callable capabilities.
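A minimal sketch of that idea: the site registers a named capability with input and output schemas, and the agent invokes it as a function. The registry, schema format, and function names here are invented for illustration and are not the actual WebMCP API.

```python
# Minimal sketch of the WebMCP idea: a site registers tools with schemas,
# and an agent calls them as functions instead of loading a page.
# Registry and schema format are invented, not the real WebMCP API.

TOOLS = {}

def register_tool(name, input_schema, output_schema):
    """Decorator that registers a site capability as a callable tool."""
    def wrap(fn):
        TOOLS[name] = {"fn": fn, "input": input_schema, "output": output_schema}
        return fn
    return wrap

@register_tool("search_products",
               input_schema={"query": "string"},
               output_schema={"results": "list[string]"})
def search_products(query):
    inventory = ["fleece jacket", "rain jacket", "running shoes"]
    return {"results": [p for p in inventory if query in p]}

def agent_call(tool_name, **kwargs):
    """An agent discovers a tool by name and invokes it as a function."""
    return TOOLS[tool_name]["fn"](**kwargs)

response = agent_call("search_products", query="jacket")
```

Note that "product search" here has no page at all: it exists only as an entry in a tool registry.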
Each mechanism works differently, but the trajectory is the same: the page is becoming something generated, queried, or bypassed altogether. The human-designed, human-published web page is no longer the only pathway for content to reach an audience.
The Demand Side: AI Agents as Visitors
The demand side has shifted even faster. In 2024, bots surpassed human traffic for the first time in a decade, accounting for 51% of all web activity. Cloudflare’s data shows AI “user action” crawling, where agents actively do things rather than just index content, grew 15x during 2025. Gartner predicts that 40% of enterprise applications will feature task-specific AI agents by the end of 2026, up from less than 5% in 2025. The scale is difficult to overstate.
Agentic browsers represent the most visible shift. Chrome’s auto browse turned 3 billion Chrome installations into potential AI agent launchpads. Google’s Gemini scrolls, clicks, fills forms, and completes multi-step tasks autonomously inside Chrome. Perplexity’s Comet browser conducts deep research across multiple sites simultaneously. Microsoft’s Edge Copilot Mode handles multi-step workflows from within the browser sidebar. The full agentic browser landscape now includes over a dozen consumer and developer tools, all browsing on behalf of humans.
Commerce agents have moved past browsing into buying. OpenAI launched Instant Checkout to let users purchase products directly inside ChatGPT, powered by Stripe’s Agentic Commerce Protocol (ACP). OpenAI killed the feature in March 2026 after near-zero purchase conversions and only a dozen merchant integrations out of over a million promised. The failure was one of execution, not concept. Alibaba’s Qwen app processed 120 million orders in six days in February 2026 because Alibaba owns the AI model, the marketplace, the payment rails (Alipay), and the logistics. OpenAI tried to replicate agentic commerce without owning the stack. Google and Shopify’s Universal Commerce Protocol (UCP) connects over 20 companies, including Walmart, Target, and Mastercard, in a framework designed for AI agents to handle commerce from product discovery through checkout. Shopify auto-opted over a million merchants into agentic shopping experiences with ChatGPT, Copilot, and Perplexity. The transaction happens in an AI conversation. No checkout page loads.
Agent-to-agent communication removes the human from both ends. Google’s Agent-to-Agent (A2A) protocol lets AI agents from different vendors discover each other’s capabilities and collaborate on tasks without human mediation. A travel planning agent negotiates directly with a booking agent. A procurement agent evaluates supplier agents across vendors. Over 150 organizations support A2A, including Salesforce, SAP, and PayPal, making agent-to-agent commerce and coordination a production reality.
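Discovery in this model reduces to capability matching: each agent advertises what it can do, and a requesting agent picks a partner. The sketch below is simplified for illustration; the field names do not follow the actual A2A Agent Card schema.

```python
# Simplified sketch of A2A-style discovery: agents publish "cards"
# describing capabilities, and a requester matches on the one it needs.
# Field names are illustrative, not the real A2A Agent Card schema.

AGENT_DIRECTORY = [
    {"name": "flight-booker", "capabilities": ["book_flight", "cancel_flight"]},
    {"name": "hotel-booker", "capabilities": ["book_hotel"]},
    {"name": "procurement", "capabilities": ["request_quote", "compare_suppliers"]},
]

def discover(capability: str):
    """Return the cards of all agents advertising a given capability."""
    return [card for card in AGENT_DIRECTORY
            if capability in card["capabilities"]]

# A travel-planning agent finds a partner that can book flights,
# then delegates the task with no human in the loop.
partners = discover("book_flight")
```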
When Both Sides Go Non-Human
Until now, at least one side of the web was always human. A person built the page, or a person visited it. Usually both.
Google’s patent closes the circuit.
Here is what a complete non-human flow might look like. A user tells their AI assistant they need running shoes. The assistant queries product data through NLWeb or WebMCP, with no page load required. The assistant evaluates options by checking inventory across retailers via A2A. If the user needs to review a comparison, Google generates a landing page personalized to that specific user’s search history and preferences. The assistant completes checkout through ACP or UCP using Shared Payment Tokens. The user receives a confirmation.
The human’s role in that entire flow boils down to stating intent and approving the purchase. Discovery, page generation, product evaluation, and transaction completion are all handled by AI systems. The human touches only the two endpoints of the chain.
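The flow above can be written down as a stubbed pipeline. Every function here is a stand-in for a real system (NLWeb/WebMCP query, A2A inventory checks, a generated comparison page, ACP/UCP checkout); none of these are real APIs, and the data is invented.

```python
# The non-human flow, as a stubbed pipeline. Every function is a stand-in
# for a real system; none of these are actual APIs.

def query_products(intent):            # NLWeb/WebMCP: no page load
    return [{"name": "Trail Runner X", "price": 89.0, "retailer": "shopA"},
            {"name": "Road Racer 2", "price": 129.0, "retailer": "shopB"}]

def check_inventory(products):         # A2A: ask retailer agents for stock
    return [p for p in products if p["retailer"] == "shopA"]  # shopB is out

def generate_comparison(products, user):   # personalized generated page
    return {"page_for": user, "options": products}

def checkout(product, approved):       # ACP/UCP with a shared payment token
    return {"status": "confirmed"} if approved else {"status": "cancelled"}

# The human touches only the endpoints: stating intent, approving the buy.
user = "u123"
options = check_inventory(query_products("running shoes"))
page = generate_comparison(options, user)   # rendered only if the user asks
order = checkout(page["options"][0], approved=True)
```

Read top to bottom, the only two values a human supplies are the intent string and the `approved=True` flag; everything between them is machine-to-machine.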
Every piece of technology in that chain exists in production today. Chrome auto browse is live for 3 billion Chrome users. A2A has 150-plus organizational supporters. ACP underpins Stripe’s agentic commerce infrastructure, though ChatGPT’s Instant Checkout failed on execution, not protocol. UCP connects Shopify, Google, Walmart, and Target. Patent US12536233B1 is granted. No single company has assembled the full loop yet, but every component is operational.
Who’s Building the Non-Human Web
Here is where the picture gets interesting. Map out who is building what, and a pattern emerges:
- Page generation: AI landing pages by Google (patent US12536233B1)
- Content-as-API: NLWeb and WebMCP
- Agent infrastructure: the A2A protocol
- Agent browsers: Chrome auto browse, Perplexity’s Comet, Edge Copilot Mode
- Commerce: ACP (Stripe) and UCP (Google and Shopify)
- Governance: the Agentic AI Foundation (AAIF)

Google appears in five of six layers: page generation (patent US12536233B1), content-as-API (WebMCP), agent infrastructure (A2A), agent browsers (Chrome auto browse), and commerce (UCP). Google is positioning itself to mediate the non-human web the same way it mediates the human one through Search.

The Agentic AI Foundation (AAIF), formed under the Linux Foundation with Anthropic, OpenAI, Google, and Microsoft as platinum members, provides the governance layer. The AAIF functions as the W3C for the agentic web: the vendor-neutral body that decides which protocols become standards for agent interoperability.

What Website Owners Need to Know

This is not an optimization checklist. It represents three structural shifts in what your website is for.

Your Data Layer Is Your Website

Google’s patent generates landing pages from product feed data, making product feeds the most important asset an ecommerce business maintains. NLWeb queries Schema.org markup instead of rendering pages, making structured markup the front door to your content. WebMCP exposes site capabilities as function calls, making tool definitions the user interface agents interact with.

Structured data, product feeds, JSON-LD, and API surfaces have traditionally been treated as backend infrastructure. In the non-human web, these data layers become the primary way a business reaches customers. Product feed accuracy, including specs, pricing, stock levels, and images, matters more than homepage design when AI systems generate the page from that feed.

Trust Is the Moat

AI can generate a page. It cannot generate a reason to seek you out by name.

Direct traffic, email subscribers, community members, and brand reputation persist when the page itself becomes replaceable.
An AI agent can build a product page, but no AI agent can build the trust that makes a consumer, or their agent, request a specific brand by name.

The brands that matter in the non-human web are the ones people tell their agents to find. “Get me a fleece jacket” is a commodity query. “Get me a fleece jacket from Patagonia” is a brand moat.

The Measurement Problem

How do you measure a page you did not build? How do you A/B test against something Google generates dynamically? How do you attribute a conversion that happened inside ChatGPT, initiated by an agent acting on behalf of a user who never saw your website?

Traditional web analytics, such as page views, sessions, bounce rate, and time on site, assume two things: a human visitor and a page you control. On the non-human web, neither assumption holds. A Google-generated landing page is not yours. A ChatGPT checkout session does not register in your analytics.

No one has a clean answer here, and measurement is the genuinely unsolved problem of the non-human web. New metrics will need to track agent discoverability, agent conversion rate, and data feed quality. But as of March 2026, the measurement infrastructure has not caught up to the technology it needs to measure.

Four Predictions for 2026-2027

Watch for four developments over the next 12 to 18 months.

First, Google ships patent US12536233B1, or something like it. The technology for scoring and replacing landing pages exists. The business incentive exists. Google has a history of introducing features in ads first, then expanding. Google Shopping went from free to paid to essential. AI-generated landing pages will likely appear in shopping ads first, then broaden to other verticals. Landing page quality scores in Google Ads serve as the early warning system for which pages Google considers replaceable.

Second, agent traffic becomes measurable. Analytics platforms will need to distinguish human sessions from agent sessions.
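One crude way an analytics platform might start separating agent sessions from human ones is matching known AI agent tokens in the User-Agent header. The token list below is illustrative and incomplete, and real classification would need more signals (IP ranges, behavior, declared agent headers); this is a sketch of the problem, not a solution.

```python
# Crude sketch of agent-vs-human session classification by User-Agent
# token. Token list is illustrative and incomplete; real systems would
# need many more signals.

AGENT_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

def classify_session(user_agent: str) -> str:
    return "agent" if any(tok in user_agent for tok in AGENT_TOKENS) else "human"

sessions = [
    "Mozilla/5.0 (Windows NT 10.0) Chrome/131.0",
    "Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)",
    "PerplexityBot/1.0",
]
counts = {"agent": 0, "human": 0}
for ua in sessions:
    counts[classify_session(ua)] += 1
```

The hard cases are exactly the ones this sketch misses: an agent driving a full Chrome instance on a user's machine presents an ordinary browser user-agent.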
BrightEdge reports AI agents account for roughly 33% of organic search activity as of early 2026. WP Engine’s traffic data shows one AI bot visit for every 31 human visits by Q4 2025, up from one per 200 at the start of that year. Agent traffic ratios will accelerate further as Chrome auto browse rolls out globally beyond the US. New metrics around agent conversion rate and agent discoverability will emerge from necessity.

Third, the protocol stack consolidates. MCP, A2A, NLWeb, and WebMCP form a coherent stack covering tool access, agent communication, content querying, and browser-level integration. Expect more interoperability between these protocols and fewer competing standards. The Agentic AI Foundation (AAIF) accelerates consolidation. Within 18 months, “does your site support MCP?” will be as standard a question as “is your site mobile-friendly?”

Fourth, brand differentiation gets harder and more important. When AI generates pages and agents do the shopping, the only defensible position is being the brand people, and their agents, seek out by name. Direct relationships, owned audiences, and trust signals matter. Everything else is a commodity.

The Web Splits in Two

When Shopify auto-opted merchants into agentic shopping, the question arose of whether your website just became optional. The answer is more nuanced than optional or essential. It is becoming something different.

The web is not dying. It is splitting.

The transactional web, including product listings, checkout flows, information retrieval, and comparison shopping, is going non-human first. AI generates the landing pages. AI agents visit and transact on those pages. Humans approve decisions at the endpoints. Google’s patent lives in the transactional web, and the economics of conversion optimization push hardest toward automation in this layer.

The experiential web, including brand storytelling, community, content that rewards sustained attention, and design that creates emotional response, stays human.
Not because AI cannot generate brand experiences, but because the value of those experiences comes from the human connection behind them. Nobody tells their agent to “go enjoy a brand experience on my behalf.”

Your website’s new job description is simple: data source for the agents, trust anchor for the humans, and brand home for both. The companies that treat their structured data, product feeds, and API surfaces with the same care they give their homepage design are the ones that show up in both worlds.

The non-human web is not replacing the human web. It is growing alongside it. Your job is to show up in both.





