Google AI Gets Personal: New Crawl Limits Explained

Summary
– Google has made its Personal Intelligence feature, which personalizes AI Mode responses using Gmail and Photos data, free for all personal account users in the United States.
– Google revealed that its Googlebot crawl limits, like the often-cited 15MB threshold, are flexible and can be overridden, with Google Search typically using a smaller 2MB limit in practice.
– A study in Germany found AI Overviews cut the click-through rate for the top organic search position by 59%, indicating a significant loss of publisher traffic similar to trends observed in the US.
– Search referral traffic has declined sharply over two years, with small publishers losing 60%, mid-sized losing 47%, and large publishers losing 22%, while alternative sources like ChatGPT referrals remain minimal.
– The week’s data illustrates that general industry benchmarks are becoming less universally applicable, as outcomes now heavily depend on specific context, site size, and user personalization.
Recent updates from Google are reshaping the digital landscape, directly impacting how AI interacts with personal data, how search engines crawl websites, and how publishers experience traffic shifts. These changes demand a closer look from anyone invested in online visibility and content strategy.
Google has made its personalized AI feature, known as Personal Intelligence, available for free to all users with personal accounts in the United States. Previously a perk for paying subscribers, this tool now integrates with Gmail and Google Photos to tailor AI-generated responses. For instance, if you ask about an upcoming trip, the AI can reference flight confirmations from your inbox or photos from a previous visit. This rollout is currently limited to the US and does not include business-focused Workspace accounts. The significant expansion of this user base means that search results in AI Mode will become highly individualized, making it far more challenging to predict or benchmark what information the AI will surface for any given query.
In a related technical shift, Google has clarified that the crawl limits for its Googlebot are not the rigid rules many assumed. While a 15-megabyte limit has been widely cited for years, the practical threshold for Google Search is often closer to 2 megabytes. Engineers can adjust these limits internally based on the specific content being crawled. For most websites, this lower threshold isn’t an issue, but pages laden with extensive scripts, large data sets, or embedded media could find their content truncated during indexing. This insight refines long-standing SEO guidance and highlights the importance of optimizing page size for core content.
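To make the crawl-limit point concrete, here is a minimal sketch of how a site owner might audit a page's raw HTML size against both thresholds. The helper name and the report structure are illustrative assumptions; the 15 MB figure is the widely cited documented cap, and the ~2 MB figure is the smaller practical threshold described above. Note that only the raw HTML counts toward the limit, since externally referenced CSS, JavaScript, and images are fetched separately.

```python
# Hypothetical page-size audit against Googlebot crawl thresholds.
# Thresholds per this report: ~15 MB documented cap, ~2 MB practical limit.

PRACTICAL_LIMIT_BYTES = 2 * 1024 * 1024    # smaller limit Google Search often uses
DOCUMENTED_LIMIT_BYTES = 15 * 1024 * 1024  # long-cited 15 MB ceiling

def crawl_size_report(html: str) -> dict:
    """Return the raw HTML byte size and whether it fits each threshold.

    Only the HTML payload itself counts; external CSS/JS/images are
    crawled as separate resources.
    """
    size = len(html.encode("utf-8"))
    return {
        "bytes": size,
        "within_practical_limit": size <= PRACTICAL_LIMIT_BYTES,
        "within_documented_limit": size <= DOCUMENTED_LIMIT_BYTES,
    }

# Example: a page bloated by a ~3 MB inline data blob passes the documented
# limit but exceeds the practical one, risking truncated indexing.
page = "<html><body>" + "x" * (3 * 1024 * 1024) + "</body></html>"
report = crawl_size_report(page)
print(report["within_practical_limit"], report["within_documented_limit"])
```

A page like this would be truncated under the practical threshold even though it sits comfortably under the documented 15 MB cap, which is why trimming inline scripts and embedded data matters more than the headline number suggests.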
The impact of Google’s AI Overviews is becoming clearer with new data from Germany. Research indicates that when an AI Overview appears, the click-through rate for the top organic search result plummets by 59%. These AI-generated summaries now appear for roughly one in five search queries in the German market, leading to an estimated loss of hundreds of millions of organic clicks each month. This pattern mirrors earlier findings in the United States, confirming that informational content is bearing the brunt of this traffic diversion. As one industry expert noted, citations within these AI summaries offer little benefit if users don’t click through, pushing publishers to focus on unique value that AI cannot easily replicate, such as breaking news or deep analysis.
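The scale of that loss follows from simple arithmetic. The sketch below shows one way to combine the reported figures into a monthly estimate; the query volume and baseline click-through rate are illustrative assumptions, not numbers from the study, while the 59% CTR drop and the roughly one-in-five presence rate come from the report above.

```python
# Back-of-envelope estimate of monthly clicks lost to AI Overviews.
# Only ctr_drop and aio_presence come from the reported findings;
# monthly_queries and baseline_top_ctr are hypothetical inputs.

monthly_queries = 5_000_000_000   # assumed market-wide query volume
aio_presence = 0.20               # AI Overview shown on ~1 in 5 queries
baseline_top_ctr = 0.25           # assumed CTR for the #1 organic result
ctr_drop = 0.59                   # reported 59% CTR decline when an AIO appears

lost_clicks = monthly_queries * aio_presence * baseline_top_ctr * ctr_drop
print(f"~{lost_clicks:,.0f} clicks lost per month on the top result alone")
```

Even with conservative inputs, the estimate lands in the hundreds of millions of clicks per month for the top position alone, which is consistent with the magnitude the German research describes.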
The consequences of these changes are not felt equally across the publishing world. A breakdown by outlet size reveals a stark disparity: over the past two years, small publishers have seen a devastating 60% drop in search referral traffic. Mid-sized publishers faced a 47% decline, while larger organizations experienced a more manageable 22% reduction. These larger entities are somewhat buffered by stronger direct traffic, email newsletters, and app usage. Although referrals from sources like ChatGPT have grown dramatically, they currently represent less than 1% of total publisher page views—nowhere near enough to compensate for the losses from search. This data underscores a critical threat to the diversity of the information ecosystem, particularly affecting local and smaller news operations.
A common thread runs through this week’s developments: traditional benchmarks are losing their universal relevance. The “standard” crawl limit isn’t so standard, AI Overview click loss is a consistent global trend, and personalized AI means there is no single answer for what a query returns. The takeaway is that data must be interpreted through the lens of your specific industry, the scale of your website, and the habits of your audience. Relying on general assumptions is a strategy that no longer holds.
(Source: Search Engine Journal)
