Core Update, AI Ads & Crawl Policy: SEO Pulse

Summary
– Google launched a Discover-specific core update in February 2026, initially for English users in the U.S., which will change rankings in the Discover feed independently of Search results.
– Alphabet’s Q4 2025 earnings revealed Google’s plan to monetize AI Mode by placing ads in longer conversational queries, treating it as new, additive ad inventory.
– Google’s John Mueller strongly advised against serving Markdown files to LLM bots instead of HTML, arguing it strips necessary page structure and is technically flawed.
– Google’s Search Relations team filed bugs against WooCommerce plugins that generate unnecessary, crawlable URLs, an unusual step to address widespread crawl budget waste.
– LinkedIn’s internal testing found that structured content with clear authorship and dates performs better in AI search citations, aligning with guidance from AI platforms themselves.
Staying ahead in search requires a sharp eye on the distinct channels that now drive visibility and traffic. This week’s developments highlight a clear fragmentation, with Google announcing platform-specific updates and new monetization paths, while technical guidance and third-party data reveal how to adapt content for the age of AI. For professionals, the unified dashboard is a thing of the past; success now depends on monitoring and optimizing for separate surfaces.
Google has launched a core update focused solely on its Discover feed, marking a significant shift in how it manages ranking changes. Historically, adjustments to Discover were bundled into broader core updates affecting traditional web search. This new, targeted update for February 2026 is initially rolling out to English-language users in the U.S., with plans for a wider international release. Google states the aim is to improve overall feed quality, and existing core update guidance still applies. The rollout could take up to two weeks to complete.
For search marketers, this separation creates a new monitoring challenge. A traffic drop in Search Console might now be isolated to Discover, not a traditional search penalty. Diagnosing a decline requires checking Discover traffic independently over the coming weeks. The stakes are high, as Discover represents a massive traffic source; some reports indicate it drives nearly 70% of Google-sourced visits to major news publishers. A dedicated core update for this surface means its ranking fluctuations are now a standalone concern.
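Checking Discover traffic independently is possible through the Search Console Search Analytics API, which accepts a `type` parameter that scopes a query to the Discover surface rather than web Search. A minimal sketch of the request body (the date range and site URL are placeholders, and an authorized API client is assumed to exist separately):

```python
def discover_query(start_date: str, end_date: str) -> dict:
    """Build a Search Analytics request body scoped to the Discover surface."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "type": "discover",      # "web" would query traditional Search instead
        "dimensions": ["date"],  # daily rows make update-window drops visible
    }

body = discover_query("2026-02-01", "2026-02-28")

# With an authorized google-api-python-client service object, the call
# would look roughly like this (not run here):
# response = service.searchanalytics().query(
#     siteUrl="https://example.com/", body=body).execute()
```

Running the same query twice, once with `"type": "discover"` and once with `"type": "web"`, is the quickest way to tell whether a Search Console traffic dip is isolated to the Discover feed.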
Recent earnings from Alphabet provided the first concrete look at how Google plans to monetize its AI Search experience. While search revenue showed strong growth, the crucial details were in the commentary. Executives noted that queries in AI Mode are typically three times longer than traditional searches, creating new advertising inventory for previously hard-to-monetize, conversational queries. The company is testing ad placements directly beneath AI-generated answers.
This approach frames AI Mode as an expansion of the advertising ecosystem, not a replacement. For paid search teams, it signals new territory in managing campaigns for longer, more complex user questions. The underlying metric Google emphasized was increased user engagement time within its ecosystem. The strategic trade-off to watch is whether this contained, seamless experience comes at the cost of reduced referral traffic to external websites.
In a recent online discussion, Google’s John Mueller strongly criticized the practice of serving Markdown files to AI crawlers instead of standard HTML. A developer suggested the tactic to reduce processing tokens for large language models. Mueller dismissed the idea on technical grounds, questioning whether bots could properly interpret Markdown, follow its links, or understand page structure without standard HTML elements like headers and navigation.
His response treats this as a fundamental technical misstep, not a valid optimization. Stripping a page down to raw Markdown risks removing the semantic structure and internal linking that help bots comprehend content and site architecture. This guidance fits a pattern where Mueller draws clear boundaries around attempts to create bot-specific content formats, previously comparing such efforts to the obsolete keywords meta tag.
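The structural loss at issue is easy to demonstrate: standard HTML parsing recovers heading hierarchy and link targets directly from markup, signals a raw Markdown dump would force a crawler to reconstruct heuristically. An illustrative sketch using only Python's standard library (the page content and URLs are invented for the example):

```python
from html.parser import HTMLParser

class StructureExtractor(HTMLParser):
    """Collect the structural signals a crawler can read straight from HTML."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.headings = []
        self._open_heading = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag in ("h1", "h2", "h3"):
            self._open_heading = tag

    def handle_data(self, data):
        if self._open_heading:
            self.headings.append((self._open_heading, data.strip()))
            self._open_heading = None

html_page = """
<html><body>
  <nav><a href="/docs">Docs</a> <a href="/pricing">Pricing</a></nav>
  <h1>Getting started</h1>
  <p>See the <a href="/docs/install">install guide</a>.</p>
</body></html>
"""

extractor = StructureExtractor()
extractor.feed(html_page)
print(extractor.headings)  # heading hierarchy survives in HTML
print(extractor.links)     # nav and in-content link targets survive too
```

A Markdown rendition of the same page would typically drop the `<nav>` element entirely and flatten headings into `#` prefixes, which is the kind of information loss Mueller's objection describes.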
In a notable move, Google’s search relations team disclosed it has filed bug reports directly with the developers of popular WordPress plugins. The issue involves certain plugins, including those for WooCommerce, that generate unnecessary, crawlable URLs through parameters like “add-to-cart” links. These URLs waste Googlebot’s crawl budget on pages with zero search value.
Google’s decision to intervene at the plugin level is unusual; typically, crawl efficiency is the responsibility of individual site owners. This action suggests the problem is so widespread that addressing it site-by-site is ineffective. Ecommerce sites using WooCommerce should audit their plugins and check Search Console crawl stats for problematic URLs containing cart or checkout parameters that shouldn’t be indexed.
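For sites that confirm the problem in their crawl stats, one common mitigation is a robots.txt rule blocking the parameterized cart URLs. The patterns below reflect WooCommerce's default `add-to-cart` parameter and cart/checkout paths; verify them against your own URL structure before deploying, since an overly broad rule can hide real pages from crawling:

```
# Illustrative robots.txt rules for WooCommerce cart-action URLs.
# Confirm these patterns match your site before using them.
User-agent: *
Disallow: /*?add-to-cart=
Disallow: /cart/
Disallow: /checkout/
```

This blocks crawling rather than indexing, which is usually the right lever here: the goal is to stop Googlebot spending crawl budget on zero-value URLs in the first place.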
LinkedIn shared insights from its internal research on what drives visibility in AI-generated search summaries. The company found that structured content with clear authorship, verifiable credentials, and publication dates performed better in earning citations. They also noted a broader industry trend of declining non-brand traffic for some B2B topics in AI search environments and are developing new analytics to track LLM-driven visits.
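The "clear authorship and dates" finding maps naturally onto schema.org Article markup, which is one standard way to expose those signals in machine-readable form. A hedged example (the headline, author name, and URLs are placeholders, not part of LinkedIn's guidance):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/authors/jane-doe"
  },
  "datePublished": "2026-02-10",
  "dateModified": "2026-02-12"
}
</script>
```

Markup alone will not earn citations, but it makes the authorship and freshness signals the research highlights unambiguous to any parser consuming the page.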
What’s compelling is how closely LinkedIn’s findings align with guidance from AI platforms themselves. When the sources being cited and the platforms doing the citing independently arrive at the same conclusions about content quality, it moves the advice beyond speculation into a reliable best practice.
A common theme connects all these stories: the digital landscape is no longer monitored through a single lens. Google is decoupling Discover and Search updates, building unique ad products for AI Mode, and providing distinct technical guidance for bots. Simultaneously, platforms like LinkedIn are creating separate measurement frameworks for AI-driven traffic. The integrated traffic graph of the past is fragmenting across Discover, traditional Search, AI Overviews, and LLM referrals, each with its own ranking signals and update cycles. Adapting to this new reality is the central task for modern SEO.
(Source: Search Engine Journal)





