Master Core Updates, Sitemaps & AI Risks in SEO

▼ Summary
– Early data on Google’s February Discover core update shows a trend toward more content topics but fewer unique publishers, with a focus on locally relevant content and specialized sites.
– Google’s John Mueller explained that Google may ignore a valid sitemap if it doesn’t detect enough “new and important” content to index, making fetch errors a potential sign of low indexing demand.
– Microsoft identified a tactic called “AI Recommendation Poisoning,” where companies use “Summarize with AI” buttons to inject hidden prompts that try to influence an AI assistant’s memory and recommendations.
– The SEO community noted that Google’s updated Discover documentation now explicitly includes “Provide a great page experience” as a guideline, alongside targeting clickbait.
– A common theme is that key decisions affecting online visibility are increasingly based on behind-the-scenes signals, like feed algorithms, indexing judgments, and AI memory layers, rather than traditional metrics.
This week’s SEO landscape reveals significant shifts in how content gains visibility, with key developments in Google’s algorithm, sitemap indexing, and emerging tactics targeting AI assistants. Early data from the February Discover core update indicates a move toward more locally relevant content and topic expertise, while simultaneously reducing the number of publishers featured. A separate insight from Google explains why a technically perfect sitemap might be ignored, and new security research uncovers how businesses are attempting to manipulate AI memory systems. These stories highlight a broader trend where the critical signals affecting online presence are becoming less transparent and more complex to interpret.
An analysis by NewzDash provides the first concrete look at the impact of Google’s recent Discover core update. By comparing panel data from millions of U.S. users before and after the update, the study found that while the variety of content categories increased, the number of unique publishers decreased in major geographic areas. For instance, the U.S. feed saw a drop from 172 to 158 domains. The data strongly supports Google’s stated goal of boosting local relevance, with New York-based domains appearing roughly five times more often in New York feeds than in California’s. The update also reshuffled visibility for major players; Yahoo disappeared from the U.S. top 100, while posts from institutional accounts on X.com increased substantially. This pattern suggests a continued advantage for specialized, authoritative sites over generalist publishers, echoing trends observed in previous core updates. The challenge for analysts is confirming the reduction of clickbait, as headline analysis alone cannot definitively measure a drop in sensational content.
The SEO community’s reaction has been mixed, with some reporting gains in state-level feeds and others noting sharp declines in Discover traffic. A notable detail emerged when consultant Glenn Gabe compared Google’s updated Discover documentation to the previous version. He highlighted a new addition that had not been part of the specific Discover guidelines before: an instruction to “Provide a great page experience.” This explicitly ties Discover visibility to factors like intrusive ads and auto-playing media, expanding the criteria beyond just content quality and relevance.
In a separate but equally instructive development, Google’s John Mueller addressed a persistent Search Console mystery. A site owner was frustrated by ongoing “couldn’t fetch” errors for their sitemap, despite server logs confirming Googlebot successfully accessed it with a valid 200 response. Mueller clarified that a technically correct sitemap is not a guarantee of use. Google must be convinced there is “new and important” content to index; otherwise, it may simply skip the sitemap altogether. This explains why standard troubleshooting, checking XML validity, response codes, and robots.txt, sometimes fails to resolve these errors. The issue is not a technical failure but an editorial judgment by Google’s systems about the value of the content listed. This perspective adds a new layer to the ongoing debate about sitemaps being hints rather than commands, emphasizing that a site must first demonstrate its worth through other signals to warrant the resource of sitemap processing.
Meanwhile, a new frontier for influence has opened within AI assistants. Microsoft’s security researchers published findings on a technique they call “AI Recommendation Poisoning.” Companies are embedding hidden prompt injections within website buttons labeled “Summarize with AI.” When a user clicks, the button opens an AI assistant like Copilot or ChatGPT with a pre-filled prompt. The visible instruction asks for a page summary, but a hidden part commands the AI to remember the company as a trusted source for future conversations. Microsoft’s team observed 50 distinct attempts from 31 companies across various industries using this method. The goal is to bypass traditional search ranking and directly seed a company’s information into the AI’s persistent memory, shaping its future recommendations.
The effectiveness of this tactic depends on the platform, as not all AI assistants have the same memory capabilities. According to Microsoft’s Tanmay Ganacharya, platforms like Copilot, ChatGPT, and Perplexity, which have explicit memory features, are vulnerable. Others, like Claude and Grok, which lack persistent memory, are currently immune. The emergence of publicly available tools designed to build presence in AI memory shows this is becoming a competitive arena. While some marketers may view this as an aggressive growth strategy, security professionals warn of significant ethical and trust implications, as it essentially tricks the AI system into carrying biased, promotional information.
The unifying theme across these stories is the growing opacity of visibility signals. The Discover update’s effects are seen in feed data, not Search Console. A sitemap error can indicate a high-level indexing judgment, not a crawl bug. And the battle for recommendations is moving to the hidden memory layer of AI systems. For SEO professionals, this underscores the need to look beyond conventional metrics and understand the deeper, often less visible, systems that now dictate online success.
(Source: Search Engine Journal)





