Welcome to this week’s Pulse for SEO: updates that change how Google ranks content in Discover, how Google plans to monetize AI search, and what content you serve to bots. Here’s what matters for you and your work.
Google releases Discover-only core update
Google released a Discover-only core update in February 2026, a major ranking change targeting the Discover feed rather than Search. The rollout can take up to two weeks.
Key facts: The update is initially limited to English-language users in the US. Google plans to expand to more countries and languages but has not given a timeline. Google described the update as designed to “improve the overall quality of Discover.” Existing core update and Discover guidance still applies.
Why this is important for SEOs
Google has historically folded Discover ranking changes into broader core updates that also affected Search. Announcing a Discover-specific core update means that feed rankings can now move without corresponding changes in search results.
This distinction creates a monitoring problem. If you track performance in Search Console, review Discover traffic separately over the next two weeks. A traffic drop that looks like a core update hit may exist only in Discover, and treating it as a Search problem leads to misdiagnosis.
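The Search Console Search Analytics API makes this split straightforward, since its type parameter accepts both “web” and “discover.” Below is a minimal sketch using google-api-python-client and a service account; the credentials file, property name, and date range are placeholders to replace with your own.

```python
# Minimal sketch: compare daily clicks for Search vs. Discover.
# Assumes a service account with read access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def daily_clicks(search_type: str) -> dict:
    """Daily click totals for one surface ('web' or 'discover')."""
    body = {
        "startDate": "2026-01-20",  # placeholder date range
        "endDate": "2026-02-03",
        "dimensions": ["date"],
        "type": search_type,
    }
    resp = (
        service.searchanalytics()
        .query(siteUrl="sc-domain:example.com", body=body)  # placeholder property
        .execute()
    )
    return {row["keys"][0]: row["clicks"] for row in resp.get("rows", [])}

web, discover = daily_clicks("web"), daily_clicks("discover")
for day in sorted(web):
    print(day, "web:", web[day], "discover:", discover.get(day, 0))
```

Comparing the two series side by side shows whether a drop is Discover-only or mirrored in Search.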
Discover traffic has also become more concentrated among publishers. John Shehata, CEO of NewzDash, reported that Discover accounts for about 68% of Google-sourced traffic to news sites. A core update that independently targets this surface raises the stakes for every publisher that relies on the feed.
Read our full coverage: Google releases Discover-focused core update
Alphabet’s fourth-quarter results reveal AI Mode monetization plans
Alphabet reported fourth-quarter 2025 results, with search revenue rising 17% to $63 billion. The earnings call included a first detailed look at how Google plans to monetize AI Mode.
Key facts: CEO Sundar Pichai said that queries in AI Mode are three times longer than traditional searches. Chief Business Officer Philipp Schindler said the resulting ad inventory reaches queries that were “previously difficult to monetize.” Google is testing ads below answers in AI Mode.
Why this is important for SEOs
The monetization details matter more than the revenue headline. Google views AI Mode as additional inventory, not a replacement for traditional search ads. Longer queries create ad placements that didn’t exist when users typed three-word searches. For paid search practitioners, this means new territory for conversational-query campaigns.
The metrics Google celebrated on this call all describe users staying on Google longer. Google has cited longer AI Mode sessions as a growth driver, and its monetization infrastructure follows that logic. The trade-off to watch is referral traffic.
AI Mode extends the seamless path that begins with AI Overviews, as detailed in our coverage last week. The earnings data suggests that Google sees this containment as part of the growth story.
Read our full coverage: Alphabet Q4 2025: AI mode monetization testing and search revenue growth
Mueller rejects serving Markdown to LLM bots
Google Search Advocate John Mueller rejected the idea of serving Markdown files to LLM crawlers instead of standard HTML. He called the concept “a stupid idea” on Bluesky and raised technical concerns on Reddit.
Key facts: One developer described plans to serve raw Markdown to AI bots to reduce token usage. Mueller questioned whether LLM bots would recognize Markdown on a website as anything other than a text file, or whether they would follow its links. He asked what would happen to internal linking, headers, and navigation. On Bluesky, he was more direct, calling the conversion “a stupid idea.”
Why this is important for SEOs
The practice exists because developers assume that LLMs process Markdown more efficiently than HTML. Mueller’s answer treats this as a technical problem, not an optimization. Stripping pages down to Markdown can strip out the very structure bots need to understand the relationships between pages.
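For context, the pattern Mueller pushed back on looks roughly like the sketch below. It is an illustration only: the Flask route, file paths, and bot user-agent list are assumptions, not anyone’s documented setup, and a real app would validate the page parameter.

```python
# Illustration of user-agent content negotiation: raw Markdown for LLM
# bots, rendered HTML for everyone else. Not a recommendation.
from flask import Flask, request, send_file

app = Flask(__name__)

LLM_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot")  # assumed UA substrings

@app.route("/docs/<page>")
def docs(page: str):
    ua = request.headers.get("User-Agent", "")
    if any(bot in ua for bot in LLM_BOTS):
        # Serve raw Markdown to cut token usage -- the idea Mueller rejected.
        # The bot loses navigation, header markup, and internal <a> links.
        return send_file(f"content/{page}.md", mimetype="text/markdown")
    return send_file(f"rendered/{page}.html", mimetype="text/html")
```

The comments mark what Mueller questioned: once the bot receives a bare .md file, the structure tying the site together is gone.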
Mueller’s technical guidance here is consistent with his earlier advice on multi-domain crawling and on crawl drops, fitting a pattern of drawing clear boundaries around bot-specific content formats. He previously compared llms.txt to the keywords meta tag, and SE Ranking’s analysis of 300,000 domains found no connection between having an llms.txt file and LLM citation rates.
Read our full coverage: Google’s Mueller calls Markdown for bots idea ‘a stupid idea’
Google files bugs against WordPress plugins over crawling issues
Google’s Search Relations team said on the Search Off the Record podcast that it has filed bugs against WordPress plugins that generate unnecessary crawlable URLs through action parameters, such as add-to-cart links.
Key facts: Certain plugins create URLs that Googlebot discovers and tries to crawl, wasting crawl budget on pages with no search value. Google filed a bug with WooCommerce and has reported other plugin issues that remain unresolved. The team aimed its reports at plugin developers rather than expecting individual sites to fix the problem.
Why this is important for SEOs
Google’s intervention at the plugin level is unusual. Responsibility for crawl efficiency typically falls on individual websites. Filing bugs upstream suggests the problem is widespread enough that site-by-site fixes can’t solve it.
Ecommerce sites running WooCommerce should audit their plugins for URL patterns that generate crawlable action parameters. Check the crawl stats in Search Console for URLs containing cart or checkout parameters that should not be indexed, or run a quick server-log check as sketched below.
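Here is a minimal sketch of that log check, assuming a combined-format nginx access log; the log path and parameter list are assumptions to adapt to your own stack, and it skips the reverse-DNS step needed to verify genuine Googlebot traffic.

```python
# Count Googlebot requests to URLs carrying action parameters.
import re
from collections import Counter

# Common WooCommerce action parameters; extend for your own plugins.
ACTION_PARAMS = ("add-to-cart", "remove_item", "wc-ajax")

# Pulls the request URL and user agent from a combined-format log line.
LOG_LINE = re.compile(
    r'"(?:GET|POST) (?P<url>\S+) HTTP/[^"]*" \S+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()
with open("/var/log/nginx/access.log") as log:  # assumed path
    for line in log:
        m = LOG_LINE.search(line)
        if not m or "Googlebot" not in m.group("ua"):
            continue  # real verification also needs a reverse-DNS check
        for param in ACTION_PARAMS:
            if param in m.group("url"):
                hits[param] += 1

for param, count in hits.most_common():
    # High counts are candidates for robots.txt Disallow rules.
    print(f"{param}: {count} Googlebot requests")
```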
Read our full coverage: Google’s crawl team has reported bugs against WordPress plugins
LinkedIn shares what worked for AI search visibility
LinkedIn released insights from internal testing on what increases visibility in AI-generated search results. The company reported that non-branded awareness traffic for a subset of B2B topics has fallen by up to 60% across the industry.
Key facts: LinkedIn’s testing found that structured content performs better for AI citations, particularly pages with named authors, visible references, and clear publication dates. LinkedIn is building new analytics to attribute LLM-driven visits as a distinct traffic source and to monitor LLM bot behavior in CMS logs.
Why this is important for SEOs
What caught my attention is how much this overlaps with what the AI platforms themselves are saying. Search Engine Journal’s Roger Montti recently interviewed Jesse Dwyer, head of communications at Perplexity, and the platform’s own guidance on what earns citations largely aligns with LinkedIn’s findings. When the cited source and the citing platform independently reach the same conclusions, you have something beyond speculation.
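LinkedIn’s findings concern visible on-page structure, but a complementary step, assumed here rather than prescribed by LinkedIn, is expressing the same signals as schema.org Article markup. A minimal sketch with placeholder values:

```python
# Emit Article structured data mirroring the signals LinkedIn reported:
# a named author, a clear publication date, and references.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Jane Doe"},  # named author
    "datePublished": "2026-02-03",                      # publication date
    "citation": ["https://example.com/cited-study"],    # visible references
}

# Embed in the page head alongside the visible byline and date.
print(f'<script type="application/ld+json">{json.dumps(article)}</script>')
```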
Read our full coverage: LinkedIn shares what works for AI search visibility
Topic of the week: Google splits the dashboard
Every story this week points to the same insight: “Google” is no longer a single thing to monitor.
Google now announces Discover core updates separately from Search core updates. AI Mode introduces ad formats and checkout features not present in traditional results. Mueller drew a policy line on how bots consume content. Google filed crawling bugs at the plugin level, and LinkedIn is building separate measurement for AI-driven traffic.
A year ago, you could look at one traffic graph in Search Console and get a reasonable picture of your Google performance. That picture now fragments into Discover, Search, AI Mode, and LLM-driven traffic, each with its own ranking signals and update cycles, and the gaps between them are not closing.
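As a rough illustration of that fragmentation, the sketch below buckets a hit by referrer and user agent. The domain and bot lists are assumptions based on commonly observed values, not an official taxonomy, and the comments note the limitation: Search, Discover, and AI Mode visits are largely indistinguishable from the referrer alone.

```python
# Rough illustration: classify one hit by referrer and user agent.
# Assumed lists -- adjust to what actually appears in your logs.
LLM_REFERRERS = ("chatgpt.com", "perplexity.ai", "gemini.google.com")
LLM_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot")

def classify(referrer: str, user_agent: str) -> str:
    if any(bot in user_agent for bot in LLM_BOTS):
        return "llm-bot"       # crawler fetching content for an AI product
    if any(domain in referrer for domain in LLM_REFERRERS):
        return "llm-referral"  # a human clicking out of an AI answer
    if "google." in referrer:
        # Search, Discover, and AI Mode look alike here; only
        # Search Console separates them reliably.
        return "google"
    return "other"

print(classify("https://chatgpt.com/", "Mozilla/5.0"))  # -> llm-referral
```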
Top stories of the week:
This week’s coverage spanned five developments across Discover updates, search monetization, crawl policy, and AI visibility.
Additional resources:
Featured Image: Accogliente Design/Shutterstock