Google says it is exploring updates that could allow sites to specifically opt out of AI-powered search features.
The blog post appeared on the same day the UK Competition and Markets Authority opened a consultation on possible new requirements for Google Search, including controls that would let websites manage how their content appears in AI search features.
Ron Eden, head of product management at Google, wrote:
“Building on this framework and working with the web ecosystem, we are currently exploring updates to our controls to specifically enable websites to opt out of generative AI search capabilities.”
Google has not provided a timeline, technical specifications, or a firm commitment. The post presents this as an exploration, not a product roadmap.
What’s new
Google currently offers several controls for displaying content in search, but none cleanly separates AI capabilities from traditional results.
With Google-Extended, publishers can prevent their content from being used to train Gemini models and for grounding in Vertex AI. However, Google’s documentation states that Google-Extended has no influence on inclusion in Google Search and is not a ranking signal. It controls AI training, not appearance in AI Overviews.
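For reference, Google-Extended is not a separate crawler but a robots.txt user-agent token. A minimal sketch of an entry that applies it site-wide looks like this:

    # Opt content out of Gemini training and Vertex AI grounding;
    # Googlebot crawling and Search indexing are unaffected
    User-agent: Google-Extended
    Disallow: /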
The nosnippet and max-snippet directives apply to AI Overviews and AI Mode, but they also affect traditional snippets in regular search results. Publishers who want to limit their presence in AI features currently lose snippet visibility everywhere.
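For illustration, these are standard robots meta tags; either of the two below removes snippets everywhere, not just in AI features:

    <!-- Suppress all snippets, including text used in AI Overviews and AI Mode -->
    <meta name="robots" content="nosnippet">

    <!-- Cap snippet length instead; per Google's docs, max-snippet:0 is equivalent to nosnippet -->
    <meta name="robots" content="max-snippet:0">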
Google’s post acknowledges that this gap exists. Eden wrote:
“Any new controls must avoid interfering with search in a way that results in a fragmented or confusing experience for users.”
Why this is important
I wrote in SEJ’s eBook SEO Trends 2026 that people would have more influence on the direction of search than platforms. Google’s post suggests the dynamic is playing out.
Publishers and regulators have spent the past year pushing back on AI Overviews. The UK’s Independent Publishers Alliance, Foxglove, and Movement for an Open Web filed a complaint with the CMA last July, demanding the ability to reject AI summaries without being completely removed from search. The US Department of Justice and the South African Competition Commission have proposed similar measures.
The BuzzStream study we reported on earlier this month found that 79% of top news publishers block at least one AI training bot and 71% block retrieval bots that impact AI citations. Publishers are already voting with their robots.txt files.
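To illustrate the pattern (the bot names below are documented OpenAI examples, not crawlers named in the BuzzStream study), a robots.txt file in that vein blocks both a training crawler and a retrieval crawler:

    # Block an AI training crawler
    User-agent: GPTBot
    Disallow: /

    # Block a retrieval crawler that fetches pages for AI search citations
    User-agent: OAI-SearchBot
    Disallow: /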
Google’s post suggests the company is responding to ecosystem pressures by exploring controls it didn’t previously offer.
Looking ahead
Google’s language is cautious. “Exploring” and “working with the web ecosystem” are not product commitments.
The CMA consultation will collect information on potential requirements. Regulatory processes are slow, but they produce results. The EU’s enforcement of the Digital Markets Act has already prompted Google to make changes in Europe.
For now, publishers who want to limit their presence in AI features can use the nosnippet or max-snippet directives, keeping in mind that these also affect traditional snippets. Google’s Robots meta tag documentation covers the current options.
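The same documentation also covers delivering these directives as an X-Robots-Tag HTTP response header, which works for non-HTML files such as PDFs. A sketch for nginx (the location pattern here is illustrative):

    # nginx: apply nosnippet to PDF responses without editing HTML templates
    location ~* \.pdf$ {
        add_header X-Robots-Tag "nosnippet";
    }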
If Google follows through with opt-out controls, the technical implementation will matter. Whether it arrives as a new robots.txt directive, a Search Console setting, or something else will determine how practical it is for publishers to adopt.
Featured image: ANDRANIK HAKOBYAN/Shutterstock