Welcome to our round up of the latest SEO news. These are the stories that have caught our eye.
- Google Search Console does not report half of all search queries
- Does SEO still matter for AI search engines?
- Google uses CTR as a ranking factor
- Gary Illyes warns that AI agents will flood the internet with traffic
- Microsoft Bing pushing IndexNow for faster and more accurate updating of AI search and shopping results
- Google patents using contextual signals in LLM AI search
- Sergey Brin on how AI is transforming search
Google Search Console does not report half of all search queries
Research from ZipTie found that Google Search Console was not reporting on 50% of search queries that drove traffic to websites. The research found that Google Search Console overlooks conversational searches, which are the natural language queries typically used when searching with AI chatbots and voice assistants.
The experiment involved searching Google using the same conversational query on different devices and accounts over several days, then clicking through to the site. The traffic was verified using the analytics tools already in place.
Google Search Console had no record of this traffic.
Following on from these initial findings, ten SEO professionals were asked to repeat the test.
Again, none of the conversational queries appeared in Google Search Console.
The conclusion they draw from this is that Google Search Console applies a minimum search volume threshold before it starts reporting queries.
The author of the report encourages SEOs to:
- Switch from looking at the query tab in Google Search Console to looking at the page tab.
- Look at how the page is performing instead of focusing on an individual keyword level.
- Focus on creating comprehensive content instead of targeting individual keywords.
Does SEO still matter for AI search engines?
This research, also from ZipTie, challenges the claim that AI-driven search has made traditional SEO obsolete.
After analysing 25,000 queries across platforms including Google AI Overviews, ChatGPT and Perplexity, they found that the higher you rank in Google, the more likely you are to appear in AI search across the different platforms. If you rank in position one, you have a 25% chance of being used as a source in Google’s AI Overviews.
They describe how they believe AI search engines operate on a three-step process.
First, they pre-select the top documents based on relevance; next, they extract the pertinent content; finally, they synthesise an answer.
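As an illustration only, that three-step process can be sketched as a toy pipeline. The scoring, extraction and synthesis functions below are deliberately trivial stand-ins, not a claim about how any real AI engine works:

```python
def relevance(query, doc):
    """Crude relevance score: how many query words appear in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in doc.lower())

def extract(query, doc):
    """Return the first sentence of the document that mentions a query word."""
    words = set(query.lower().split())
    for sentence in doc.split(". "):
        if any(w in sentence.lower() for w in words):
            return sentence.strip()
    return ""

def synthesise(passages):
    """Combine the extracted passages into a single answer."""
    return " ".join(p for p in passages if p)

def answer(query, corpus, k=3):
    """Step 1: pre-select top documents; step 2: extract content; step 3: synthesise."""
    ranked = sorted(corpus, key=lambda d: relevance(query, d), reverse=True)[:k]
    return synthesise(extract(query, d) for d in ranked)
```

The point of the sketch is the shape of the pipeline: the documents that survive step one are the only candidates for steps two and three, which is why traditional rankings still matter.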
While AI Overviews often draw from top-ranked sources, they can also include content from lower-ranked pages due to techniques like "query fan-out," which broadens the search scope to related subtopics.
The research concludes that while traditional search engine result pages display the best pages, AI engines instead look for the best answer. SEOs need to start asking “how do I better serve users who have a specific question?”
The traditional SEO foundations remain important; it is the content strategy that needs to adapt. Content should focus on directly answering questions, while being clearly structured and in their words ‘genuinely useful’. If you do that, you increase your chances of ranking in both traditional and AI search.

Google uses CTR as a ranking factor
In January this year, Pandu Nayak, Google’s Vice President of Search, gave evidence in the U.S. Department of Justice v. Google antitrust proceedings. In May, the DOJ published a redacted version of the interview.
Nayak explained that click-through rate (CTR) is used for ranking purposes, something Google has always denied.
In his deposition, he describes how Google uses CTR as well as other engagement metrics, including dwell time, as ranking factors. Content that receives more user interaction, such as a higher CTR, is considered more relevant, making it more likely to rank.

Gary Illyes warns that AI agents will flood the internet with traffic
Google’s Gary Illyes has warned that there will be an increase in web traffic from AI agents and bots. On a recent episode of Google’s Search Off the Record podcast, he voiced concerns about the increase in traffic from AI bots overwhelming websites.
There has been a rise in businesses deploying AI tools. In his words, “everyone and my grandmother is launching a crawler”.
Websites need to anticipate this increase and prepare now. The four areas he identified that they need to focus on are infrastructure, access control, database performance and monitoring.
Firstly, they need to have hosting in place that will not fall over from the increased requests.
The robots.txt file can be used to control which bots you allow to access your site, assuming that AI bots respect its rules.
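For example, a robots.txt along these lines would ask one AI crawler to stay away while leaving the site open to everyone else. GPTBot is OpenAI's published crawler token; which bots you block is a policy choice, and compliance is voluntary:

```txt
# Block one AI crawler by its published user-agent token
User-agent: GPTBot
Disallow: /

# Allow all other bots
User-agent: *
Allow: /
```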
Database performance will need to be optimised, assuming that the site already has a robust database in place. Queries should be optimised and caching implemented to reduce strain on the server.
It will become important to differentiate between legitimate AI agents, crawlers and those best described as malicious bots. This will most likely mean spending time looking at server log files and other analytic solutions.
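As a sketch of what that log analysis might look like, the snippet below tallies requests per user-agent string from a combined-format access log, so a crawler suddenly responsible for a large share of traffic stands out. The regex and the sample agents are illustrative assumptions:

```python
import re
from collections import Counter

# Matches the tail of a combined-format access log line:
# "REQUEST" STATUS BYTES "REFERER" "USER-AGENT"
LOG_LINE = re.compile(r'"[^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"$')

def bot_traffic_summary(lines):
    """Tally hits per user-agent string so crawler spikes stand out."""
    counts = Counter()
    for line in lines:
        match = LOG_LINE.search(line)
        if match:
            counts[match.group("agent")] += 1
    return counts
```

In practice you would run something like this over the raw access log and compare the top user agents against the lists of crawler tokens that the major AI vendors publish.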
In short, sites will need a strong technical foundation.
Microsoft Bing pushing IndexNow for faster and more accurate updating of AI search and shopping results
Microsoft Bing is pushing IndexNow, a protocol they developed in collaboration with Yandex that allows website owners to notify search engines when content has been added, updated or removed. Search engines that have implemented IndexNow include Microsoft Bing, Yandex and Seznam. Google have tested it, but so far have not said that they will implement it.
Pinging search engines to let them know when a page has been updated should speed up changes appearing in search engine results, as you no longer need to wait for search engines to recrawl and reindex the content.
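The ping itself is a simple HTTP submission. A minimal sketch in Python, assuming the shared api.indexnow.org endpoint and using an illustrative host, key and URL list (a real submission needs your own verified key file on the site):

```python
import json
import urllib.request

# Shared endpoint; participating engines exchange submissions with each other.
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for an IndexNow bulk URL submission."""
    return {"host": host, "key": key, "urlList": list(urls)}

def ping_indexnow(host, key, urls):
    """POST the changed URLs to the IndexNow endpoint; returns the HTTP status."""
    body = json.dumps(build_indexnow_payload(host, key, urls)).encode("utf-8")
    req = urllib.request.Request(
        INDEXNOW_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Because the endpoint is shared, one submission should reach every engine that has adopted the protocol, rather than requiring a separate ping per engine.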
Google patents using contextual signals in LLM AI search
Google have filed a patent describing how an AI assistant could use contextual signals to shape its answers.
This advancement allows Google to consider various factors such as a user's location, device status, prior interactions and even environmental conditions to tailor search results more precisely. For instance, if a user has already purchased movie tickets and then searches for the movie title, the system might prioritise reviews or trivia over showtimes, recognising that the user is likely seeking supplementary information rather than additional tickets.
In many ways this feels like AI search is catching up with traditional search. It is a major step in their evolution.
Sergey Brin on how AI is transforming search
Speaking at All-In Live in Miami, Sergey Brin, the co-founder of Google, described how AI is changing search from a process of retrieving links to one of synthesising answers from thousands of search results. Research that would previously have taken someone hours or days is now done in seconds. This is a major change in how people interact with information.