Whether you love or hate AI, it’s here to stay and will continue affecting how users search and receive information throughout 2025.

Between Gemini 2.0, AI Overviews, ChatGPT and AI assistants, there's a lot going on, and this technology is being developed at a steady pace to improve the user experience. And while I can't predict the future (wouldn't that be nice?), I believe AI will have a significant impact on search in 2025 as it continues to improve its understanding of user queries.

Conversational search

As AI develops increasingly 'human-like' responses, we find ourselves adapting alongside it: our search queries are becoming more conversational, shifting from keywords to natural-language interactions. Instead of typing 'best holiday destinations', you might ask, 'where are the best places to travel to in September for a foodie?'. You receive personalised information quickly and can then expand on it. For example, if the AI lists destinations all over the world, you could write 'narrow these down to Europe only' and be given choices that are perfectly relevant to your preferences. Users can ask complex, multi-step questions, and the AI can refine and adapt its answers in real time, almost like a dialogue.

This style of search suits Google's AI Overviews particularly well: specific questions are answered in a conversational way, drawing on information from multiple sources to provide comprehensive results.

[Image: AI Overviews example]

Voice search

The easiest way to engage with conversational search is through voice, whether via an AI assistant such as Siri or Alexa or Google's 'search by voice' function. Google states that this type of search works well with its featured snippets, which are displayed concisely at the top of the SERP. According to analysis from Backlinko, 40.7% of all voice search answers come from a featured snippet, while 74.9% of Google Home results come from a page ranking in the top three for that keyword. Users can easily get information via voice search as well as use their AI assistant to update their calendar, order shopping or set a timer hands-free. Even our pets have figured out how to use AI, with one naughty parrot reportedly using Alexa to buy random items online!
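Featured snippets favour content that answers a question directly, and one common way to make question-and-answer content machine-readable is schema.org FAQPage markup. The Python sketch below is only an illustration – the helper name and the sample question are my own assumptions, not anything Google specifies for voice search.

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs.

    Illustrative helper; the function name is an assumption for this sketch.
    """
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

# Hypothetical example content, not real recommendations.
markup = faq_jsonld([
    ("Where are the best places to travel in September for a foodie?",
     "Popular September food destinations include San Sebastian and Bologna."),
])
print(json.dumps(markup, indent=2))
```

The resulting JSON-LD would normally be embedded in the page inside a `<script type="application/ld+json">` tag, where search engines can parse it alongside the visible content.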

[Image: where voice search results tend to rank]

Source: Backlinko

Using ChatGPT to search

If you've had enough of Google, you can bypass it altogether and instead use ChatGPT to search the web for you, providing content, links and answers in a conversational manner. Not only can users gather information naturally, they can also skip visiting websites entirely, as ChatGPT is even capable of booking restaurants and events for you via the new Operator feature.

What does this mean for search?

Searching with AI is like a shiny new toy: it gets regular improvements and new capabilities, and users flock to try it out and see what it can do. It might mean people are searching less on traditional search engines and instead opting to use AI like ChatGPT to answer their queries… But in reality, Google still holds a near-monopoly on search, and it doesn't look to be easing up anytime soon, with around 5 billion users compared to ChatGPT's 300 million and roughly 1,000% more usage than its nearest AI competitors… for now…

When using Google to search, AI overviews are at the forefront of the SERP, so what do we do about it? Businesses need to decide whether they want to compete with or optimise content for AI overviews, and I’d love to get more clarity in 2025 on how to do both.

[Image: website visits, Google vs AI]

Multimodal search

Google introduced Gemini 2.0 in December 2024, which included advancements in multimodal search and in how the AI understands information across text, image, video and audio. With these improvements, users will be able to rely on a variety of input options for their queries, such as uploading an image of an outfit to find out where to shop for the clothes. Lykdat is an AI-powered fashion search engine that uses visual images in exactly this way, helping users find and compare products. Google Lens itself is used for nearly 20 billion visual searches every month.

And for the video editors out there: Adobe Premiere Pro has improved its media intelligence AI so it can recognise and find video clips based on a written prompt. It can identify objects, locations, camera angles and spoken words, as well as any metadata attached to the video files (dates, camera types).

[Image: Lykdat AI fashion tool]

What does this mean for search?

There should be an increased focus on multimodal content when optimising pages, ensuring it is easily accessible, engaging and relevant across formats. For example, images should be high quality with detailed metadata, and videos should be captivating and informative, with closed captions for further understanding. We may see SERPs become busier with different types of output, with (hopefully) relevant images and videos based on search queries. Embracing multiple formats can take content to the next level, whether you include a video explanation with your blog post or create a clear and helpful infographic. While this may be a daunting task for those who aren't confident in graphics or editing, there are AI tools that can help with brainstorming visual ideas, such as ClickUp Brain, and tools that can make the graphics for you, like Piktochart – see what it managed to mock up below.

[Image: AI-generated infographic]

Deep contextual understanding

As AI gains a deeper contextual understanding of user search intent, preferences and past interactions, we can expect a more personalised search experience, with information tailored to our specific needs.

While AI results may become more intuitive and efficient, a study from the University of Washington suggests there is a trade-off in the reliability of generative AI. When we use a search engine, we can browse to find the sources we trust before gathering information. Putting all our trust in an AI to come up with one answer is not as reliable – especially as it's not even always right!

What does this mean for search?

AI's deeper contextual understanding is expected to transform search into a more personalised, efficient and intuitive experience, while also presenting new challenges that need to be addressed, such as factual inconsistencies, reliability and biases. To adapt our content alongside this development, businesses should target long-tail keywords and content that answers specific queries and can easily be interpreted by AI and users, with an emphasis on understanding and mapping user intent. For example, analyse keywords and conversational queries, decide whether they are informational, transactional or navigational (as well as their micro intent), and create content accordingly.
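To make the intent-mapping step concrete, here is a deliberately simple rule-based sketch in Python. The cue words are assumptions chosen for illustration, not an established taxonomy, and real intent analysis would use keyword and SERP data rather than substring matches.

```python
# Illustrative intent tagger; the cue lists below are assumptions,
# not an official classification scheme.
INTENT_CUES = {
    "transactional": ("buy", "order", "book", "price", "deal"),
    "navigational": ("login", "homepage", "near me", "website"),
    "informational": ("how", "what", "why", "best", "guide"),
}

def classify_intent(query: str) -> str:
    """Return the first intent whose cue words appear in the query."""
    q = query.lower()
    for intent, cues in INTENT_CUES.items():
        if any(cue in q for cue in cues):
            return intent
    return "informational"  # sensible default bucket

queries = [
    "where are the best places to travel to in September for a foodie?",
    "buy hiking boots size 9",
    "easyjet login",
]
for q in queries:
    print(q, "->", classify_intent(q))
```

Even a toy classifier like this makes the editorial question obvious: an informational query wants a guide, a transactional one wants a product page, and a navigational one mostly wants to be left alone.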

[Image: search intents and micro intents]

Source: Search Engine Land

Final thoughts

Search is transforming alongside AI as the technology improves, and it will take time to figure out the nuances of creating content that works with these developments. While they are meant to improve the user experience and make gathering information or completing tasks easier, there are still limitations to using AI, from the reliability of information to over-generalised answers and misinterpreted input. Jumping on the bandwagon to learn and adapt to the new search landscape as it changes (for better or worse) will be important for 2025.

Meet the author ...

Jasmine Gambrell

SEO Executive

With an ever-growing interest in digital marketing, Jasmine is very enthusiastic about SEO and content writing. She is passionate about all things online, spending much of her ...