Google overhauls search with LLMs at the core
Google is rethinking its entire search structure from the ground up, incorporating large language models as fundamental components rather than add-ons to its existing architecture.

Google has initiated a fundamental rethinking of its search technology, shifting toward a search architecture where large language models (LLMs) will play a central role rather than serving as add-on features.
This transformative approach came to light through Department of Justice court documents published on May 2, 2025, which contain testimony given by Google Search engineer Pandu Nayak on January 31, 2025. The disclosure reveals Google's most significant search restructuring since the engine's inception.
According to confidential documents from the ongoing Google Search antitrust litigation, Google is completely redesigning its approach to search results. "Google is currently re-thinking their search stack from the ground-up with LLM taking a more prominent role. They are thinking about how fundamental components of search (ranking, retrieval, displaying SERP) can be reimagined given the availability of LLMs," stated Pandu Nayak during his January testimony.
This restructuring represents a dramatic evolution from Google's traditional approach to search, which has historically relied on various signals combined into a single score to determine document ranking. Google has gradually integrated machine learning into its search algorithms, beginning with the incorporation of BERT-based DeepRank ML models, then moving to RankEmbed, and now toward even deeper LLM integration.
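As a rough illustration of what "combining signals into a single score" can look like in practice, the sketch below uses a simple weighted sum. The signal names, weights, and linear combination are hypothetical and do not represent Google's actual formula, which has not been disclosed.

```python
# Minimal illustration of combining several ranking signals into one score.
# Signal names and weights are hypothetical; Google's actual combination is not public.
from typing import Dict

SIGNAL_WEIGHTS: Dict[str, float] = {
    "quality": 0.4,         # e.g. a document-quality signal such as Q*
    "engagement": 0.3,      # e.g. a user-engagement signal such as Navboost
    "semantic_match": 0.2,  # e.g. an embedding-based signal such as RankEmbed
    "link_authority": 0.1,  # e.g. a link-graph signal such as PageRank
}

def combined_score(signals: Dict[str, float]) -> float:
    """Combine per-document signal values (assumed normalized to 0..1) into a single score."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0) for name in SIGNAL_WEIGHTS)

# Example: score two candidate documents and rank them.
docs = {
    "doc_a": {"quality": 0.9, "engagement": 0.4, "semantic_match": 0.7, "link_authority": 0.6},
    "doc_b": {"quality": 0.6, "engagement": 0.8, "semantic_match": 0.5, "link_authority": 0.3},
}
ranking = sorted(docs, key=lambda d: combined_score(docs[d]), reverse=True)
print(ranking)
```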
Technical details of Google's current search structure
The court documents shed light on Google's current search architecture, revealing several key technical components:
- Signal processing: Google utilizes over 100 raw signals to determine search rankings. Some signals are developed using machine learning models, while others are traditional signals. These include:
  - Q* (pronounced "Q star"), Google's measure of quality of a document
  - Navboost, a measure tracking user engagement by location and device type, using data from the most recent 13 months
  - RankEmbed, one of Google's primary LLM-trained signals
  - PageRank, one of Google's original ranking signals that continues to be incorporated into quality signals
- RankEmbed technology: RankEmbed is a dual encoder model that embeds both the query and the document into an embedding space, considering semantic properties of queries and documents alongside other signals. Retrieval and ranking are based on the dot product, a similarity measure in the embedding space (see the sketch after this list). While fast and high-quality for common queries, RankEmbed can perform poorly on tail queries. Google trained RankEmbed on a single month of search data.
- Data collection: Nayak testified that even "hundreds of query/result combinations would allow for an approximation of certain Google signals," potentially enabling competitors to begin recreating aspects of Google search. He noted that Google has been trending toward using less data for ML models (90 days, 60 days, etc.) while still prioritizing product quality.
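The dual-encoder pattern described for RankEmbed, embedding queries and documents separately and ranking by dot product, can be sketched in simplified form. The embed() function below is a stand-in for a trained encoder; RankEmbed's actual model, features, and training data are not public.

```python
# Simplified dual-encoder retrieval: embed the query and each document separately,
# then rank documents by dot product in the shared embedding space.
# embed() is a placeholder for a trained text encoder; RankEmbed itself is not public.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Placeholder encoder: a deterministic pseudo-embedding derived from the text hash.
    A real system would use a trained neural dual encoder."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)  # normalize so the dot product behaves like cosine similarity

def rank_documents(query: str, documents: list[str]) -> list[tuple[str, float]]:
    q = embed(query)
    scored = [(doc, float(np.dot(q, embed(doc)))) for doc in documents]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

docs = ["how to train a dual encoder", "best pizza near me", "semantic search with embeddings"]
for doc, score in rank_documents("embedding-based retrieval", docs):
    print(f"{score:+.3f}  {doc}")
```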
LLMs and future search capabilities
The documents indicate that Google is exploring how LLMs can enhance multiple portions of its search technology:
- Query interpretation and understanding user intent
- Summarizing and presenting results more effectively
- Potential reimagining of the fundamental search experience, including ranking, retrieval, and SERP display
One significant consideration mentioned in the testimony is "computation time of LLMs, depending on the use case," suggesting that performance optimization remains a challenge for widespread LLM implementation in search.
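As a purely speculative sketch of the kind of query-interpretation step the testimony alludes to, the snippet below uses a hypothetical llm_complete() helper standing in for any LLM API and times the call, since computation time is the constraint Nayak highlights. None of this reflects Google's internal implementation.

```python
# Speculative sketch: using an LLM to interpret a conversational query before retrieval.
# llm_complete() is a hypothetical stand-in for an LLM API; here it returns a canned
# response so the example runs. A real system would call a hosted or local model.
import json
import time

def llm_complete(prompt: str) -> str:
    return json.dumps({
        "intent": "informational",
        "keywords": ["dual encoder", "training"],
        "rewritten_query": "how to train a dual encoder retrieval model",
    })

def interpret_query(user_query: str) -> dict:
    prompt = (
        "Rewrite the user's search query as JSON with keys "
        "'intent', 'keywords' (a list), and 'rewritten_query'.\n"
        f"Query: {user_query}"
    )
    start = time.perf_counter()
    raw = llm_complete(prompt)
    # The testimony flags LLM computation time as a practical constraint, so track latency.
    latency_ms = (time.perf_counter() - start) * 1000
    interpretation = json.loads(raw)
    interpretation["latency_ms"] = latency_ms
    return interpretation

print(interpret_query("what's the best way to train a dual encoder for search?"))
```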
Impact on user query patterns
The shift toward LLM-based search coincides with changing user behavior. According to recent industry data through May 2024, search query patterns have been evolving significantly since the introduction of consumer-facing LLMs in late 2022.
Users are increasingly submitting complex, conversational queries rather than simple keyword searches, and they expect search engines to understand nuanced questions and provide comprehensive answers, a trend that Google's LLM integration appears positioned to address.
Market implications
For marketing professionals, Google's restructuring of its search technology presents both challenges and opportunities:
The core change in Google's approach may significantly alter how websites are ranked and discovered. Marketers will need to adapt content strategies to align with how LLMs interpret and rank information. The focus may shift from traditional keyword optimization toward comprehensive, authoritative content that addresses user needs in context.
Traditional SEO tactics may need substantial revision as signals like RankEmbed place greater emphasis on semantic understanding rather than keyword matching. Content depth, accuracy, and relevance may become even more critical ranking factors.
Organizations that gather and analyze search data for competitive intelligence may need to revise their methodologies, as the signals that determine rankings evolve beyond traditional metrics into more complex LLM-based evaluations.
Google's historical approach to search evolution
The court documents reveal Google's methodical approach to search innovation. Pandu Nayak testified that Google's traditional approach to ranking began with the Okapi BM25 ranking function, which estimated document relevance to search queries. The company subsequently transitioned toward machine learning with RankEmbed, RankBrain, and DeepRank technologies.
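For context, Okapi BM25 is a standard lexical ranking function that scores a document by summing, over the query terms, an inverse-document-frequency weight damped by term frequency and document length. The compact reference implementation below follows the textbook formulation, not any Google-internal variant.

```python
# Reference Okapi BM25 scorer (textbook formulation, not Google's internal variant).
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, corpus, k1=1.5, b=0.75):
    """Score one document (a token list) against query terms over a small tokenized corpus."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)           # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1.0)  # smoothed inverse document frequency
        freq = tf[term]
        denom = freq + k1 * (1 - b + b * len(doc_terms) / avgdl)
        score += idf * freq * (k1 + 1) / denom
    return score

corpus = [["google", "search", "ranking"], ["pizza", "recipe"], ["bm25", "ranking", "function"]]
print(bm25_score(["ranking", "function"], corpus[2], corpus))
```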
Nayak noted that Google "avoids simply 'predicting clicks' because clicks are easily manipulated and are a poor proxy for enhancing user experience." Instead, Google aims to combine traditional signals with "predicted" signals from ML models to deliver improved search outcomes.
The documents also describe Google's approach to search quality control. When faced with problematic results, such as Holocaust denial content appearing prominently for related queries, Google engineers developed new signals to prioritize reliable, high-quality results.
Timeline of Google's search transformation
- Pre-2023: Google's traditional approach to ranking used the Okapi BM25 ranking function alongside PageRank
- January 31, 2025: Pandu Nayak testifies about Google's plans to reimagine search with LLMs
- May 2, 2025: DOJ publishes court documents revealing Google's search restructuring plans
- May 13, 2025: Industry specialists begin analyzing the implications of Google's LLM-based search redesign
- May 16, 2025: Search Engine Roundtable publishes details about Google "rethinking their search stack from the ground-up with LLM taking a more prominent role"
- May 18, 2025: Further analysis confirms the fundamental nature of Google's search architecture changes