Why Google removed the parameter (&num=100)
Google removed the &num=100 parameter in September 2025. The change was made without an official public announcement and has mainly affected SEO professionals and AI data scrapers, who used the parameter to request 100 search results per page. The default maximum of 10 results per page remains in effect.
Image Source: https://cdn.prod.website-files.com/657933cc3a1bbf4d8e549502/68d4e783e311467a1bfd4204_google-num100-google-search-console.png
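For context, this is roughly how the parameter was used: appending num=100 to a standard search URL asked Google for 100 organic results in a single page load. Below is a minimal sketch of the two URL shapes; the query is just an illustrative example, and since the change the num value is reportedly ignored, so deeper positions now have to be reached through the start offset instead.

```python
from urllib.parse import urlencode

query = "example search query"  # illustrative query

# Before September 2025: one URL could request 100 results in a single page.
old_style = "https://www.google.com/search?" + urlencode({"q": query, "num": 100})

# Now: each page returns the default 10 results, and deeper positions are
# reached by paging through the start offset (0, 10, 20, ...).
second_page = "https://www.google.com/search?" + urlencode({"q": query, "start": 10})

print(old_style)    # https://www.google.com/search?q=example+search+query&num=100
print(second_page)  # https://www.google.com/search?q=example+search+query&start=10
```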
Why Google Removed the Parameter
Although the removal was unannounced, experts suggest several possible reasons:
- Preventing automated scraping: The parameter was a key method for SEO tools and AI services to efficiently scrape large amounts of search results. By removing it, Google has made large-scale scraping more difficult and resource-consuming, protecting its infrastructure from heavy automated loads.
- Improving data accuracy: The parameter caused inflated impression data in Google Search Console because it enabled bots to simulate more deep-page results than a human user would typically see. Its removal now provides more accurate reporting that reflects actual user behavior, as people rarely click beyond the first page of results.
- Encouraging the use of official tools: Google likely aims to direct developers and businesses toward its official APIs, such as the Custom Search API, for gathering bulk search data.
How the change has impacted SEO and AI
The removal has sent a ripple through the SEO and AI industries:
- Higher costs for data: Services that rely on scraping Google's search results now need to send 10 times as many requests to gather the same amount of data, as illustrated in the sketch after this list. This has increased their operational costs, and many have already announced new pricing models.
- Skewed reporting metrics: Many SEO platforms reported sudden drops in impression data and keyword visibility for the websites they monitor. This does not necessarily mean a loss of traffic but rather a recalibration of metrics that no longer include deep-page ranking data.
- Shifted focus in strategy: Many SEO experts are now emphasizing the importance of ranking in the top 10-20 results, as visibility beyond that is now harder to track and is less likely to generate actual user traffic.
- Disrupted AI model training: Large language models (LLMs) that rely on scraping Google for real-time information have had their access to "long-tail" results significantly curtailed, forcing them to adapt their data-gathering methods.
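To make the "10 times as many requests" point concrete, here is a rough sketch of the change in request pattern. The fetch_serp_page helper is hypothetical; it stands in for whatever routine a given tool uses to retrieve and parse one page of results.

```python
def top_100_before(fetch_serp_page, query):
    # Previously: a single request with num=100 could return the top 100 results.
    return fetch_serp_page(query, num=100, start=0)

def top_100_after(fetch_serp_page, query):
    # Now: ten requests, stepping the start offset 0, 10, ..., 90,
    # each returning the default page of 10 results.
    results = []
    for start in range(0, 100, 10):
        results.extend(fetch_serp_page(query, num=10, start=start))
    return results
```

Multiplied across every tracked keyword, that tenfold request count is where the reported increases in infrastructure cost come from.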
What is the workaround?
There is no official or secret workaround to restore the &num=100 parameter. For the average user, the default of 10 results per page is now the standard, and they must click through pages manually. For businesses that require more results, the primary methods for gathering bulk data are now official APIs and rank-tracking tools that have been updated to adapt to the new limitations.
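One such official route is Google's Custom Search JSON API (Programmable Search Engine). It is itself limited to 10 results per request, so it follows the same pagination pattern rather than restoring the old behavior. A minimal sketch, assuming you have created an API key and a search engine ID (the placeholder values below are yours to supply):

```python
import requests  # third-party HTTP client (pip install requests)

API_KEY = "YOUR_API_KEY"          # placeholder: key from the Google Cloud console
SEARCH_ENGINE_ID = "YOUR_CX_ID"   # placeholder: Programmable Search Engine ID

def custom_search_page(query, start=1, num=10):
    # The Custom Search JSON API returns at most 10 results per request,
    # so deeper positions are reached by advancing the 1-based start index.
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID,
                "q": query, "start": start, "num": num},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

# Example: collect up to the top 100 results, ten per request.
all_items = []
for start in range(1, 100, 10):   # start = 1, 11, 21, ..., 91
    all_items.extend(custom_search_page("example query", start=start))
```

Note that the API is metered (a small free daily quota, then paid per block of queries), so bulk collection still carries a cost; it simply moves onto an official, supported channel.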
Why is this change significant?
The removal of &num=100 has major implications for the SEO and digital marketing industry, as well as AI models that rely on web scraping.
- Higher costs for SEO tools: Rank-tracking tools like Semrush and Ahrefs relied on the &num=100 parameter to efficiently collect search data. They now have to perform 10 separate requests to get the top 100 results, which significantly increases their infrastructure costs and slows down their data collection.
- Skewed reporting data: Websites that previously tracked their keyword rankings beyond the first page saw a sudden drop in impression counts and keyword visibility in Google Search Console reports. This is because SEO tools can no longer efficiently sample results deep within the search engine results pages (SERPs).
- Impact on AI models: Large language models (LLMs) and other AI systems that scrape Google for training data have also been affected. The inability to pull 100 results at once has made it 90% less efficient to access the "long tail" of the internet, potentially reducing the diversity of information these models can quickly access.
- Less data for users and researchers: Anyone doing manual competitive analysis or large-scale research that involves scraping the top 100 results now has to manually navigate through multiple pages, or find a different, likely more expensive, data source.
What Google gained from this change
The move appears to be an effort by Google to limit scraping and get more accurate data on real user behavior.
- Reduces scraping: By making it more difficult and expensive to scrape search results at scale, Google reduces the load on its servers and gains more control over its data.
- Improves data accuracy: The removal forces marketers to rely on more realistic metrics that reflect how real people use search. Since most users rarely click past the first page, the previous data on deep-page rankings was often not a good indicator of actual traffic.
Anand Samudra