Adapting to Google’s &num=100 Update: An SEO Specialist’s Story
- Tim Mueller

- Oct 19
- 11 min read

A Morning of SEO Chaos
I remember the morning it all started. Logging into my analytics and rank tracker, I was shocked – keyword impressions had plummeted and many rankings beyond page one simply vanished from my reports. It looked like an SEO disaster. At first, I feared a Google algorithm update had torpedoed my hard-earned rankings. But the traffic and conversions on our site were steady. Something was off in the data, not necessarily in reality. This puzzling situation sent me digging for answers.
The Mystery of the Missing Results
By midday, I found the culprit: Google had quietly disabled the &num=100 URL parameter in mid-September 2025. For years, this hidden setting allowed SEOs and tools to load 100 search results on one page, instead of the usual 10 per page. I learned that using &num=100 was a common trick to retrieve the Top 100 Google results in a single request. Suddenly, that trick stopped working – Google would show only the first 10 results (page 1, maybe page 2) and no more. To see the top 100 now, tools (and I) would have to fetch 10 pages of results (10 results per page) manually or via multiple queries. No wonder my rank tracker was blind beyond the first page – a silent change broke the way it gathered data, causing disruptions across many SEO tools overnight.
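To make the shift concrete, here’s a rough Python sketch of the request pattern. The query and URLs are purely illustrative: real rank trackers rely on their own crawling infrastructure or third-party SERP APIs rather than fetching Google result pages directly, but the 1-request-versus-10-requests math is the same.

```python
# Illustrative only: the old single-request pattern vs. the new paginated one.
from urllib.parse import urlencode

query = "keyword research tools"  # hypothetical example query

# Before mid-September 2025: one request could return up to 100 results.
old_url = "https://www.google.com/search?" + urlencode({"q": query, "num": 100})

# After the change: roughly 10 results per page, so covering positions 1-100
# takes ten paginated requests using the start= offset parameter.
new_urls = [
    "https://www.google.com/search?" + urlencode({"q": query, "start": offset})
    for offset in range(0, 100, 10)
]

print(old_url)        # 1 request for the top 100 (no longer honored)
print(len(new_urls))  # 10 requests to cover the same positions
```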
I was relieved to discover my actual rankings hadn’t tanked; it was Google that had “killed” the 100-results view, changing how data is collected rather than how sites rank. In other words, rankings didn’t change – our measurement tools broke. Still, I needed to understand why Google would do this and what to do next.
Why Google Killed the 100-Results Page
Google didn’t provide an official explanation, but the SEO community floated several plausible reasons for axing the 100-results option:
Curbing Scrapers & Reducing Load: The 100-results setting was heavily used by bots and SEO scrapers rather than real users. Google likely saw this as an efficiency issue – by capping results at 10 per page, scrapers now have to make many more requests, reducing strain on Google’s servers and discouraging easy bulk data grabs.
Cleaning Up Data Accuracy: Those bots loading pages 2–10 were inflating metrics. SEO trackers using &num=100 generated artificial impressions in Google Search Console by “seeing” results that real users rarely scroll to. Removing the 100-results view instantly cut out those phantom impressions, making reported data more reflective of actual user behavior.
Aligning with User Experience: Hardly anyone clicks through 10 pages of results anymore. Google has moved toward continuous scroll on desktop and mobile, loading more results only as needed. Serving only 10 results at a time matches modern UX and keeps metrics focused on what users actually see. This change also makes it harder for AI scrapers to mine deep results in one go, which fits Google’s broader strategy of controlling data access.
In short, Google’s change was about efficiency and realism – cutting out the noise of bot-driven data and reflecting the fact that page 1 is where nearly all the action happens. As an SEO specialist, I grudgingly understood the rationale. But it left me with a new challenge: how to adjust my work now that the easy window into the top 100 SERP had closed.
The Impact: Data Gaps and Shaken Metrics
Once I knew what happened, I could make sense of the wild swings in my reports. I wasn’t alone – the removal of &num=100 caused sharp drops in Search Console impressions and reported keyword rankings across the industry. One analysis found that 87.7% of sites saw their impressions decline in Google Search Console, and 77.6% of sites lost some of the “unique ranking terms” they had been reporting. In other words, overnight it looked like our site was ranking for far fewer keywords, simply because anything beyond 10th position stopped showing up in the data. Short and mid-tail keywords (usually higher-volume terms that often ranked on page 3+) saw the biggest visibility drop in reports.
On the flip side, my average position metric suddenly improved dramatically – not because we actually jumped in rankings, but because all those page-3 and page-9 rankings weren’t counted anymore, mathematically skewing the average higher. It was a strange sight: impressions way down, average position way up. If I hadn’t known better, I’d think our SEO was simultaneously failing and magically improving! In reality, clicks and traffic stayed about the same, since real users were mostly clicking page-1 results all along. This was confirmed by others too – many noted that click trends remained stable, indicating the change was in measurement, not in user behavior.
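Here’s a toy example, with invented positions, of why the average moves the way it does once everything past the top 10 disappears from the report:

```python
# Made-up positions for ten tracked keywords on the same site.
positions = [3, 7, 9, 14, 22, 38, 54, 71, 88, 95]

avg_all = sum(positions) / len(positions)          # deep rankings counted
top_ten = [p for p in positions if p <= 10]
avg_top_ten = sum(top_ten) / len(top_ten)          # only page-1 rankings counted

print(f"Average position, all rankings reported: {avg_all:.1f}")      # 40.1
print(f"Average position, top 10 only:           {avg_top_ten:.1f}")  # 6.3
```

Same site, same rankings, yet the headline number jumps from roughly 40 to roughly 6 purely because of what gets counted.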
My rank tracking software also struggled. Initially, it simply stopped showing any rankings past #10 for a lot of keywords. Like many SEO tools, it had relied on the one-query method to grab the top 100. With that gone, some tools temporarily showed only Top 10 or Top 20 results, missing everything on page 3 and beyond. The more I read industry chatter, the clearer it became: our tools needed major adjustments. Some rank trackers had data gaps and slower updates as they scrambled to make 10× more requests per keyword to gather the same data. It was as if the entire SEO tracking world hit a speed bump and had to downshift.
Rethinking My SEO Strategy
Realizing this was a “measurement shock, not a rankings drop”, I shifted my approach to both tracking and reporting. The first step was communicating these changes to my team and clients. I added annotations in Google Analytics and Search Console for mid-September 2025, explaining that a drop in impressions after that date doesn’t equal a drop in actual performance – it’s due to Google’s change in how data is reported. I found myself repeatedly saying, “Don’t panic, our buyers aren’t gone – the bots are.”
Going forward, I decided to focus our KPIs on what truly matters: clicks, conversions, and page-1 rankings. Impressions are a useful trend indicator, but now that a chunk of them has vanished from reports, they’re less reliable for bragging rights. I encouraged our team to look more at metrics like click-through rate and the number of keywords in the Top 10, rather than obsessing over total impressions or average position, both of which were now skewed. After all, a page-1 ranking that actually brings traffic is worth more than ten page-5 rankings that never get seen.
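If you want to pull those KPI numbers yourself rather than wait for tool dashboards to settle, here’s a hedged sketch against the Search Console API’s searchanalytics.query method. The credentials file, property URL, and date range are placeholders, and a real script would page through results rather than stop at one request.

```python
# A sketch, not a drop-in script: count queries already averaging a top-10
# position and sum their clicks, which makes a steadier KPI than raw impressions.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder credentials file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

response = gsc.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2025-09-15",
        "endDate": "2025-10-15",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()

rows = response.get("rows", [])
top10 = [r for r in rows if r["position"] <= 10]

print(f"Queries reported:           {len(rows)}")
print(f"Queries averaging top 10:   {len(top10)}")
print(f"Clicks from top-10 queries: {sum(r['clicks'] for r in top10):.0f}")
```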
At the same time, I didn’t want to ignore everything beyond page 1. This update was a wake-up call to prioritize quality over quantity in our SEO efforts (just as Google intended). I started reviewing our keyword list and trimming out truly irrelevant “vanity” keywords that we had been tracking only because we could. With limited visibility into the long tail now, it made sense to concentrate on the keywords that align with our content and business goals – especially those on page 2 that, with some TLC, could creep onto page 1 and actually deliver traffic.
Crucially, I had to educate stakeholders that our reporting baseline had changed. Mid-September became a hard line in our charts – comparing before and after is apples to oranges. We reset expectations by setting new benchmarks for what good performance looks like under this new reality. For example, if a stakeholder saw that our “number of ranking keywords” dropped from 500 to 200, I’d explain that losing those 300 keywords from the report doesn’t mean we lost rankings, only that we’re no longer counting all the page-3 placements that never brought visitors anyway.
Why Tracking the Full Top 100 Still Matters
One big question lingered for me: Should I even bother tracking rankings past page 1 anymore? Some experts argued that positions 50–100 rarely bring meaningful traffic or conversions. I even saw suggestions that this change was a good opportunity to stop chasing “vanity metrics” and focus only on page-1 results. There’s truth to that – most clicks happen in the top 10 results, so one could save effort (and tool costs) by watching just the first page. But as an SEO specialist, I know that what’s invisible today could be tomorrow’s victory. From my perspective, tracking the full Top 100 still provides valuable insights for digital marketing. Here’s why:
Complete Keyword Visibility: Tracking deeper rankings gives a fuller picture of our search presence. It lets us see all the queries we appear for, not just the ones we win page 1 on. In fact, when Google removed deep results, 77.6% of sites reported losing unique ranking terms in Search Console. That shows how much of our keyword footprint was hiding beyond page 1. For measuring brand visibility or SEO reach, it’s important to know if we rank #12, #35 or #82 for relevant terms – even if those don’t yet bring traffic. Broad visibility metrics (like share-of-voice or total ranking keywords) consider positions beyond page one and give context on our overall SEO performance.
“Striking Distance” Opportunities: Page-2 rankings (positions ~11–20) and even page-3 rankings can be seen as opportunities in waiting. If I’m unaware we sit at #15 for a high-value keyword, I might miss the chance to give it a push. By tracking beyond the top 10, I can identify these striking-distance keywords (a quick way to filter for them is sketched after this list) and target them with optimizations (better content, more links) to potentially bump them onto page 1. Incremental improvements from #60 to #30 or #15 to #9 won’t show up if you aren’t tracking those positions. Seeing that upward movement is valuable – it validates that our SEO actions are moving the needle, even before the big traffic jumps hit.
Competitive Insights Beyond Page 1: SEO isn’t just about our site; it’s also about who we’re up against. By keeping an eye on the top 100, I can spot competitors lurking in the wings. Maybe a rival brand is climbing fast from page 4 to page 2 for a key term – I want to know that! Comprehensive rank tracking lets me see who ranks in positions 20, 50, or 90 for my target keywords, which can reveal new competitors or industry players I hadn’t considered. These insights help shape our strategy (for example, if I see a competitor’s blog posts populating page 2, maybe we need to up our content game). One specialized rank tracker noted that having the full top-100 data enables users to watch those entrants in the lower spots that might climb upwards, gaining a competitive edge by reacting early.
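As promised above, here’s a small sketch of that striking-distance filter. The keywords and positions are hypothetical stand-ins for whatever your rank tracker exports (CSV, API response, or otherwise):

```python
# Hypothetical rank-tracker export: (keyword, current position) pairs.
rankings = [
    ("enterprise crm software", 14),
    ("crm migration checklist", 17),
    ("best crm for startups", 42),
    ("crm pricing comparison", 8),
]

# Striking distance: ranked on page 2, close enough that a content or link
# push could realistically move the keyword onto page 1.
striking_distance = [(kw, pos) for kw, pos in rankings if 11 <= pos <= 20]

for kw, pos in sorted(striking_distance, key=lambda item: item[1]):
    print(f"#{pos:>2}  {kw}  -> candidate for a refresh or internal links")
```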
In short, page 1 is the goal, but page 2 and beyond are the farm team – that’s where future page-1 rankings often come from. Tracking the whole field ensures I don’t miss out on measuring our progress or spotting threats and opportunities. It contributes to more holistic digital marketing metrics, from calculating realistic share of visibility to guiding content strategy with long-tail keyword insights.
Tools That Kept the Top 100 Alive
With the importance of deep tracking in mind, my next challenge was practical: How do I regain visibility into positions 11–100 now that Google shut the door? My existing rank tracker’s default behavior was limited after the change, and I needed a reliable way to see beyond page 1. Thankfully, I discovered that some SEO platforms adapted quickly – or had already been prepared – to keep delivering full top-100 results. After some research (and a few trials), I identified a few key tools that still show the whole SERP for my keywords:
SE Ranking: This all-in-one SEO platform stood out for responding immediately. When Google removed &num=100, SE Ranking’s team reacted within days to adjust their system. As a result, SE Ranking never stopped showing the full Top 100 results for tracked keywords – it fetches page 2, 3, and so on in the background and compiles the Top 100 positions as before. If my keyword is ranking #85, SE Ranking will still show it, ensuring I don’t suddenly lose that data. They’ve openly touted that while many tools gave up on deep results, they did not, even absorbing the extra cost of those 10× queries so customers wouldn’t lose visibility. This was a breath of fresh air for me, as I could continue to get daily updates on all my rankings 1 through 100 without gaps.
Semrush: As one of the industry-leading SEO suites, Semrush also navigated the change effectively. They didn’t make a huge fuss publicly, but I noticed (and later confirmed) that Semrush’s Position Tracking tool continued to provide data up to the Top 100 results per keyword despite Google’s update. In other words, if I rank #57 or #89 for a term, Semrush still reports that position in its interface. It appears they achieved this by implementing multiple page fetches on the backend. All Semrush plans, even the standard ones, retained deep ranking info – they included Top 100 tracking as part of the service with no extra charge for it. This meant for existing Semrush users, things “just worked,” though the company has been cautious, noting they’ll monitor sustainability if Google further tightens scraping.
Ahrefs: Ahrefs’ rank tracking initially hit a hard limit when &num=100 was killed. For a short time, many Ahrefs users saw only Top 10 results, as their data partner could no longer pull beyond page 1. The Ahrefs team acknowledged the issue and promised a solution – and by early October 2025, they delivered, but with a catch. Full Top 100 tracking in Ahrefs became available only to Enterprise customers at first. They introduced an advanced (resource-intensive) workaround to get deeper results, reserving it for their highest tier while they evaluate expanding it. So as of late 2025, if you’re on a standard Ahrefs plan, you might still be capped at Top 10 or 20 in the interface, whereas Enterprise plan users can again see rankings all the way to 100. This move underscores the increased cost of deep tracking – Ahrefs had to roll it out in a premium way due to the heavy resources involved. They have indicated they’re looking into scaling it for all users, but there’s no timeline on that yet.
These weren’t the only platforms adjusting (shout out to others like Pro Rank Tracker and Advanced Web Ranking, which also found creative solutions), but they were the three that I focused on, since they’re popular and relevant to my work. Knowing that tools like SE Ranking, Semrush, and Ahrefs found ways to still provide Top 100 data gave me confidence that I could piece my tracking regimen back together.
In my case, I decided to give SE Ranking a try for our day-to-day rank tracking, since its affordable plans and immediate adaptation were hard to resist. The first time I ran a report and saw all our keywords from positions 1 through 100 listed just like before, I literally sighed in relief. It felt like getting x-ray glasses for the SERPs – the transparency was back. Now, I use Semrush for a lot of research tasks and Ahrefs for link analysis, but SE Ranking has become my go-to for monitoring rankings because I know it’s diligently capturing everything, page 1 to page 10. This combination ensures that despite Google’s changes, I have the data to keep both my clients and myself informed and confident.
Conclusion: Embracing the New SERP Reality
This whole episode was a whirlwind lesson in the ever-evolving nature of SEO. Google’s &num=100 update reminded me that our data sources can change overnight, and we have to adapt just as quickly. In the end, it wasn’t a story of doom – it was a story of adjusting to a new reality. I learned to stress quality metrics over quantity metrics, to educate others (and myself) on why the numbers looked different, and to lean on the right tools that evolve with the times.
As an SEO specialist, it’s my job to read between the lines (or in this case, between the SERPs). Now, armed with a retooled tracking setup and a clearer understanding of what really matters, I’ve rebuilt my workflows for this new normal. We’re focusing on making page-one content that earns clicks, using deep rankings data smartly for insights rather than vanity, and keeping an eye on the horizon for whatever Google changes next. If nothing else, the &num=100 saga reinforced an old truism: in SEO, change is the only constant – but with knowledge and adaptability, we can not only survive the changes, but come out smarter on the other side.







