Google SERPs Capped at 10 Results: What Changed, What Broke, and How to Adapt

Sep 24, 2025 | SEO News


What changed, in plain terms:

  • Google still shows many results for a query. Page 2, 3, 4 still exist.
  • Each results page now shows 10 standard organic listings (the classic blue links). To see result #11 and beyond, you scroll or go to the next page.
  • For years, SEOs could add &num=100 to load 100 results on one page. In mid-September 2025, Google removed that option.
  • Ads and rich features (snippets, People Also Ask, map results) are separate from those 10 organic listings and can appear above or between them.

Why this matters:
With Google SERPs capped at 10 results, rank trackers that pulled the top 100 in one load must fetch 10 pages to see the same depth. That raised cost and cut how deep or how often tools can track. Many sites then saw impressions drop, average position improve, and clicks stay flat in Google Search Console. This is a measurement shift caused by losing easy visibility into deeper ranks, not a sudden loss of traffic.


What Changed and When?

In mid-September 2025, Google removed the &num=100 setting that let people load 100 organic results on a single results page. Google still shows more than ten results for a query, but each page now contains only 10 standard organic listings. To see result #11 and beyond, you scroll or move to the next page. Ads and rich features—like Top Stories, videos, images, People Also Ask, and the Map Pack—can still appear above or between those ten listings.

This change rolled out across desktop and mobile. Within days, SEOs noticed that positions beyond the top ten were harder to check using SEO tools and that updates for deeper ranks became uneven. Many dashboards also showed a pattern in Google Search Console: impressions dropped, average position looked better, and clicks stayed about the same. That pattern reflects a shift in what can be measured at depth, not a sudden loss of traffic.

Why Rank Tracking Broke

Rank trackers were built around a simple method: add &num=100 to the search URL and collect the top 100 results in one load. With Google SERPs capped at 10 results, that shortcut is gone. To see the same depth, a tool now has to load ten separate pages for each keyword, location, and device. One request became ten. At small scale this is fine; at the scale of millions of keywords, cost, speed, and reliability take a hit.

More requests mean more chances to hit rate limits and CAPTCHAs. Vendors must choose between tracking fewer keywords, checking less often, or only showing the top 10–20. The result is thinner data beyond page one. Early wins in positions 20–50 are harder to spot. Cannibalization is harder to confirm without that mid-pack view. Trend lines look jumpy because updates cover fewer ranks and arrive less often. Nothing in the core of SEO changed, but the tools that show movement past the first page now face tougher limits.
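To make the cost shift concrete, here is a minimal Python sketch. The `q` and `start` query parameters are Google's real pagination parameters; the tracker shape, query, and numbers are illustrative assumptions, not any vendor's actual code.

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def serp_page_urls(query: str, depth: int = 100, page_size: int = 10):
    """Build the URLs a rank tracker must now fetch to cover `depth` positions.

    Before mid-September 2025, a single request with &num=100 covered the
    same depth; with the cap, each page of 10 results needs its own request.
    """
    return [
        f"{BASE}?{urlencode({'q': query, 'start': start})}"
        for start in range(0, depth, page_size)
    ]

# One keyword, one locale, one device: 10 requests where 1 used to do.
urls = serp_page_urls("plumber near me")
print(len(urls))   # 10
print(urls[1])     # ...start=10 covers results 11-20
```

Multiply that by keywords, locations, and devices, and the tenfold request count is where the rate limits and CAPTCHAs come from.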

Data Shifts in GSC and Dashboards

After Google capped SERPs at 10 results per page, many teams opened Google Search Console to a strange mix of signals: impressions fell, average position looked better, and clicks barely moved. This is not a traffic crash. It is a measurement shift.

Here’s why it happens. When tools and systems lose easy access to positions 11–100, they record fewer views of your URLs at depth. Those lost views pull impressions down. With fewer deep impressions counted, the math that drives “average position” leans toward your higher spots, so the reported average improves. Clicks, however, come mostly from page one. Since your page-one performance did not change overnight, clicks stay flat.
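A toy calculation makes the pattern clear. The rank rows below are made up purely for illustration, and each row is simplified to one impression.

```python
# Toy model: the same URL's recorded rank rows before and after the cap.
# Rows are (position, clicks); numbers are illustrative only.
before = [(3, 120), (14, 0), (38, 0), (72, 0)]   # deep ranks still recorded
after  = [(3, 120)]                               # deep ranks no longer seen

def summarize(rows):
    impressions = len(rows)   # one impression per recorded row, for simplicity
    avg_pos = sum(pos for pos, _ in rows) / impressions
    clicks = sum(c for _, c in rows)
    return impressions, avg_pos, clicks

print(summarize(before))  # (4, 31.75, 120)
print(summarize(after))   # (1, 3.0, 120)
```

Impressions fall from 4 to 1, average position "improves" from 31.75 to 3.0, and clicks do not move, which is exactly the GSC pattern teams reported.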

To report cleanly, set a clear line in your dashboards at mid-September 2025. Treat the weeks before and after as two eras. Compare like with like inside each era, not across the line. For KPIs, lean more on clicks, CTR, and conversions. Use position trends with care, and use them mainly for page-one targets.
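The era split above can be sketched in a few lines of Python. The cutoff date, field layout, and rows here are placeholders; adjust them to your own GSC export.

```python
from datetime import date

CUTOFF = date(2025, 9, 14)  # approximate rollout date; tune to your data

# Hypothetical GSC export rows: (date, clicks, impressions)
rows = [
    (date(2025, 9, 1), 40, 900),
    (date(2025, 9, 10), 42, 880),
    (date(2025, 9, 20), 41, 510),
    (date(2025, 9, 28), 43, 495),
]

pre  = [r for r in rows if r[0] < CUTOFF]
post = [r for r in rows if r[0] >= CUTOFF]

# Compare like with like: report each era against its own baseline.
for label, era in (("pre-cap", pre), ("post-cap", post)):
    clicks = sum(c for _, c, _ in era)
    impressions = sum(i for _, _, i in era)
    print(label, clicks, impressions, round(clicks / impressions, 3))
```

Note how impressions roughly halve across the cutoff while clicks hold steady, so CTR jumps for measurement reasons alone.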

Expect more noise in long-tail tracking. Early movement in the 20–50 range will surface less often, so week-to-week lines may wobble. Pad reviews with context: note rich results, snippets, and packs that can steal attention even when a URL “ranks.” In short, read rank with the layout in mind, and judge success by the actions that matter.

Why SEO Just Got Harder

Google still shows results past page one. Users can scroll or click to page 2, 3, and beyond. The change is about how SEO tools collect data. With Google SERPs capped at 10 results per page, tools can no longer pull the top 100 results in one load using &num=100. To get 100 positions, they must fetch ten pages instead of one.

That shift makes tracking deeper ranks slower, costlier, and less complete. Many tools will reduce depth, reduce frequency, or both. The loss hits planning more than users: early movement in positions 20–50, cannibal pages, and long-tail finds were easy to spot when a tool grabbed 100 results at once. Now those signals arrive less often, or not at all, unless vendors invest in far more requests.

For teams, this raises the bar. With thinner depth from SEO tools, you have fewer clues to guide which pages to back. Page-one targets matter most, and proof of movement relies more on clicks, CTR, and conversions than on position data past the top ten. Core SEO work—intent fit, topic depth, internal links, and links from trusted sites—still wins. But without dense mid-pack data from SEO tools, prioritization and reporting get harder.

Local Search Impact and Edge Cases

Local users still see more than ten results. The squeeze is in the data that SEO tools can collect. When SEO tools could pull 100 results at once, local teams watched positions 20–50 to see if city pages were climbing, to spot overlaps between nearby locations, and to time refresh work. With Google SERPs capped at 10 results per page, SEO tools must load ten pages to see the same depth. Many will not do that for every query, device, and location, so deeper local data turns patchy.

This affects how you measure, not how users search. Ads, the 3-Pack, and rich blocks still shape the page-one view. If a service page sits at #12, users can still reach it—yet SEO tools may not show that rank on every check. Plan for gaps. Use Google Business Profile data, call tracking, and UTM links to read demand when SEO tools miss deeper ranks. Track Local Finder activity, reviews, and actions on profiles to fill the hole left by thinner rank rows.
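For the UTM piece, a small helper can tag profile and citation links consistently. The `utm_*` parameters are the standard analytics tagging convention; `add_utm` and the example values are hypothetical, not part of any GBP API.

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters so profile and listing clicks stay
    attributable in analytics even when rank data for the page is patchy."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

link = add_utm("https://example.com/plumbing-austin",
               source="google", medium="gbp", campaign="local-profile")
print(link)
```

Tagged links let you read demand for a location page directly from analytics sessions when the rank rows for it go missing.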

What to focus on while SEO tools adapt:

  • Strong Google Business Profile: right categories, services, photos, Q&A, and steady reviews.

  • Clear local entities: unique location pages, consistent NAP, and schema that ties each location to the brand.

  • On-page clarity: service + city in titles, H1s, and copy, with proof of work in that area.

In short, users can still go past page one. The pain sits with SEO tools that lost the one-load, 100-result view. Local teams should expect less depth from SEO tools and rely more on page-one gains, profile metrics, and on-site outcomes.

E-commerce and Content Sites Under the Cap

Users can still reach page 2 and beyond. The shift limits the data that SEO tools deliver, not the number of results users can view. For e-commerce and content sites, that loss of easy depth affects how you spot winners and fix gaps. When Google capped SERPs at 10 results per page, SEO tools lost the one-load, 100-result pull. To see positions 11–100, SEO tools must crawl ten pages. Many won’t do that for every keyword at the same pace, so mid-pack insight thins out.

For e-commerce, this muddies category vs. product decisions. Movement at positions 20–50 often showed when a category page was close to overtaking a product page, or when filters and facets caused cannibalization. With less mid-pack data from SEO tools, those signals arrive late, and weak PDPs or duplicate facets linger longer.

For publishers and SaaS blogs, the same loss slows content tuning. The climb from position 35 to 18 used to flag a post worth a refresh or a link push. Without steady depth from SEO tools, you get fewer of those “almost there” alerts. Teams risk over-publishing new posts while near-winners stall just outside page one.

The takeaway is not that users can’t find deeper results; they can. The takeaway is that SEO tools now deliver less frequent and less complete views of those deeper ranks. E-commerce and content teams will need stronger on-site signals—conversions, assisted revenue, scroll, and engagement—to choose which pages to back when SEO tools don’t surface the mid-pack climbs.

AI, Assistants, and Visibility Loops

Users can still browse past the first ten results. The shift limits how SEO tools and AI services collect SERP data at scale. When SEO tools lost the one-load, 100-result pull, it also became harder for third-party SERP APIs and AI assistants to scan long SERP lists in one request. In practice, many assistants now lean even more on sources that sit in the top ten because fetching deeper results costs more and fails more often.

That creates a loop. If AI answers, summaries, and “overview” panels prefer page-one sources, brands in the top ten gain more reach beyond Google itself. The opposite is also true: if your page hovers at #18, users can still find it, but SEO tools and AI layers may cite it less because they see it less. This isn’t a ranking change—it’s a data access change—but it nudges attention toward page-one entities.

For teams, that means two things. First, invest in page-one intent fit and clear entity signals so your pages are strong candidates for snippets and citations. Second, track assistant and aggregator mentions where you can, while accepting that SEO tools will surface fewer deep-rank touchpoints than before.

Measure What Matters in a 10-Result World

The user view did not shrink. The data stream from SEO tools did. With Google SERPs capped at 10 results, SEO tools give thinner depth beyond page one, so shift your tracking mix.

Start with actions, not ranks. Clicks, CTR, sign-ups, leads, and revenue tell you if a page works. Use rank as a support signal for page-one terms, not as a proxy for success across the board.

Read layout, not position alone. A “#3” below a snippet, PAA, and a video block can sit far down the screen. Ask SEO tools for pixel depth or feature flags where they exist, and add manual checks for key terms.

Split reporting by era. Mark mid-September 2025 as a line in the sand. Compare pre-change to pre-change, post-change to post-change. Do not stack them.

Segment your queries. Branded, informational, commercial, and local terms behave in different ways. If SEO tools lack depth for a segment, plug gaps with site data, GBP insights, and logs.

Watch cohorts, not single points. Track a set of pages tied to one topic hub. If SEO tools miss a few deep ranks, the hub trend still shows if the cluster moves.

When depth is missing, use proxies. Track SERP features won, snippet holds, sitelinks, and internal link clicks to hub pages. These signals help you pick winners when SEO tools cannot show steady ranks past the top ten.

Playbook: The Next 90 Days

  1. Set a clean line in your reporting. Mark mid-September 2025 in every dashboard. Treat earlier data as a different era. Update client notes so the team knows why impressions fell while clicks held. Where SEO tools lack depth, lean on clicks, CTR, leads, and revenue.
  2. Tighten your page-one bets. Pick core queries and pages that can reach the first screen. Refresh titles, intros, and meta so intent is clear. Add missing sections that raise information gain. Trim fluff. When two pages chase the same term, merge the weaker one and redirect. Without steady mid-pack ranks from SEO tools, waste hurts more.
  3. Strengthen topic hubs. Build a cluster that covers the subject with depth, not a pile of near-duplicates. Use clear internal links from posts to the hub and back. Use anchors that say what the target covers. This helps users and helps Google choose the right page out of your ten, not the wrong one.
  4. Fix crawl paths and page roles. Map the route from your home page to each hub and key page. Cut orphan paths. Set one page to lead for each main query. With Google SERPs capped at 10 results, mixed signals slow gains, and SEO tools will not flag cannibal pairs as fast as before.
  5. Add the right markup. Use schema for articles, products, FAQs, events, and local business. Make your org, authors, and locations clear. Mark up ratings and offers where they fit the rules. Rich results can lift CTR even when rank holds steady.
  6. Raise proof of trust. Show sources, author bios, dates, and update logs on content that guides money or health. Add real examples, data, and screenshots. When SEO tools cannot surface the slow climb from position 30 to 18, strong proof helps push a page into the top group faster.
  7. Close link gaps that matter. Compare the top ten for your key terms. Find the links that tie to your topic, not any link at scale. Earn a few that fit. Link from your own related pages with anchors that match the target’s role.
  8. Rebuild tracking where you can. Ask your SEO tools for top-20 pulls on priority terms, even if the full top-100 is gone. Add manual checks for a short list of head terms. Use GBP insights, call tracking, UTMs, log files, and on-site search to fill holes left by thin rank rows.
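Step 8 can be budgeted with simple arithmetic. The tiers, keyword counts, and depths below are hypothetical; the point is how depth choices translate into request volume once each page of 10 results costs its own request.

```python
# Hypothetical tiering: spend the request budget where depth still pays off.
PAGE_SIZE = 10

tiers = {
    "head terms": {"keywords": 50,   "depth": 20},  # also manual-check these
    "priority":   {"keywords": 500,  "depth": 20},
    "long tail":  {"keywords": 5000, "depth": 10},  # page one only
}

def daily_requests(tiers):
    """Requests per day: one request per page of results, per keyword."""
    return sum(t["keywords"] * (t["depth"] // PAGE_SIZE) for t in tiers.values())

# Versus the old world: every keyword to depth 100 in one &num=100 request.
old = sum(t["keywords"] for t in tiers.values())
print(daily_requests(tiers), "requests/day now vs", old, "before")
```

Even after cutting tracked depth to 10–20 positions, this sketch spends more requests than the old setup did while covering a fraction of the depth, which is why top-20 pulls on priority terms plus manual head-term checks is a defensible compromise.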
