Fix Next.js Pages Not Being Found Because of Sitemap, Robots, or Crawlability Issues

Use this page when pages are not being found, indexing is being blocked, or the platform is publishing the wrong sitemap, robots, or discovery signals.

Fix sitemap, robots, and crawl-discovery failures before important Next.js pages are left hidden, blocked, or stale in search.

Typical symptoms

  • Important URLs are not being discovered or refreshed quickly enough.
  • Robots rules or sitemap output do not reflect the intended live estate.
  • Indexing lag persists even when pages appear technically available.

Likely causes

  • Sitemap generation or deployment is not aligned with the live route set (see the sketch after this list).
  • Robots rules are too broad, too narrow, or inconsistent across environments.
  • Crawl discovery is weakened by missing links, stale outputs, or bad URL handling.
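
A common shape for the first cause is a sitemap generated from a different source of truth than the routes themselves. Below is a minimal App Router sketch that builds the sitemap from the same data that drives route generation; `getPublishedSlugs` and the domain are illustrative assumptions, not part of Next.js.

```ts
// app/sitemap.ts — a sketch of a sitemap driven by the same data source
// that generates the routes, so the two cannot silently drift apart.
// `getPublishedSlugs` is a hypothetical CMS helper.
import type { MetadataRoute } from 'next';
import { getPublishedSlugs } from '@/lib/cms';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const slugs = await getPublishedSlugs(); // e.g. ['menu', 'locations/leeds']

  return [
    { url: 'https://www.example.com/', lastModified: new Date() },
    ...slugs.map((slug) => ({
      url: `https://www.example.com/${slug}`,
      lastModified: new Date(),
    })),
  ];
}
```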

What I look at first

  • Quick check: compare the live route set with the sitemap, robots rules, and internal-link outputs the site is currently publishing (a rough comparison script follows this list).
  • Whether dynamic or revalidated pages are making it into crawlable outputs.
  • How the live route set compares with what the site advertises to crawlers.
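
As a hedged illustration of that quick check, the script below fetches a live sitemap and diffs its URLs against a hand-maintained list of expected routes. In a real audit the expected list would come from the route manifest or CMS rather than being inlined.

```ts
// check-sitemap.ts — a rough sketch of the quick check above (Node 18+,
// which provides a global fetch). The expected route list is illustrative.
const expected = ['/', '/menu', '/locations'];

async function main(): Promise<void> {
  const xml = await (await fetch('https://www.example.com/sitemap.xml')).text();

  // Pull every <loc> entry out of the sitemap and keep just the pathname.
  const inSitemap = new Set(
    [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => new URL(m[1]).pathname),
  );

  const missing = expected.filter((path) => !inSitemap.has(path));
  const extra = [...inSitemap].filter((path) => !expected.includes(path));

  console.log('Expected but missing from sitemap:', missing);
  console.log('In sitemap but not expected:', extra);
}

main().catch(console.error);
```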

How I help fix this

  • Trace the discovery problem to the right combination of outputs and rules.
  • Tighten sitemap and robots handling around the real live routes.
  • Support rollout of fixes without introducing new crawl ambiguity (one rollout pattern is sketched below).
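
One concrete rollout concern: if a fix changes URLs, leaving the old paths crawlable creates exactly the ambiguity you are trying to remove. A minimal sketch, assuming a Next.js version that supports a TypeScript config; the paths are illustrative.

```ts
// next.config.ts — a sketch of rollout hygiene: when URLs change as part of
// a fix, permanently redirect the old paths rather than leaving two
// crawlable variants live.
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  async redirects() {
    return [
      {
        source: '/old-menu/:slug',
        destination: '/menu/:slug',
        permanent: true, // issues a 308, telling crawlers the move is permanent
      },
    ];
  },
};

export default nextConfig;
```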

When to bring me in

  • Bring me in when indexing lag looks more like a discovery problem than a content problem.
  • Bring me in when sitemap or robots changes need to be tied back to real route generation and deployment behaviour.

Related project experience

  1. Nando’s

    Senior software engineer on the UK and Ireland replatform, migrating Nando’s customer-facing websites from legacy Drupal to a unified headless platform built with Next.js and Storyblok, with a focus on performance, accessibility, and SEO.

Related services

  1. Capability

    Technical SEO for JavaScript Applications

    Bring in engineering-led SEO help when Google is not indexing important JavaScript pages because rendering, crawlability, metadata, or migration changes are getting in the way.

  2. Adjacent scenario

    JavaScript SEO Rendering and Indexing Fix

    Diagnose why Google is not indexing important JavaScript pages before incomplete HTML, unstable metadata, or routing changes keep them out of search.

Questions teams usually ask

Is this only about stale sitemaps?
No. A stale sitemap can be one symptom, but the wider problem is usually crawl discovery: robots rules, internal links, environment leakage, or route generation that no longer matches the live estate.
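
On the environment-leakage point, here is a minimal sketch of environment-aware robots rules. It assumes a Vercel-style VERCEL_ENV variable; swap in whatever signal your platform exposes.

```ts
// app/robots.ts — a sketch of robots rules that differ by environment, so
// preview and staging deployments never advertise themselves to crawlers.
// The environment variable name is an assumption, not a Next.js requirement.
import type { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
  const isProduction = process.env.VERCEL_ENV === 'production';

  if (!isProduction) {
    // Non-production deployments: block everything.
    return { rules: { userAgent: '*', disallow: '/' } };
  }

  return {
    rules: { userAgent: '*', allow: '/', disallow: '/api/' },
    sitemap: 'https://www.example.com/sitemap.xml',
  };
}
```
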
Can this overlap with ISR or revalidation bugs?
Yes. If the platform relies on generated outputs staying current, crawlability issues can overlap directly with revalidation and deployment behaviour.
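
As a hedged illustration of that overlap: a publish webhook can revalidate the page and the generated sitemap together, so crawlers never see one updated without the other. The secret header, payload shape, and route path are assumptions for the sketch.

```ts
// app/api/revalidate/route.ts — a sketch of a CMS publish webhook that
// refreshes both outputs in one step.
import { revalidatePath } from 'next/cache';
import { NextResponse, type NextRequest } from 'next/server';

export async function POST(request: NextRequest): Promise<NextResponse> {
  // Reject calls without the shared secret (header name is an assumption).
  if (request.headers.get('x-webhook-secret') !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ revalidated: false }, { status: 401 });
  }

  const { slug } = (await request.json()) as { slug: string }; // e.g. 'menu'

  revalidatePath(`/${slug}`);     // refresh the page itself
  revalidatePath('/sitemap.xml'); // refresh the generated sitemap alongside it

  return NextResponse.json({ revalidated: true });
}
```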

Tell me what you're seeing

Send me the affected page or route, point me at the code if that helps, and tell me what you expected to happen versus what is happening now. If this connects to a Next.js migration, technical SEO drop, performance issue, launch, or platform move, include that context too. I'll come back with the clearest next step.
