Site hierarchy is the parent-child structure of your website — which pages sit above which, and how many clicks separate any page from the homepage. It is the single most important structural decision you make, because it determines how much crawl priority Google gives your pages and how easily customers can reach your commercial content.
Every additional click between your homepage and a product or service page has a cost. Visitors abandon the path. Google crawls less frequently. Link equity dilutes. The result is pages that are harder to find both in navigation and in search results — and pages that are harder to find are pages that do not sell.
This spoke is part of the website structure and architecture guide, which covers all aspects of how your site is organised and why architecture is the foundation of SEO performance.
Crawl depth and revenue are connected through two mechanisms: Google's crawl behaviour and visitor drop-off.
Google's crawl behaviour. Googlebot allocates a finite amount of crawl time to your site based on how popular and frequently updated your content is and how quickly your server responds. It follows links to discover pages, starting from your homepage. The deeper a page sits in your hierarchy, the longer it takes Googlebot to reach it — and the less crawl budget it receives. Pages at depth 1 or 2 are crawled frequently, often daily. Pages at depth 4 or 5 may be crawled weekly at best. If you update a product description, change a price, or add new content to a deep page, those changes may take weeks to appear in search results. For a business running promotions or managing stock, that lag has a direct commercial cost.
Visitor drop-off. Users are not infinitely patient. Every click a visitor must make to find what they want is an opportunity for them to leave. Studies consistently show that conversion rates decrease with increasing navigation depth. A visitor who finds a product in two clicks is significantly more likely to purchase than one who needs five clicks. This is not a marginal effect — depth is one of the most controllable levers affecting your site's conversion rate.
The distinction between flat and deep architecture is not about the total number of pages — it is about how those pages are organised relative to the homepage.
| Architecture Type | Depth | Example Path | Effect |
|---|---|---|---|
| Flat (ideal) | 2-3 clicks | Homepage > Category > Product | Frequent crawling, high link equity |
| Moderate | 3-4 clicks | Homepage > Section > Category > Product | Acceptable for most commercial pages |
| Deep (problematic) | 5+ clicks | Homepage > Section > Category > Sub-cat > Filter > Product | Infrequent crawling, diluted authority |
| Orphan | N/A — no path | Page exists but nothing links to it | May never be found by Google or visitors |
A flat architecture is achievable even for large sites. A retailer with 10,000 products does not need to list every product on the homepage — it needs to ensure that every product is reachable via Homepage > Category > Product, keeping maximum depth at 2. The categories do the organisational work without adding excessive depth.
A deep architecture typically develops organically as businesses add more pages without strategic planning. A site that starts flat can become deep over years of growth if new categories are added as sub-subcategories rather than reorganising existing structure.
Before making structural changes, measure what you have. Screaming Frog SEO Spider is the standard tool for this and is free for sites up to 500 URLs.
Step 1: Download and install Screaming Frog.
Step 2: Enter your homepage URL and click Start. Let the crawl complete.
Step 3: Go to the Internal tab. In the filter dropdown, select HTML.
Step 4: Look at the Crawl Depth column. This shows the click distance from the homepage to each page.
Step 5: Sort by Crawl Depth (descending). Any page at depth 4 or higher should be reviewed. Ask: is this page commercially important? If yes, it needs to be brought closer to the homepage through improved internal linking or structural reorganisation.
Step 6: Check the Inlinks column. Pages with very few inlinks (1-2) are structurally weak — they are hard for Google to find because few pages link to them. Pages with zero inlinks are orphan pages.
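What the Crawl Depth and Inlinks columns measure can be sketched as a breadth-first search over your site's internal link graph. The following Python sketch uses a hypothetical site — the paths and page names are illustrative, not from any real crawl — to show how depth and orphan pages fall out of the same data:

```python
from collections import deque

def crawl_depths(link_graph, homepage):
    """Breadth-first search from the homepage: depth = clicks to reach a page."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: two top-level categories, one product buried under
# nested subcategories, and one page no other page links to.
site = {
    "/": ["/shoes", "/bags"],
    "/shoes": ["/shoes/running"],
    "/shoes/running": ["/shoes/running/trail"],
    "/shoes/running/trail": ["/product/trail-x"],
    "/bags": [],
}
depths = crawl_depths(site, "/")
print(depths["/product/trail-x"])  # 4 — depth 4+, so flagged for review in Step 5

# Orphan check (Step 6): pages known from the sitemap but unreachable by links.
all_pages = set(site) | {"/product/trail-x", "/product/orphan"}
orphans = all_pages - set(depths)
print(orphans)  # the orphan page, which receives no internal link equity
```

Screaming Frog does exactly this discovery work for you at scale; the sketch is only to make the depth and inlink numbers concrete.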
Flattening a deep structure does not always require changing URLs or rebuilding your navigation. The most practical approach is strategic internal linking — adding links from shallow pages to deep ones, reducing their effective crawl depth without restructuring the site.
Add important pages to navigation. If a key product category is buried three levels deep, adding it to your main navigation immediately reduces its effective depth to 1. Every page on your site links to your navigation, so navigation links are the most powerful depth-reduction mechanism available.
Add contextual links from blog and content pages. A blog post that naturally mentions a product or service can link directly to that product's page, bypassing category layers. A well-structured blog with 50 posts, each linking to relevant product pages, can effectively bring those products to depth 3 even if their URL structure puts them at depth 5.
Add featured sections to category pages. A category page that features top-selling or newly arrived products with direct links brings those products to depth 2 from the homepage, even if they also appear deeper in the hierarchy.
Consolidate overcrowded sub-categories. If you have sub-subcategories with only 2-3 products, merge them with the parent category. This reduces depth and creates stronger category pages with more products.
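The effect of these tactics can be shown with the same breadth-first logic. In this hypothetical sketch (page paths are invented for illustration), adding one link from the homepage collapses a product's effective depth from 4 to 1 without touching its URL:

```python
from collections import deque

def depth_of(link_graph, homepage, target):
    """Clicks from the homepage to `target` via internal links, or None if unreachable."""
    depths, queue = {homepage: 0}, deque([homepage])
    while queue:
        page = queue.popleft()
        if page == target:
            return depths[page]
        for nxt in link_graph.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return None

site = {
    "/": ["/mens"],
    "/mens": ["/mens/shoes"],
    "/mens/shoes": ["/mens/shoes/running"],
    "/mens/shoes/running": ["/product/runner-5"],
}
print(depth_of(site, "/", "/product/runner-5"))  # 4 — deep, infrequently crawled

# Feature the product in a homepage 'Popular Products' section:
site["/"].append("/product/runner-5")
print(depth_of(site, "/", "/product/runner-5"))  # 1 — URL unchanged, depth collapsed
```

This is why navigation and featured-product links are so powerful: crawl depth is a property of the link graph, not of the URL structure.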
For the navigation side of depth reduction, the guide to website navigation design covers how to structure menus to serve both users and Google. For the internal linking side, the guide to internal linking strategy provides a systematic approach to distributing link equity through contextual links.
Crawl depth is the main lever for improving crawl budget allocation, but it is worth understanding what Google's crawl data actually shows about your site. Google Search Console provides crawl statistics that reveal how frequently different pages are being crawled, how much crawl activity your site receives, and whether Googlebot is encountering errors during crawls.
The guide to Googlebot crawl statistics in Search Console explains how to read this data and how to use it to prioritise structural improvements. If your crawl data shows that important pages are being crawled infrequently, that is confirmation that depth or internal linking issues need addressing.
Check your three most important commercial pages. Starting from your homepage, navigate to your most important product, service, or category page. Count the clicks. If it is more than three, note the path and identify where depth is added unnecessarily.
Run a Screaming Frog crawl. Even a free crawl of up to 500 URLs will give you a crawl depth report and inlinks count. Export the results and sort by depth to identify your deepest pages.
Check for orphan pages. In Screaming Frog, filter for pages with 0 inlinks. These are pages that no other page on your site links to — they may be indexed from your sitemap, but they receive no internal link equity and are invisible to visitors navigating your site.
Add one internal link from a high-traffic page to a deep commercial page. Even a single added internal link from a page with strong authority can meaningfully improve a deep page's crawl frequency and link equity.
For a full structural audit of your site, including crawl depth analysis and recommendations, see the SEO optimisation services. To discuss your specific situation, get in touch.
Crawl depth is the number of clicks required to reach a page from the homepage. A page linked directly from the homepage has a crawl depth of 1. A page linked from a category page has a depth of 2. Google prioritises pages with lower crawl depth — they are crawled more frequently and typically receive more link equity. Pages at depth 5+ are often crawled infrequently and rank poorly.
A flat architecture means most pages are accessible within 2-3 clicks from the homepage. Instead of deeply nested categories and subcategories, pages are organised in broad, shallow groups. This does not mean every page links from the homepage — it means the path from homepage to any important page is short. For a 500-page ecommerce site, a flat architecture might use 10-15 top-level categories with products 2-3 clicks deep.
Screaming Frog (free for up to 500 URLs) provides a crawl depth report. Crawl your website, then check the Crawl Depth column in the Internal tab. This shows every page and its click distance from the homepage. Pages at depth 4+ should be reviewed — if they are commercially important, restructure your internal linking to bring them closer to the homepage.
Add direct links from higher-level pages to important deep pages. This might mean adding popular products to category pages, adding internal links from blog posts to deep product pages, or adding a 'Popular Products' section to your homepage. You do not need to restructure your entire URL hierarchy — adding internal links from shallow pages to deep ones reduces effective crawl depth.
Yes. Pages closer to the homepage tend to rank better because they receive more internal link equity and are crawled more frequently. Google interprets pages closer to the homepage as more important. This is not speculation — Google's own documentation states that internal links help Google discover and understand the relative importance of pages on your site.
Breadcrumbs show visitors their position within your site hierarchy — Home > Category > Subcategory > Product. They reduce bounce rate by giving visitors an easy way to browse related products instead of leaving. They also improve how your pages appear in Google search results.
How you categorise your products determines how easily customers find them and how well Google indexes them. Too many categories create thin pages that dilute your SEO authority. Too few categories bury products in overcrowded listings. The right structure balances findability with depth.
Clean URLs are both a trust signal for customers and a ranking signal for Google. A customer who sees yoursite.com/blue-running-shoes knows what to expect. A customer who sees yoursite.com/product?id=4829&cat=12 does not. URL structure affects clicks, trust, and rankings.