Best Practices for Faceted Navigation & SEO: eCommerce Facets, Filters, and Sort Order

Updated on: August 30, 2022
Tory Gray

Faceted navigation, as well as filtering and sorting, are present on most eCommerce website category pages to help users explore and narrow down to the products they are most interested in. They present both organic search opportunities and issues, so here we'll cover what those issues are, and how to avoid them - while also maximizing the ROI for your eComm store.


Some eCommerce navigation definitions:

Sort order helps users navigate within a set of products by ordering them in different ways - “price low to high” vs. “price high to low”, or "featured" are all classic examples.

Filters & Facets help narrow down options within a set of products. If you are viewing sports gear, and you filter to "basketball", you'll only see basketball gear; gear from other sports (football, baseball, etc.) will disappear.

Learn more about the difference between facets and filters; in general, most eCommerce sites utilize all three features - sorting, filtering, and faceting - and for SEO purposes, they can be treated in much the same way.

What do Faceted URLs look like?

Faceting, filtering, and sorting are typically implemented as URL parameters, but they're possible to implement via the core URL - or purely via JavaScript/Ajax too (depending on your platform.) Learn more about the various parts of URLs.

Examples of new URLs - actual unique URLs! - with facets, filters, and sort order implemented with parameters:
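URL structures vary by platform, but hypothetical parameter-based examples might look like this (the domain and parameter names are invented for illustration):

```
https://example-store.com/dining-tables?shape=square                  (facet)
https://example-store.com/dining-tables?shape=square&style=modern     (facet + filter)
https://example-store.com/dining-tables?sort=price-low-to-high        (sort order)
```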


When these features are implemented as core URLs, they may look like this:
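Hypothetical path-based equivalents (again, invented for illustration):

```
https://example-store.com/dining-tables/square/
https://example-store.com/dining-tables/square/modern/
```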


Note that this structure may also be a category page; as an outside user you can't really tell without digging "under the hood."

A pure JavaScript implementation won't change the URL at all.

Faceted Navigation: Good for SEO?

Like pretty much everything else in SEO, the short answer is: it depends! They can get tricky primarily in terms of the following.

[Image: "What is the best SEO Strategy?" SEOs: 1) It depends. 2) It depends, but in orange.]
It really depends!

Duplicate Content

How might filtered URL variations compete with each other, with category pages, or with individual product pages? Duplication often arises because filter parameters are appended to the URL in the order the user clicks them during their session; the same set of filters applied in a different order creates a different URL with identical content.

Let’s use a dining table example: say you have a filter for table shape (round vs. square vs. rectangle) and one for style (farmhouse vs. traditional vs. modern).

Is /dining-tables?shape=square&style=modern fundamentally any different from /dining-tables?style=modern&shape=square?

Nope. Same user intention, same product results. Enter: duplication issues!
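The equivalence is easy to demonstrate: the two orderings are distinct URLs, but they parse to exactly the same filter set. A minimal sketch (URLs are illustrative):

```python
from urllib.parse import urlparse, parse_qs

# Two user click orders produce two distinct URLs...
url_a = "https://example-store.com/dining-tables?shape=square&style=modern"
url_b = "https://example-store.com/dining-tables?style=modern&shape=square"

# ...but they describe exactly the same filter set.
filters_a = parse_qs(urlparse(url_a).query)
filters_b = parse_qs(urlparse(url_b).query)

assert url_a != url_b
assert filters_a == filters_b  # {'shape': ['square'], 'style': ['modern']}
```

Search engines see two URLs; users see one result set. That mismatch is the duplication problem in a nutshell.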

Thin/Low-Quality Web Pages

What if there aren’t enough products within a set of filters to make it a legitimately useful page? Or what if you run out of stock, or decide to remove a product, and suddenly need a strategy for handling those pages (or 404ing them)?

You wouldn't want a category page with only 1 or 2 products in it; similarly, you don't want to waste a search engine's time crawling & indexing a URL variation with this little inventory.

Diluted Link Equity

This is a result of the thin & duplicate content issue; if/when you have a bunch of low-value pages on your website, there's less link equity to go around for the pages that do matter.

Think of this as a bank account. Money comes into the bank (let's call this your homepage), and it's distributed to each page, starting with the ones linked in your global navigation and finally reaching all the individual blog posts and products that you've linked to on, say, page 37 of Home Goods products.  

Each page will get less money the more pages you have. If those are high-value pages driving revenue - that's awesome! If it's a low-value page, that's taking money away from the high-value pages (which could be providing *more* value if they didn't have to share so much.)

Don't overthink this analogy and arbitrarily remove a bunch of pages from your website! Rather, be thoughtful that each page you make available to search engines provides strong, unique value for your customers.

(Best tools for duplicate/thin content AND link equity issues: retiring non-valuable pages, robots.txt disallow, meta robots noindex, canonical tags, and the Google Search Console (GSC) Remove URL tool.)

Wasted Crawl Budget

Infinite URL variation combinations (filtered and sorted and faceted pages have SO MANY different possible combinations!) can waste a lot of search engine bot crawling time; if you have a large website where crawl budget is a concern, faceted navigation can easily exacerbate that issue.

However, implemented correctly, facets & filters can dynamically create many useful landing pages for SEO.

(Best - only! - tools for Crawl Budget issues: disallowing crawling in the robots.txt file & nofollow internal links.)

How to Make Faceted Navigation Useful for Site Visitors, Search Engines

Going back to the table analogy from the Duplicate Content section above, do people search for “square dining tables”? Sure do. How about “modern dining tables”? You betcha.

Okay then: how about “modern square dining tables”? Yup - 20 searches a month, according to Ahrefs (curiously, “modern square dining tables for 8” gets 50 searches a month!)

The general takeaway here is that you should plan your navigation categories, facets, filters, and sort orders according to:

  1. What people/searchers look for (e.g. search query/keyword data)
  2. How many products you have within a category (e.g. don't add filters for features where you don't have many product offerings.)
  3. How users want to sort through and find products. These can be the same things as "what people look for", but they aren't always. For example, how often do you search for sneakers between $30-$50? Not too frequently. But that doesn't mean you don't want to filter by price to fit your budget! So make sure you think through and provide a positive user experience with these unique URLs without sacrificing your SEO.

Focus on allowing crawling and indexing for those variations and combinations for which users search the most - and solving the problems outlined above using the right tool for the job (more on this below.)

Use 3rd party keyword research tools to determine what combinations are most popular, and let the keyword opportunity data guide you into what level of depth is worth it for the extra work to build this functionality for your business.

Typically this means allowing for crawling & indexing of 1 - 2 levels of parameter (or URL) nesting. Going 3+ layers deep can get pretty complicated (exponentially so!), so we generally don’t recommend it, but if you do, be sure to work with a senior SEO and software engineer to make sure you do it right.
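One hedged way to enforce a depth limit like this server-side is to count active filter parameters and emit crawl directives accordingly. A minimal sketch, assuming invented parameter names and a depth limit of two:

```python
from urllib.parse import urlparse, parse_qs

# Parameters treated as facets/filters; sort never counts toward
# indexable depth (per the sort-order recommendation below).
# All names here are illustrative, not from any real platform.
FILTER_PARAMS = {"shape", "style", "color", "material"}
MAX_INDEXABLE_DEPTH = 2

def robots_directive(url: str) -> str:
    """Return a meta-robots value based on filter nesting depth."""
    params = parse_qs(urlparse(url).query)
    depth = sum(1 for p in params if p in FILTER_PARAMS)
    if "sort" in params or depth > MAX_INDEXABLE_DEPTH:
        return "noindex,nofollow"
    return "index,follow"

print(robots_directive("/dining-tables?shape=square&style=modern"))
# → index,follow (2 filters: within the limit)
print(robots_directive("/dining-tables?shape=square&style=modern&color=oak"))
# → noindex,nofollow (3 filters: too deep)
```

The exact cutoff and parameter list should come from your keyword research, not from a hard-coded constant like this.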


Tips & Tricks:

  1. Don't use filters or facets (or index/allow crawling on filters) that are the same as your existing category pages or subcategories - that's just asking for duplicate content!
  2. You will want custom metadata, page headers, and page content on each unique variation you want to be indexed - so make sure to remember this optimization requirement when you go to build this custom functionality!
  3. In terms of sort order, we typically recommend not indexing any of them, assuming sort isn’t added to the URL by default. E.g. index /dining-tables, but not /dining-tables?sort=price-low-to-high.
  4. Pagination and any filter, facet, or sort order are generally not recommended for indexation. Do make sure a core pagination path is available so Googlebot can find all your products, of course - but page 4 of red sneakers in size 10 is generally not a useful page to drive organic search traffic to.
  5. Always test crawling your website as the search engine crawler of your choice (e.g. Googlebot) - double-check what is and isn't accessible and indexable - and ensure it matches your expectations.
  6. Leverage custom landing pages for any specific combinations of facets and filters that you can't easily index & optimize without creating more site complexity - especially for long-tail keywords.
  7. For faceted search, we generally recommend disallowing crawling and indexing of it. In almost all cases, search pages should not be crawled or indexed, period.
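Tip 2's custom metadata can be generated from the active facet values rather than hand-written per page. A minimal sketch, with invented category and facet names; the deterministic ordering also guards against the duplicate-title version of the parameter-order problem:

```python
def facet_title(category: str, facets: dict[str, str]) -> str:
    """Build a unique <title> from active facet values.

    Facet keys are sorted so that equivalent facet sets always
    produce the same title, regardless of user click order.
    """
    adjectives = [facets[key].title() for key in sorted(facets)]
    return f"{' '.join(adjectives)} {category.title()} | Example Store"

print(facet_title("dining tables", {"style": "modern", "shape": "square"}))
# → Square Modern Dining Tables | Example Store
```

In practice you'd likely want a curated adjective order per category ("Modern Square" reads better than "Square Modern"), plus matching H1 and meta description templates.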

How To Use Crawling/Indexing Tools for Faceted Navigation

We've written in-depth about SEO crawling and indexing issues, as well as how to deindex various file & page types. Here are some notes on handling this specifically for Faceted Navigation.

Robots.txt Disallow

Disallow crawlable URLs for facets, filters, and sort orders beyond your specified depth limit.

Note 1: this is the only real way to control crawl budget issues (outside of internal linking, but that's a bigger subject), but if you have any links to any URL versions you've disallowed crawling on, you won't get credit for those links. So generally, use this tool a) only if/when you have crawl budget issues (e.g. it's a very large website), or b) only for "deep" variations where you don't really have any valuable links anyway (ideally BOTH.)

Note 2: Disallowing doesn't mean the URLs can't be indexed! (FYI)

Including actual examples here is complicated, since URL patterns for eComm sites vary GREATLY, but here are some common ones:

  • Disallow: *?sort=*
  • Disallow: *&sort=*
  • Disallow: /collections/*%2b*
  • Disallow: /*/collections/*+*
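Google treats `*` in robots.txt rules as a wildcard matching any run of characters. You can sanity-check patterns like those above before deploying with a small script - this is a simplified approximation, not a full robots.txt parser (it ignores `$` end-anchors and Allow/Disallow precedence):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Approximate Google's robots.txt wildcard matching:
    '*' matches any run of characters; the rule is anchored
    to the start of the URL path."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    return re.search(regex, path) is not None

print(rule_matches("*?sort=*", "/dining-tables?sort=price-asc&shape=square"))  # True
print(rule_matches("*&sort=*", "/dining-tables?sort=price-asc&shape=square"))  # False
```

Note the second result: the `?sort=` and `&sort=` variants match different URLs (sort as the first parameter vs. a later one), which is exactly why both rules appear in the list above.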

Learn more about robots.txt files.

Meta Robots Noindex

Noindex tags are page-level instructions telling search engines not to index a page. They're the only really granular way to control indexation, but noindexed pages can still be crawled (and therefore can still waste crawl budget.)

But if there's a reason you can't specifically Disallow a page or URL pattern, the meta robots tag is a great option. It's also super simple - just this line of code in the <head> section of the page: <meta name="robots" content="noindex,nofollow">.
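If you can't edit the page template at all, the same directive can alternatively be sent as an HTTP response header, which Google honors equivalently:

```
X-Robots-Tag: noindex, nofollow
```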

Learn more about Meta Robots tags.

[Image: Shopify's handling of eComm faceted navigation via robots.txt files]
Here's how Shopify handles faceted navigation indexation in their robots.txt file

Canonical Tags

Canonicalization tells search engines when you have a preferred version of a URL; it's the perfect solution for the parameter ordering issue from above that may have created duplicate pages.

Once you choose which parameter order you want to be primary, self-canonicalize the primary variation, and point the canonical tag on each variation URL to that primary page.

Example: assuming /dining-tables?shape=square&style=modern should be primary, the canonical tag (in the head section of the page) should be <link rel="canonical" href="/dining-tables?shape=square&style=modern"/>. This same exact line of code would then also be added to the variation (/dining-tables?style=modern&shape=square.)
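A more robust approach is to make the parameter order deterministic server-side, so every click order canonicalizes to one primary URL. A hedged sketch, with an invented parameter list:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

PARAM_ORDER = ["shape", "style"]  # the order you chose as primary

def canonical_url(url: str) -> str:
    """Rewrite a faceted URL with its query parameters in the
    chosen primary order, for use in rel=canonical."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    ordered = [(k, params[k]) for k in PARAM_ORDER if k in params]
    # Keep any parameters outside the known list, sorted for stability.
    ordered += sorted((k, v) for k, v in params.items() if k not in PARAM_ORDER)
    return urlunparse(parts._replace(query=urlencode(ordered)))

print(canonical_url("/dining-tables?style=modern&shape=square"))
# → /dining-tables?shape=square&style=modern
```

Both orderings now emit the same canonical href, collapsing the duplicate variations into one indexable page.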

Learn more about the Canonical Tag.

Internal Linking

Be careful linking to pages you don't want indexed. The rel="nofollow" link attribute can be a useful tool when you need to link to something but don't want to pass link equity to it.

Where and how often you do this is honestly a matter of opinion/preference. Some eCommerce SEOs prefer to nofollow any links to any page that is meta robots noindexed, or disallowed in the robots.txt file. We typically only use it for URLs that Google absolutely should not crawl - e.g. any URLs that require users to be logged in (like cart and user account URLs.)
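For example, a cart link that shouldn't pass equity or invite crawling might be marked up like this:

```html
<a href="/cart" rel="nofollow">View cart</a>
```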


Faceted navigation can be great for SEO and the user experience - when implemented correctly. Carefully research what opportunities you want to take advantage of for your business, and marry that insight with the issues, tools and tips presented here to avoid SEO problems. If you want help with a custom SEO solution for your eCommerce website, reach out to The Gray Dot Company today! (See also our eCommerce SEO case study.)

Related posts: 

How to Handle Permanently and Temporarily Out-of-Stock Products for eCommerce SEO & UX

The eCommerce Technical SEO Framework: Making the Ambiguous Approachable
