URL parameters can be an overlooked aspect of SEO, causing URL bloat and eating into crawler bandwidth. Learn how to identify and fix URL parameter issues through SEO audits and how to handle them effectively with robots.txt rules.
Shane Parkins has spent over 14 years in search, working across agencies, in-house teams, and as a freelancer before founding Climb & Conquer. His expertise covers technical SEO, paid search strategy, and resolving complex site structure issues that impact rankings.
This guide breaks down how URL parameters affect crawling, indexing, and site performance. Mismanagement leads to duplicate content, wasted crawl budget, and ranking fluctuations. Shane shares best practices to handle parameters effectively while keeping search engines and users on the right path.
When it comes to URL parameters, in my experience it often takes an experienced SEO to carry out effective diagnostics. Parameters by nature often imply advanced site functionality, such as session IDs, filtering or triage. Because URL parameters often go unchecked by the average business owner, I frequently find a lot of opportunities with a robust audit, especially if the CMS is not a mainstream platform or if you have set up WordPress without SEO features (see our WordPress SEO services).
URL parameters can also cause secondary issues, some of them quite serious, but the most common is URL bloat, which impacts crawler bandwidth. Some sites I’ve worked on have had over 500,000 URLs with parameters and no remedial work done on them, causing untold damage to their organic rankings and visibility. Some platforms have URL parameter handling built in; in WordPress, for example, switching the permalink structure from the default plain format (e.g. ?p=123) to a post-name structure removes parameters from the URL and helps with the overall URL structure.
There are a few options here, but given the nature of these issues it really can pay to commission an SEO expert to carry out an audit. The first place I’d recommend starting is an SEO crawl, using software such as Screaming Frog, SEMrush or SiteBulb. The tool should highlight excessive URLs if parameters are rife on the site – for example, if you have a small site but the crawler has picked up, say, 10k+ URLs, all with parameters. You can often filter or export the URLs and identify the ones with a ? or & in them.
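As a quick illustration, here is a minimal Python sketch for sifting a crawl export for parameter URLs and counting which parameter keys appear most often. The file name (crawl_export.csv) and the URL column header are assumptions – Screaming Frog typically labels the URL column "Address" – so adjust both to match your own export.

```python
import csv
from collections import Counter
from urllib.parse import urlparse, parse_qs

param_counts = Counter()
parameter_urls = []

# File name and column header are assumptions - adjust to match your crawler's export.
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("Address", "")  # URL column in the export
        query = urlparse(url).query
        if query:  # keep only URLs that carry a query string
            parameter_urls.append(url)
            for key in parse_qs(query, keep_blank_values=True):
                param_counts[key] += 1

print(f"{len(parameter_urls)} crawled URLs carry parameters")
for key, count in param_counts.most_common(10):
    print(f"  {key}: {count}")
```

A heavy skew towards one or two parameter keys usually points straight at the filter, session or triage mechanism that is causing the bloat.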
The second method is to look at the Coverage area in Google Search Console, where you can see if Google is having a hard time processing a lot of URLs – you may see plenty of parameter URLs sitting in Crawled – currently not indexed or Discovered – currently not indexed. You can do something similar in Bing Webmaster Tools. A final step is to go into Settings > Crawl stats within Search Console, which shows how many URLs are being crawled per day; if there are spikes completely out of the ordinary, it could be down to parameter handling.
The hardest part of fixing URL parameter issues is often diagnosing what the parameter is actually doing. If it’s something like a session ID, or a parameter used to triage logged-in users, it can be difficult to unravel the functionality. For example, if a triage URL for logged-in users is available on every page, it can cause a substantially high error rate in the URL crawler. The fixes for these types of errors vary from site to site; if a functionality fix is not available, then look at adding some bespoke rules to the robots.txt file, as sketched below.
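As a sketch of what those bespoke rules could look like, the wildcard patterns below block crawling of two illustrative parameters (sessionid and a hypothetical login/triage parameter) site-wide in a handful of lines – swap in whatever parameter names your diagnosis actually uncovers. Both Google and Bing honour the * wildcard in robots.txt.

```text
User-agent: *
# Block session IDs wherever they appear in the query string
Disallow: /*?sessionid=
Disallow: /*&sessionid=
# Block a hypothetical login/triage parameter
Disallow: /*?redirect_to=
Disallow: /*&redirect_to=
```

Bear in mind that robots.txt controls crawling rather than indexing, so it is best suited to parameters that should never be fetched in the first place, such as session IDs.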
The first port of call is to assess whether the page holds relatively unique and valuable content. For example, on an eCommerce brokerage page you may have a product range and a brand filter, which appends ?brands=2 onto the URL. It may be a beneficial fix to serve /site/brandname/ rather than a parameter, which can carry better SEO value as a word rather than an ID. So always focus on URLs that have value being served as static URLs rather than as parameterised, dynamically changing content.
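As a rough sketch of how that can be wired up on an Apache setup – the /site/ path, the brands parameter and the brand name are carried over from the hypothetical example above, so adjust everything to your own URLs – a rewrite rule can 301 the filtered parameter URL to its static equivalent:

```apache
# .htaccess at the document root - illustrative only
RewriteEngine On

# If the query string contains brands=2, redirect /site/ to the static brand page
RewriteCond %{QUERY_STRING} (^|&)brands=2($|&)
RewriteRule ^site/?$ /site/brandname/ [R=301,L,QSD]
```

The QSD flag (Apache 2.4+) drops the original query string so it isn’t carried over to the clean URL.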
I typically avoid using nofollow on parameters, and the HTML tag; instead I try to fix these through the robots.txt file. This ensures there is a site-wide fallback, whereas it’s much easier to forget to add an HTML tag.
Take the necessary steps to unravel bad functionality, always. Excessive URL parameters can be symptomatic of a badly structured and coded website.
Parameters can hold great value for user experience – filtering, for example, where the filtered option doesn’t carry search volume – so try to find the balance between SEO and UX.
"*" indicates required fields
If you’re unhappy about the results of your performance, let’s have a non-sales chat about how we could help.
We’d love the chance to hear more about your business.