How to Block Dynamic URLs


Dynamic URLs can appear on your website in many different ways. If you have an internal search box on your site, the URL might contain a question mark, which marks the start of the query string after a search has been performed. You may have an e-commerce site that lets the user filter a page by size and color; when that happens, you might see a dynamic URL that carries those filter selections. Today, I am going to show you how to block these pages from being indexed and shown in the SERPs for Google and Bing.


If you are not familiar with what a robots.txt file is, you should check out this guide. If you are familiar with the robots file, we are going to disallow this URL slug in the file. Depending on your CMS and hosting, the number of steps to get there will vary. Sometimes you can make this change right inside WordPress with Yoast, but I am going to show you how to edit your robots.txt file in GoDaddy.


Example of a dynamic URL


On TM Blast, I have dynamic URLs that are generated by the search box a user can use. Based on what they search for, they are shown a URL like the one below. In fact, if you ever use Google, Bing, YouTube, or any other search site, you will notice that the URL changes to a dynamic URL after you have searched for something.


[Image: example of a dynamic URL on a website]

In the example above, I can see the URL slug has a question mark and the letter “s”. If I do more searches like that on my site using the search box, every resulting URL starts with that same part.
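As a hypothetical illustration (the domain and search term here are placeholders, and `?s=` is assumed because the question mark plus the letter “s” described above matches WordPress's default search parameter), a search for “blue widgets” might produce a URL like:

```
https://example.com/?s=blue+widgets
```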


How to Block a Dynamic URL with a Robots.txt file


  • Use a Tool to Crawl Your Site

If you are unsure whether you are generating dynamic URLs, I would recommend using Screaming Frog to crawl your site. You can also use Google Search Console and Bing Webmaster Tools to see which pages are indexed and receiving traffic.


  • Identify the parameter to block

In the example earlier in the post, you want to find the part of the URL that makes it dynamic. Chances are you will see a question mark, an equals sign, or something else that looks a bit odd compared to the rest of the URLs on your site.
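If you prefer to check this programmatically, here is a minimal sketch using Python's standard library (the URL is a hypothetical example; the `?s=` and `color=` parameters are assumptions for illustration). Anything that comes back as a query parameter is a candidate to block:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical dynamic URL produced by an internal site search
url = "https://example.com/?s=blue+widgets&color=blue"

parsed = urlparse(url)
params = parse_qs(parsed.query)

# Each key here is a URL parameter you may want to disallow in robots.txt
print(params)  # {'s': ['blue widgets'], 'color': ['blue']}
```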


  • Update the robots.txt file

This depends on where your site is hosted. If you have Yoast SEO, you can edit the robots.txt right in WordPress. However, I use GoDaddy, which is where my robots file is located. To edit it in GoDaddy, head over to your Manage Files section and click on File Manager. You will see your robots.txt file in the main listing; right-click on the file to edit it. See the picture below if you are unsure where I am looking. If you are still lost, GoDaddy has excellent customer service that can direct you to the right place.


[Image: robots.txt file in GoDaddy]

Once you click on edit, you will be prompted with a warning message about what you are doing. That is OK; click on edit. You will then be inside your site's robots file. Since we want to block the dynamic content, we write the rule with the Disallow directive. I put an arrow next to how I wrote mine. As a note, I chose to put an asterisk at the end of my rule. That is a wildcard telling Google and Bing to also block any other content variation that might come at the end of this URL; it's a way to be extra careful. Once you write the rule, click on save. That's it, you have blocked that content from being accessed by the search engines!
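For reference, a rule along these lines would look like the sketch below (this assumes the WordPress-style `?s=` search parameter from the earlier example; substitute your own parameter). The trailing asterisk mirrors the wildcard mentioned above; Google and Bing already match Disallow rules by prefix, so it is extra caution rather than a requirement:

```
# Block internal search result pages for all crawlers
User-agent: *
Disallow: /?s=*
```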


[Image: how to update your robots.txt file in GoDaddy]


  • Test this Block in Google Search Console

Once you have added the Disallow rule, you can test it in Google Search Console. Head over to the robots.txt Tester tool and enter a sample dynamic URL to see whether Google treats it as blocked. If you see the blocked message, you are good! If the tool says allowed, you might just have to wait for the change to propagate, but as long as the rule is in your robots file, you will be all set.


[Image: blocked URL in robots.txt file test in Google Search Console]
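You can also sanity-check a rule locally with Python's standard-library robots.txt parser. A small sketch, with two caveats: the rules and URLs below are hypothetical, and `urllib.robotparser` does simple prefix matching rather than wildcard matching, so the rule here is written without the trailing asterisk:

```python
from urllib import robotparser

# Parse an in-memory copy of the rules (normally you would point
# set_url() at the live robots.txt and call read())
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /?s=",
])

# The dynamic search URL is blocked; a normal page is still allowed
print(rp.can_fetch("*", "https://example.com/?s=blue+widgets"))  # False
print(rp.can_fetch("*", "https://example.com/about/"))           # True
```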




There are many reasons to disallow these dynamic pages from the SERPs. For one, they can inadvertently hurt your site's performance: Google and Bing have to crawl and index these pages, which can leave less crawl budget for your more important pages. That can lead to ranking decreases because the spiders have less time to crawl the rest of the site.


Second, you will have messy URLs in the SERPs that look like spam. If a URL has strange parameters on it, you might see a drop-off in click-through rate for the terms that rank for that page. Even if the rank for the term stays consistent, a drop in the click-through rate will be noticeable very quickly.