Improper Permalink Structure?

prodon
In my GSC -> Pages section, there are permalinks that I never created, for example ?filter=random, index.php, etc. How do I stop them from being generated? These links are giving me 404 errors. I have already changed the site's permalink structure to post name. I am using RankMath and redirect all 404 errors to my homepage, but then I get 'Page with redirect' and 'Alternative page with proper canonical tag' issues in GSC. Does anyone have a solution?

 
To prevent unwanted URLs like "?filter=random" from appearing in Google Search Console, use robots.txt to disallow crawling of those parameters. (GSC's old URL Parameters tool would have been the other option, but Google retired it in 2022, so robots.txt is the practical route now.)
Can you guide me on how to do this? Also, will any feature of my site be affected if I block these URLs?
 
This is probably a hidden 'sort by' function with anchor tags in the source code. Those anchor tags should carry rel="nofollow" or be removed completely. Blocking the URLs in robots.txt would help, but they should never be created in the first place if they lead to a 404. It's better to fix the problem at the source of the inlinks.
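
For illustration, here is a minimal sketch of what such a template snippet might look like in a WordPress theme; the add_query_arg() call and the 'Random' label are hypothetical stand-ins for whatever the theme actually prints:

Code:
<?php
// Hypothetical theme snippet that generates the /?filter=... links.
// Adding rel="nofollow" keeps the sort control usable for visitors
// while telling crawlers not to follow the parameterised URL.
?>
<a href="<?php echo esc_url( add_query_arg( 'filter', 'random' ) ); ?>" rel="nofollow">Random</a>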
 
Blocking unwanted URLs like "?filter=random" in robots.txt won't affect your site's core features. It stops crawlers from fetching those URLs, which should gradually clear the irrelevant entries from your GSC reports, though robots.txt alone doesn't guarantee removal from the index.
Can you please guide me on how to do it?
 
What changes do I need to make to block them?
 
Personally I would fix the source of the inlinks (if they were never meant to be generated), but if you want to disallow them in the robots.txt you can use:

Code:
User-agent: *
Disallow: /*?filter=

As mentioned above, it's better to fix the source; Google will eventually stop crawling the links if they return a 404.
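
If you'd rather have those stray URLs answer with a hard error instead of the homepage redirect (the redirect is exactly what triggers the 'Page with redirect' report), something along these lines could go in a child theme's functions.php. This is a sketch only, assuming the unwanted URLs all carry a "filter" query parameter; if RankMath's own redirection still fires first, just switch off its 404-to-homepage setting instead:

Code:
<?php
// Sketch: answer any URL carrying the stray "filter" parameter with
// 410 Gone instead of redirecting to the homepage. A 410 tells Google
// the URL is intentionally dead, so it drops out of reports faster.
add_action( 'template_redirect', function () {
    if ( isset( $_GET['filter'] ) ) {
        status_header( 410 );
        nocache_headers();
        exit;
    }
} );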

To find the source you can use Screaming Frog or Search Console, but the source is most likely the same URL without the /?filter=... part. Open that page, view its source, search for "?filter=", and find where those links are generated. Then either remove that block in your site's code editor, or add rel="nofollow" to the anchor tags if you want to keep using them in the future.
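
If you'd rather script the search, here is a rough command-line sketch; the example.com URL is a placeholder for one of your affected pages. It fetches the page and prints every href containing "?filter=" so you can see exactly which template emits them:

Code:
<?php
// Rough sketch: fetch one page and list every link containing "?filter=".
// Run from the command line: php find-filter-links.php
$html = file_get_contents( 'https://example.com/some-page/' );
preg_match_all( '/href="([^"]*\?filter=[^"]*)"/i', $html, $matches );
foreach ( $matches[1] as $href ) {
    echo $href, PHP_EOL;
}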
 