Index, follow versus disallow

onedaydayone

If I include these lines in the robots.txt file:

Disallow: /*add-to-cart=*
Disallow: /*per_row=*
Disallow: /*shop_view=*
Disallow: /*per_page=*
Disallow: /*remove_item=*

Does it mean that Google won't crawl those pages and won't index them, even if they already have content="index, follow" meta tags?

In this situation, does the Disallow rule act as the primary signal Google follows?
 
Yes, the bots simply won't crawl those pages, in the majority of cases at least. Nofollow pages, by contrast, will still be crawled; the links on them will be considered, may pass some link juice, and/or be used for discovering other URLs.

So to answer your question: yes, disallowing a page in robots.txt will simply prevent the bots from crawling it, so any meta tag on that page won't really matter, because Google never fetches the page to read it.
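For reference, a minimal robots.txt sketch with those rules would look like the block below, assuming you want them to apply to every crawler (swap the wildcard User-agent for Googlebot if you only want to target Google). The Disallow lines have to sit under a User-agent line or crawlers will ignore the group:

User-agent: *
Disallow: /*add-to-cart=*
Disallow: /*per_row=*
Disallow: /*shop_view=*
Disallow: /*per_page=*
Disallow: /*remove_item=*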
 
Yes, as @hideath said, Google won't crawl pages that are disallowed in robots.txt. However, a page can still appear in search results despite being disallowed: if there are links pointing to it from other sites, Google can index the bare URL without ever crawling the page.
 
If you put those lines in your robots.txt, Google generally won't crawl those pages, even if their meta tags say they should be indexed, which makes the Disallow rule the one Google acts on.

And yes, the Disallow directive takes precedence: since Google can't crawl a blocked page, it never gets to read the meta tags or any other on-page signals.
 
As @Paran01d said, though, your URL can still show up in the SERPs in a few cases, for example if it has links from an external site, if it has been included in the sitemap, if there is direct traffic to the URL, and so on.
 
BUT if I change the tags from content="index, follow" to content="noindex, nofollow", will I prevent all of those paths from appearing in the SERPs, even if they have external links or are included in the sitemap?
 
Yes, if you change the tags to noindex, nofollow, the pages will not appear in the SERPs even if they have external links pointing to them or are included in the sitemap. One caveat: Google has to be able to crawl a page to see the noindex tag, so those URLs must not also be disallowed in robots.txt, otherwise the tag is never read and the bare URL can still get indexed from external links.
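As a rough sketch, the tag change would look like this in each page's <head>; the X-Robots-Tag response header shown below it is an equivalent alternative, assuming you are able to set HTTP headers on your server:

<meta name="robots" content="noindex, nofollow">

X-Robots-Tag: noindex, nofollow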
 