URLs restricted by robots.txt, please help!

minhoba
Joined Oct 25, 2008
Hi gurus,
I have a problem with my Blogspot blog. After I verified my blog with Google in Webmaster Tools, I found that I have 35 URLs restricted by robots.txt:
Google was unable to crawl the URLs due to a robots.txt restriction: your robots.txt file might prohibit the Googlebot entirely; it might prohibit access to the directory in which this URL is located; or it might prohibit access to the URL specifically. Often, this is not an error.

I use Google Blogspot to create my blog (with my own domain), and I didn't put any robots.txt file in, so I don't know how to fix this problem. Please help me fix it; I really appreciate your help.
 
Are you using a cookie stuffer? I know CS uses an optional robots.txt to help generate fake hits. But if you aren't, I'm not sure what that is. Make a backup of the file on your server, delete it, and see what happens.
 
I don't use a cookie stuffer, and the blog is hosted on Google Blogspot. Should I delete the blog code?
 
Those URLs are the ones found in your category links, so it's okay.

They are automatically restricted in robots.txt by Blogger to avoid duplicate content in the eyes of Google's spiders.
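For reference, the robots.txt that Blogger serves automatically at yourdomain.com/robots.txt typically looks something like the sketch below (the sitemap URL here is illustrative). The `Disallow: /search` line is what blocks label/category pages, which is why those URLs show up as restricted in Webmaster Tools:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

Since label pages (e.g. /search/label/SomeCategory) just repeat posts that already have their own URLs, blocking them keeps Google from indexing duplicate content, so these "restricted" entries are expected and not an error.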
 