
XML Sitemap Issue in Webmaster Tools

Discussion in 'White Hat SEO' started by gauravkth, Aug 22, 2015.

Tags:
  1. gauravkth

    gauravkth Registered Member

    Joined:
    Dec 5, 2014
    Messages:
    52
    Likes Received:
    0
    Gender:
    Male
    Occupation:
    SEO Consultant & PR Guest Posting Expert
    Location:
    Bangalore
    Home Page:
    I checked the robots.txt report in my Webmaster Tools and got a message.

    On line 2 it shows "Rule ignored by Googlebot". My robots.txt is:
    User-agent: *
    Crawl-delay: 1
    Disallow: /wp-content/plugins/
    Disallow: /wp-admin/

    What does this mean, and how do I fix it?
     
  2. metafser

    metafser Regular Member

    Joined:
    Jul 20, 2014
    Messages:
    449
    Likes Received:
    268
    Gender:
    Male
    Occupation:
    Digital Marketing Influencer
    Home Page:
    On April 21st, 2015, Google changed the way it ranks sites for users on mobile devices. By blocking Googlebot from your plugins folder, you could be preventing Google from deciding that your site is mobile-friendly. Google needs to crawl your plugins folder: plugins often contain CSS and JS files, and those files are necessary for Google to understand what the page actually looks like.
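    If the goal is to keep the plugins directory blocked while still letting Googlebot fetch the CSS and JS it needs for rendering, one common pattern is to add Allow rules for those file types (a sketch; adjust the paths to your own setup). Google supports Allow and wildcard matching, and the more specific matching rule wins:

    User-agent: Googlebot
    Allow: /wp-content/plugins/*.css
    Allow: /wp-content/plugins/*.js
    Disallow: /wp-content/plugins/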
     
  3. Ambitious12

    Ambitious12 Elite Member

    Joined:
    Jun 26, 2014
    Messages:
    3,096
    Likes Received:
    609
    Occupation:
    No Occupation
    Location:
    Among the Stars
    You should contact Zwielicht; he knows Webmaster Tools very well.
     
  4. dsan996

    dsan996 Regular Member

    Joined:
    Apr 18, 2014
    Messages:
    276
    Likes Received:
    134
    Location:
    Depends on the day
    +1. Zwielicht knows this stuff very well.

    Anyway OP, here are my two cents. That notification you are seeing refers to the "crawl delay" rule in your robots.txt file. That specific rule is ignored by Googlebot so there is no real issue there, they are just telling you that the rule will have no effect.

    Trying to modify the crawl rate for your site is not recommended unless you notice that the crawls are slowing your server down. That is very uncommon; I've seen it happen once, caused by the Bing crawler on a big site hosted on a low-resource server, but never with Googlebot.

    If you still want to reduce the crawl rate anyway, remove that line from your robots.txt and go to the crawl rate section in your Webmaster Tools settings (the gear icon). Then select the option "Limit Google's maximum crawl rate" and you will be able to adjust it.
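    To see how a standards-following parser treats those four lines, here is a small sketch using Python's built-in urllib.robotparser. Note that this parser, unlike Googlebot, does honor Crawl-delay, which illustrates the point: the line is valid robots.txt syntax, Googlebot just chooses not to act on it.

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the robots.txt in the first post.
rules = """\
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The Disallow rules behave as expected for any compliant crawler.
print(rp.can_fetch("*", "/wp-admin/edit.php"))  # False
print(rp.can_fetch("*", "/about/"))             # True

# The Crawl-delay line parses fine too -- Googlebot simply ignores it;
# use the crawl-rate setting in Webmaster Tools instead.
print(rp.crawl_delay("*"))                      # 1
```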
     
  5. gauravkth

    gauravkth Registered Member

    Joined:
    Dec 5, 2014
    Messages:
    52
    Likes Received:
    0
    Gender:
    Male
    Occupation:
    SEO Consultant & PR Guest Posting Expert
    Location:
    Bangalore
    Home Page:
    Thanks for the explanation, but a while back I changed my robots.txt and uploaded it to the server again. Right after the change, Webmaster Tools showed my edits, but about two weeks later it went back to showing the old rules, like this:

    User-agent: *
    Crawl-delay: 1
    Disallow: /wp-content/plugins/
    Disallow: /wp-admin/
     
  6. dsan996

    dsan996 Regular Member

    Joined:
    Apr 18, 2014
    Messages:
    276
    Likes Received:
    134
    Location:
    Depends on the day
    Are you manually uploading the robots.txt file to your root folder via FTP?

    It sounds to me like a WordPress virtual robots.txt issue. When no physical robots.txt file exists, WordPress generates a virtual one, and some plugins add rules like that to it; a manually created robots.txt uploaded to the server should override it.

    If you have already uploaded a robots.txt file without the Crawl-delay line but you still see the rule when you visit www.yoursite.com/robots.txt, then your server may be overriding it. Some hosting companies add a Crawl-delay line on shared hosting plans to save bandwidth. In that case, contact your hosting provider directly and ask them to remove the rule.

    If you have already manually uploaded a robots.txt file without the crawl delay line but you can still see the rule when you visit www.yoursite.com/robots.txt, then your server can be overriding it. Some hosting companies add the crawl delay line on shared hosts plans to save bandwith. In that case you should directly contact your hosting provider and try to ask them to remove the rule.