
[WP] Google Webmaster Tools error message

Discussion in 'Blogging' started by Kaiser_Soze, Feb 14, 2013.

  1. Kaiser_Soze

    Kaiser_Soze Junior Member

    Joined:
    Nov 24, 2012
    Messages:
    151
    Likes Received:
    28
    Occupation:
    IT Manager
    Location:
    EU, Bulgaria
    Home Page:
    I don't have the slightest idea how to fix this. I'm currently using Yoast and this plugin for building a sitemap: http://wordpress.org/extend/plugins/google-sitemap-plugin/
    Code:
    ***/: Googlebot can't access your site
    February 12, 2013
    
    Over the last 24 hours, Googlebot encountered 1 error while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
    You can see more details about these errors in Webmaster Tools (https://www.google.com/webmasters/tools/crawl-errors?siteUrl=http://www.scifi-real.com/#t1=2).
    
    Recommended action
    
    If the site error rate is 100%:
    * Using a web browser, attempt to access http://www.scifi-real.com/robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
    * If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
    * If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
    
    If the site error rate is less than 100%:
    * Using Webmaster Tools (https://www.google.com/webmasters/tools/crawl-errors?siteUrl=http://www.scifi-real.com/#t1=2), find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
    * The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
    * If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose serving of its robots.txt file is exhibiting one or more of these issues.
    
    After you think you've fixed the problem, use Fetch as Google (https://www.google.com/webmasters/tools/googlebot-fetch?hl=en&siteUrl=http://www.scifi-real.com/) to fetch http:***.com/robots.txt to verify that Googlebot can properly access your site.
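    The first check Google recommends (fetching robots.txt the way the crawler would) can also be done from a terminal. A rough sketch, assuming curl is installed and using the site URL from the message above:

    ```shell
    # Fetch robots.txt while sending Googlebot's published desktop
    # user-agent string, so any user-agent based blocking shows up.
    # -s silences progress, -o discards the body, -w prints the status code.
    curl -s -o /dev/null -w "%{http_code}\n" \
      -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
      "http://www.scifi-real.com/robots.txt"
    ```

    If this (or a plain browser visit) returns 200 but Webmaster Tools still reports a 100% error rate, that usually points at a firewall or hosting rule that filters by user agent or IP range rather than a missing file.
    
    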
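    For the static-file case, the permissions check boils down to making sure the web server process can read the file. A self-contained sketch (it writes to /tmp so it runs anywhere; on a real host the file lives in the site's document root, and the sitemap URL below is just a guess at what the sitemap plugin generates):

    ```shell
    # Write a minimal permissive robots.txt that also advertises the sitemap.
    printf 'User-agent: *\nDisallow:\nSitemap: http://www.scifi-real.com/sitemap.xml\n' > /tmp/robots.txt
    # 644 = owner read/write, everyone else read -- what the web server needs.
    chmod 644 /tmp/robots.txt
    # -r succeeds only if the current user can actually read the file.
    [ -r /tmp/robots.txt ] && echo "robots.txt is readable"
    ```

    On a live server you'd run the read test as the web server's own user (e.g. with sudo -u), since root can read files that the web server cannot.
    
    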
     
  2. Oukast

    Oukast Senior Member

    Joined:
    Jan 11, 2012
    Messages:
    832
    Likes Received:
    683
    Location:
    Under the palm tree
    I've had this same issue pop up randomly on new domains/blogs, even ones set up on the same server as blogs that work fine (i.e. Googlebot can fetch them without problems). I haven't found a way around it, but they've all started working on their own within a few days.
     