Robots.txt submission issue in Google Webmaster Tools

Discussion in 'Black Hat SEO' started by samiejg, Jan 27, 2014.

  1. samiejg

    samiejg Senior Member

    Joined:
    Dec 14, 2013
    Messages:
    821
    Likes Received:
    49
    Hey guys,

    I've just recently submitted a sitemap for a new domain (using WordPress and a sitemap plugin), and it seems that every one of the URLs is giving an error.

    It says:
    Type: Warning
    Issue: URL blocked by robots.txt
    Description: Sitemap contains urls which are blocked by robots.txt.

    Then it starts listing every URL on my domain.

    Does anyone have any idea what's going on here? Or is it just an issue on Google's end?

    Here's what I did so far...
    1. Settings->Reading->Unchecked "Discourage search engines from indexing this site"
    2. Disabled all plugins (the only one was my Google sitemap plugin)
    3. Tried numerous combinations of robots.txt, but here is the current one (see the test sketch after this list):
    Code:
    User-Agent: *
    Disallow: 
    
    User-Agent: Googlebot-image
    Disallow: /wp-content/images/
    
    Sitemap: http://www.mysite.com/sitemap.xml
    4. Tried adding a non-www to www 301 redirect
    5. Tried changing my URL to non-www both within GWT and the WordPress panel.
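    A quick way to sanity-check the file above without waiting on GWT to recrawl is Python's built-in urllib.robotparser. Rough sketch only, with mysite.com standing in for the real domain:
    Code:
    from urllib import robotparser

    # The same rules as the robots.txt above (mysite.com is a placeholder).
    ROBOTS_TXT = """\
    User-Agent: *
    Disallow:

    User-Agent: Googlebot-image
    Disallow: /wp-content/images/

    Sitemap: http://www.mysite.com/sitemap.xml
    """

    rp = robotparser.RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    # An empty Disallow under "User-Agent: *" allows everything, so the
    # sitemap URL should come back as fetchable for Googlebot.
    print(rp.can_fetch("Googlebot", "http://www.mysite.com/sitemap.xml"))  # True
    print(rp.can_fetch("Googlebot-image",
                       "http://www.mysite.com/wp-content/images/logo.png"))  # False
    If that prints True for the sitemap URL, then the file itself isn't what's blocking Googlebot.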

    Any ideas?
     
    Last edited: Jan 27, 2014
  2. JustUs

    JustUs Power Member

    Joined:
    May 6, 2012
    Messages:
    609
    Likes Received:
    450
    robots.txt is read top down.

    Your User-Agent: * / Disallow: entry should come after the more specific entries, along the lines of the sketch below.

    As written, you are telling all crawlers not to crawl.
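    For example, something like this (keeping the placeholder domain from the post above):
    Code:
    User-Agent: Googlebot-image
    Disallow: /wp-content/images/

    User-Agent: *
    Disallow:

    Sitemap: http://www.mysite.com/sitemap.xml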
     
    • Thanks x 1