Hey guys, I recently submitted a sitemap for a new domain (WordPress plus a sitemap plugin), and every one of the URLs is returning an error:

Type: Warning
Issue: URL blocked by robots.txt
Description: Sitemap contains URLs which are blocked by robots.txt.

It then lists all of my site's URLs. Does anyone have any idea what's going on here, or is it just an issue on Google's end? Here's what I've tried so far:

1. Settings -> Reading -> unchecked "Discourage search engines from indexing this site"
2. Disabled all plugins (which was only my Google sitemap plugin)
3. Tried numerous combinations of robots.txt; here is the current one:

Code:
User-Agent: *
Disallow:

User-Agent: Googlebot-image
Disallow: /wp-content/images/

Sitemap: http://www.mysite.com/sitemap.xml

4. Added a non-www to www 301 redirect
5. Tried changing my URL to non-www in both GWT and the WordPress panel.

Any ideas?
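For what it's worth, one way to sanity-check that a robots.txt actually allows the sitemap URLs is Python's built-in urllib.robotparser. This is just a local sketch using the rules above; mysite.com stands in for the real placeholder domain from the post:

```python
from urllib import robotparser

# The robots.txt from the post (mysite.com is the placeholder domain).
ROBOTS_TXT = """\
User-Agent: *
Disallow:

User-Agent: Googlebot-image
Disallow: /wp-content/images/

Sitemap: http://www.mysite.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An empty "Disallow:" under "User-Agent: *" allows everything,
# so regular Googlebot should be able to fetch the sitemap URL.
print(rp.can_fetch("Googlebot", "http://www.mysite.com/sitemap.xml"))  # True

# Googlebot-image has its own group, so the images path is blocked for it only.
print(rp.can_fetch("Googlebot-image",
                   "http://www.mysite.com/wp-content/images/photo.jpg"))  # False
```

If that reports True for the sitemap URLs, the live file Google fetched is probably an older cached copy, or it's being served from a different host (www vs. non-www) than the one the sitemap was submitted for.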