Last night I submitted a sitemap.xml for one of my sites, and when I checked a few minutes ago (the following day), Webmaster Tools reports 15 errors on all 15 links. In reality there are only 3 warnings shown in Webmasters.

Webmasters: http://puu.sh/5gGDy.png
Robots: http://puu.sh/5gGFk.png

I can understand why the about-me page is blocked, since I'm still working on it and don't want it crawlable/visible yet. So what might be the issue? The other two items Webmasters listed are crawlable... Do I need a robots.txt in every directory or something?
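For reference, my robots.txt at the site root looks roughly like this (the exact /about-me/ path here is just illustrative):

```
User-agent: *
Disallow: /about-me/
```

My understanding was that a single robots.txt at the root covers the whole site and everything not listed under Disallow stays crawlable by default, which is why the 15 errors confuse me.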