Hi guys. For a while now I've been working with Google Webmaster Tools, and it keeps reporting that my robots.txt is blocking my sitemap. I've tried editing the file, but it reverts to this default after a few hours:

HTML:
Sitemap: websitedotcom/sitemapdotxml

User-agent: *
Disallow: /

I've also deleted the robots.txt file from my directory and re-created it, and the same thing happens.

Webmaster Tools is also saying that my sitemap.xml file has an XML tag missing: the parent tag is urlset and the tag is url. This is just a small snippet of my XML file:

HTML:
<?xml version="1.0" encoding="UTF-8"?>
<urlset>
<url>
<loc>hyperlinkwebsitedotcom/indexdothtml</loc>
<lastmod>2014-02-02T07:00:01+00:00</lastmod>
<changefreq>weekly</changefreq>
<priority>0.80</priority>
</url>
</urlset>

The file also comes with a heading, which I have tested by deleting a few times. So far nothing has changed. Any help is much appreciated.
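For comparison, a robots.txt that allows crawling of the whole site while still declaring a sitemap would look something like the fragment below (the URL is a placeholder, not the real site). Per the robots.txt convention, `Disallow: /` blocks everything, while an empty `Disallow:` blocks nothing:

```
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml
```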
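As a quick sanity check, a short script can confirm whether a sitemap file is well-formed XML and actually contains `<url>`/`<loc>` entries under a `<urlset>` root (an unclosed or malformed `<urlset>` tag would raise a parse error). This is only an illustrative sketch; the sample sitemap string and URL below are placeholders, not the real file:

```python
import xml.etree.ElementTree as ET

# Standard namespace used by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def check_sitemap(text):
    """Parse sitemap XML and return the list of <loc> URLs.

    Raises xml.etree.ElementTree.ParseError if the document is not
    well-formed, and ValueError if the expected tags are missing.
    """
    root = ET.fromstring(text)
    # Tag names come back namespace-qualified when xmlns is declared.
    if root.tag not in ("urlset", f"{{{SITEMAP_NS}}}urlset"):
        raise ValueError(f"expected <urlset> root, got <{root.tag}>")
    locs = [
        el.text
        for el in root.iter()
        if el.tag in ("loc", f"{{{SITEMAP_NS}}}loc")
    ]
    if not locs:
        raise ValueError("no <url>/<loc> entries found")
    return locs


# Placeholder sitemap used purely to demonstrate the check.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/index.html</loc>
    <lastmod>2014-02-02T07:00:01+00:00</lastmod>
  </url>
</urlset>"""

print(check_sitemap(sample))  # ['http://example.com/index.html']
```

Running this against the real sitemap.xml (e.g. reading it with `open(...).read()`) would show whether the "tag missing" error comes from malformed XML or from something else, like the server rewriting the file.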