Got this message back in June, with the headline "Googlebot cannot access your robots.txt":

"Over the last 24 hours, Googlebot found 15 errors while trying to access your robots.txt. To ensure that the pages listed in that file were not crawled, crawling was postponed. The overall robots.txt error rate for your site is 60.0%."

I opened my robots.txt and found this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://www.mysite.com/sitemap.xml.gz

How can I fix this? What should I include in my robots.txt?
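
In case it helps, here is a quick way to check whether the file is actually reachable from outside, since the error is about Googlebot failing to access the file rather than about its contents (a rough sketch in Python; www.mysite.com stands in for my real domain):

import urllib.request
import urllib.error

url = "http://www.mysite.com/robots.txt"  # placeholder domain from above
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print("HTTP status:", resp.status)   # a healthy robots.txt returns 200
        print(resp.read().decode("utf-8"))   # the rules Googlebot would see
except urllib.error.URLError as e:
    # a timeout or 5xx response here would match the errors Googlebot reported
    print("Fetch failed:", e)

If this script times out or prints a 5xx status intermittently, the problem is server availability, not the robots.txt rules themselves.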