Hi guys, I've set up a Google Webmaster account for a fairly new domain. I originally didn't want to get one, as I read on here that when using BH methods like ScrapeBox etc. you generally shouldn't (for whatever reason).

Anyway, I signed up and went to submit my sitemap. I have a WP site and I use a plugin that generates an .xml sitemap for me, located at www.domainname.com/sitemap.xml. I gave Google the location and it downloaded it, but there's an "X" below it, and when I click the X it tells me: "URL restricted by robots.txt".

This is likely why all of my sites take so long to get indexed 'naturally' and require me to ping them etc. to get indexed (they're all running on WP with the same install and same theme, so I'm assuming they all have the same problem).

I know robots.txt sets instructions for robots, but I have no idea how to edit it. Can anyone point me in the right direction, or maybe share a 'standard' robots.txt file that works well? I just want the search engines to index my stuff, nothing fancy.

Thank you, and rep will be given for a truly helpful reply!
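Edit for anyone finding this later: a minimal "allow everything" robots.txt that also advertises the sitemap looks like the sketch below (the domain is just a placeholder for your own, and the sitemap path assumes the plugin's default location). It's also worth checking the WP privacy setting that blocks search engines, since when that's enabled WP serves a robots.txt with `Disallow: /`, which causes exactly this error.

```
# Apply to all crawlers
User-agent: *
# Empty Disallow = nothing is blocked; everything may be crawled
Disallow:

# Tell crawlers where the sitemap lives (replace with your own domain)
Sitemap: http://www.domainname.com/sitemap.xml
```

This file goes in the site root (the same folder as wp-config.php) so it's reachable at www.domainname.com/robots.txt; if no physical file exists there, WP generates a virtual one based on that privacy setting.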