I get the basic idea of robots.txt, but I don't know exactly what to block and what to allow. I run a forum and use a generic robots.txt file. However, when I check my Google Webmaster Tools account, it appears to be blocking all sorts of pages, some of which I don't think should be blocked. At the same time, my pages seem to be showing up in the SERPs with the correct meta descriptions. Is it okay to just not have a robots.txt at all? All it seems to do is cause problems.
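For context, the "generic" file I'm using is something along these lines (paraphrased from memory; the exact paths depend on the forum software, so treat these as typical examples rather than my literal file):

```
User-agent: *
Disallow: /search/
Disallow: /login/
Disallow: /register/
Disallow: /admin/
```

Is a blanket set of rules like this likely to be what's blocking pages Google shouldn't be ignoring?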