Robots.txt is the WORST way to block them. It's basically just a polite request asking crawlers not to crawl your site; whether they honor it is entirely up to them, and bot companies need that data to survive... And you're leaving a footprint: the file is public, so it conveniently lists the paths you'd rather they not look at.
If you don't have your own filtering script, at least do the checks in an .htaccess file, something like this:
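(A minimal sketch, assuming Apache with mod_rewrite enabled; the bot names are just examples, match whatever actually shows up in your logs. This is the bare "at least" option that answers with a 403, see the caveat right after about error responses.)

```
# Refuse a few known crawler user agents with a 403.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot|Bytespider) [NC]
RewriteRule .* - [F,L]
```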
The best way to do it is to trick them into thinking they're viewing a legit page. If you block by user agent and answer with an error or a redirect code, some of them will just come back and re-crawl your site disguised as a regular user, or even as Googlebot.
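A rough sketch of that idea in .htaccess terms (again assuming Apache + mod_rewrite; /decoy.html and the bot names are placeholders I made up): instead of an error or redirect, internally rewrite flagged crawlers to a static decoy page, so they get a normal 200 and something that looks like real content.

```
# Serve flagged crawlers a decoy page instead of an error.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot|Bytespider) [NC]
# Don't rewrite the decoy itself, or the rule would loop.
RewriteCond %{REQUEST_URI} !^/decoy\.html$
RewriteRule .* /decoy.html [L]
```

Because it's an internal rewrite rather than a redirect, the URL the bot requested doesn't change and the status stays 200, so nothing in the response gives away that it was filtered.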