Note: There won't be much in this thread for experienced Scrapebox users, this is purely to help newbies "see the light". No doubt any existing users already know the huge potential.

I bought Scrapebox a few months ago purely so that I could blast comments on blogs. This tool is truly remarkable and packs far more punch than I ever expected; it is by far the most versatile IM application I have ever used. Although I can't yet post the link, there is a BHW discount code on these forums that gets you Scrapebox for a one-off price of $57.

Here are some of the ways that I use it (as do many other people):

Proxy Tool - I need proxies for several different tools, as most people here do. I use the SB proxy tool both to harvest new public proxies (based on custom harvest lists I found on BHW) and as a checker for fresh proxy lists posted on forums. See the proxy section of the forum for more info on this stuff. (A rough sketch of what a proxy check boils down to is below, after the list.)

Ping Tool - Import your list of URLs to ping and hit ping mode. (Sketch after the list.)

Find blogs, forums and wikis to post to - Almost all my commenting is manual. I scrape lists of blogs, forums and wikis for manual comments. All you need are search strings to find these platforms: find a unique identifier for the type of page you are looking for and Scrapebox will pull the list of URLs for you. The wiki lists I also export for use in a wiki-posting tool. (Footprint examples after the list.)

PR Checking - I import lists of URLs into Scrapebox to run a PR check over, and these can be any sort of list, for example a list of wiki sites that someone posted; I just picked the best ones out. There are heaps of other times you will find yourself with a list of URLs and want to know which ones are the most "valuable" (completely unrelated to blog commenting). If someone posts a list of 100 Web 2.0 sites but I only want 20, I run the PR check over the list and pick out those with the highest PR.

Checking for Dead Links - I import lists of URLs to find any dead links. This is handy if you have just had a gig completed and been sent a list of "alive links": run the list through the Scrapebox alive check to confirm the numbers are close to what was stated. (Sketch after the list.)

Remove Duplicates - Yet another simple but handy one. When running any new list through tools I kill off duplicate URLs, and sometimes all duplicate domains. Depending on the size of the list it may be worth running it through the PR check too. I could do this in Excel, but this way is much easier. (Sketch after the list, shared with Trim To Root.)

Trim To Root - This sounds basic, and it is, but it has saved me heartache. You may have a list of URLs with long strings when an application only needs the domain, or only the lowest folder instead of the whole string. Import your list, hit a button, and it is trimmed. Again, I could do this in Excel, but this way is much easier.

Outbound Link Checker - Run a list of URLs through this to filter out pages with a high number of outbound links, which search engines may treat as spam. Once these are filtered out, run the remove duplicates tool. (Sketch after the list, shared with the Link Extractor.)

Link Extractor - You have found a website with a huge list of URLs on it, maybe a link exchange or a directory of some nature. Run the page through the link extractor to get the full URL list, then chain it with the alive checker and PR checker tools.
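For anyone curious what these features are actually doing under the hood, here are some rough Python sketches. First, the proxy check: it is just trying to fetch a page through each proxy and keeping the ones that answer. The file name, test URL and timeout below are my own placeholder choices, not anything Scrapebox itself uses.

```python
import requests

TEST_URL = "http://example.com"  # placeholder target; any reliable page works
TIMEOUT = 8                      # seconds; arbitrary choice

def proxy_works(proxy: str) -> bool:
    """Return True if an ip:port HTTP proxy can fetch the test URL."""
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(TEST_URL, proxies=proxies, timeout=TIMEOUT)
        return r.status_code == 200
    except requests.RequestException:
        return False  # refused, timed out, dead proxy, etc.

# Assumes a plain-text file with one ip:port proxy per line.
with open("proxies.txt") as f:
    candidates = [line.strip() for line in f if line.strip()]

alive = [p for p in candidates if proxy_works(p)]
print(f"{len(alive)}/{len(candidates)} proxies alive")
```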
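The ping tool is essentially firing the standard weblogUpdates.ping XML-RPC call at ping services. A minimal sketch, assuming Ping-O-Matic's public endpoint and a urls.txt input file; Scrapebox's actual list of ping services will differ, and "my site" is just a placeholder name.

```python
import xmlrpc.client

# Assumed endpoint: Ping-O-Matic's public XML-RPC service.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

server = xmlrpc.client.ServerProxy(PING_ENDPOINT)

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Standard weblogUpdates.ping(site_name, site_url) call.
        result = server.weblogUpdates.ping("my site", url)
        print(url, "->", result.get("message", result))
    except Exception as exc:
        print(url, "-> failed:", exc)
```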
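On footprints: the "unique identifier" is just a string of boilerplate text that only the target platform prints on its pages. The footprints below are well-known examples and the keywords are placeholders; this sketch only shows how a batch of harvest queries gets built before you paste them into the harvester.

```python
from itertools import product

# Example platform footprints; each matches text that a particular
# engine prints on every page it generates.
footprints = [
    '"powered by wordpress" "leave a comment"',  # WordPress blogs
    '"powered by vbulletin"',                    # vBulletin forums
    '"powered by mediawiki"',                    # MediaWiki wikis
]

keywords = ["gardening", "home brewing"]  # placeholder niche keywords

# One search query per footprint/keyword pair.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print("\n".join(queries))
```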
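The alive check is nothing more than requesting each URL and keeping the ones that answer. A rough sketch, again assuming a urls.txt input; treating any status under 400 as "alive" is my own cut-off, and the HEAD request with a GET fallback just keeps the traffic light.

```python
import requests

def is_alive(url: str, timeout: int = 10) -> bool:
    """Consider a URL alive if it answers with a non-error status."""
    try:
        r = requests.head(url, timeout=timeout, allow_redirects=True)
        if r.status_code == 405:  # some servers reject HEAD; retry with GET
            r = requests.get(url, timeout=timeout, allow_redirects=True)
        return r.status_code < 400
    except requests.RequestException:
        return False

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

alive = [u for u in urls if is_alive(u)]
print(f"{len(alive)}/{len(urls)} links alive")
```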
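Remove Duplicates and Trim To Root are one-liners once the URL is parsed, so here is one sketch covering both. "Duplicate domain" here means same hostname, which is my reading of what the remove-duplicate-domains button does; the sample URLs are made up.

```python
from urllib.parse import urlparse

urls = [
    "http://example.com/blog/post-1?utm=x",
    "http://example.com/blog/post-2",
    "http://other.org/wiki/Page",
    "http://example.com/blog/post-1?utm=x",  # exact duplicate
]

def dedupe_urls(urls):
    """Drop exact duplicate URLs, keeping first-seen order."""
    return list(dict.fromkeys(urls))

def dedupe_domains(urls):
    """Keep only the first URL seen for each hostname."""
    seen, out = set(), []
    for u in urls:
        host = urlparse(u).netloc
        if host not in seen:
            seen.add(host)
            out.append(u)
    return out

def trim_to_root(url):
    """Reduce a long URL down to just scheme://domain/."""
    p = urlparse(url)
    return f"{p.scheme}://{p.netloc}/"

print(dedupe_urls(urls))
print(dedupe_domains(urls))
print([trim_to_root(u) for u in dedupe_domains(urls)])
```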
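Finally, the Link Extractor and the Outbound Link Checker share the same core: pull every <a href> off a page. A sketch using only the standard library; the 50-link cut-off is a number I picked for illustration, not Scrapebox's default, and "outbound" here is counted as links pointing off the page's own domain.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(url: str) -> list[str]:
    """Link Extractor: return all absolute links found on a page."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(url, href) for href in parser.links]

def count_outbound(url: str) -> int:
    """Count links pointing off the page's own domain."""
    host = urlparse(url).netloc
    return sum(1 for link in extract_links(url)
               if urlparse(link).netloc != host)

MAX_OUTBOUND = 50  # arbitrary spam cut-off for illustration

def low_outbound(urls: list[str]) -> list[str]:
    """Outbound Link Checker: keep pages with few off-domain links."""
    kept = []
    for u in urls:
        try:
            if count_outbound(u) <= MAX_OUTBOUND:
                kept.append(u)
        except OSError:
            pass  # unreachable page; the alive check handles these
    return kept
```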
There are so many different SB techniques that I have learned from these forums. I hope this thread has opened the eyes of anyone "thinking about buying Scrapebox". For a one-off low price it is truly a must-have tool; Scrapebox is only limited by your imagination.