OK, so I scrape a list of several hundred keywords (I've tried different sets of keywords when harvesting) and add them to the keyword section. I try different custom footprints such as "add your comment below", "inurl:edu", etc., basically ones that return millions of results in Google.

Yet when I run the harvester, it brings back only a few hundred results, and then the connections just keep dropping before the harvest has finished. If I choose to keep the keywords that weren't checked, most of them stay in the list, which means ScrapeBox never actually checked them.

The proxies are public, but I use a decent source and they pass the Google check immediately before harvesting. I have results set to 1000, so I'd always assumed that 1000 keywords at 1000 results each would fetch back a maximum of one million URLs, not just a few hundred. I've also tried Yahoo and it doesn't work out much better.

I've seen threads around the net of people pulling thousands, even up to a million results, but I can't work out how they'd possibly do that. Has ScrapeBox, or Google/Yahoo, lowered the number of results each IP address can get? I tried a delay of 10 seconds, which SB seemed to ignore, as there didn't appear to be any waiting. Maybe I need a lot more proxies? Recently I've had about 40, with connections ranging from 50 up to the max of 500.
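For what it's worth, here's the back-of-envelope math behind my "one million" expectation, plus the per-proxy load it implies. This is just a rough Python sketch using my own numbers; the 100-results-per-page figure is my assumption about how many results Google serves per search page.

```python
# Back-of-envelope estimate of the harvest ceiling and per-proxy load.
# All numbers come from my setup; adjust them to yours.

keywords = 1000           # keywords loaded into the harvester
results_per_query = 1000  # the "Results" setting in ScrapeBox
results_per_page = 100    # assumption: max results Google returns per page

# Theoretical ceiling: every keyword returns a full set of results.
theoretical_max_urls = keywords * results_per_query

# Each results page is one HTTP request, so 1000 results = 10 requests/keyword.
pages_per_query = results_per_query // results_per_page
total_requests = keywords * pages_per_query

proxies = 40
connections = 500

print(f"Theoretical max URLs:        {theoretical_max_urls:,}")
print(f"Search-page requests needed: {total_requests:,}")
print(f"Requests per proxy:          {total_requests / proxies:.0f}")
print(f"Concurrent conns per proxy:  {connections / proxies:.1f}")
```

So even at the theoretical maximum, each of my 40 proxies would have to survive 250 search-page requests, and at 500 connections each proxy is carrying over 12 simultaneous connections, which might be why they keep dropping.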