I've been frustrated with ScrapeBox public proxies because the keyword lists I generate run anywhere from 10 to 400k keywords. To work through them with public proxies, I scripted a program to scrape proxies, save them, post, then stop the harvest, reload, and re-harvest until the list is completed. This works, but it's ridiculously SLOW.

So I wanted to figure out what exactly was slowing my proxies down. We've all heard that "special modifiers" will slow down your searches, but by how much? To my surprise: a whole lot. A public proxy searching ANY special modifier will be banned after 6-12 consecutive searches (tested using my private proxies).

So take this into consideration if your keyword list contains any special modifiers. A list of 17,000 proxies takes about 30-40 minutes to test. Basically, by the time you actually get around to using a proxy, I'd guess there's an 80% chance it's already banned for modifier searches. So for those saying "public proxies suck" while their keyword lists contain things like inurl:, intitle:, filetype:, etc.: Google has essentially made it nearly impossible to do large-scale searches using these modifiers.
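To put a rough number on it, here's a back-of-envelope sketch in Python. It assumes the ban threshold observed above (a proxy dies after ~6 consecutive modifier searches, taking the pessimistic end of the 6-12 range) and a hypothetical list of modifiers to scan for; both the modifier list and the function names are my own illustration, not anything ScrapeBox exposes.

```python
# Back-of-envelope estimate of how many public proxies a modifier-heavy
# keyword list burns through, assuming (per the observation above) a
# proxy is banned after roughly 6 consecutive modifier searches.
import math

# Hypothetical set of "special modifiers" to check for -- extend as needed.
MODIFIERS = ("inurl:", "intitle:", "filetype:", "intext:", "site:")

def has_modifier(keyword: str) -> bool:
    """True if the keyword uses any special Google search modifier."""
    kw = keyword.lower()
    return any(m in kw for m in MODIFIERS)

def min_proxies_needed(keywords, searches_per_proxy: int = 6) -> int:
    """Worst-case proxy count: each proxy survives only
    `searches_per_proxy` modifier queries; plain keywords are
    ignored here since they don't trip the fast ban."""
    modifier_count = sum(1 for k in keywords if has_modifier(k))
    return math.ceil(modifier_count / searches_per_proxy)

keywords = ["inurl:guestbook", "best seo tools", "filetype:pdf marketing",
            "intitle:forum", "link building"]
print(min_proxies_needed(keywords))  # 3 modifier keywords -> 1 proxy
```

Scaled up, a 400k-keyword list that's all modifiers would burn through ~67,000 proxies under this assumption, which is why the harvest-test-use cycle can't keep pace.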