OK, so I'm almost at the point where this thing can run mostly without user input. The program rotates through a list of proxies and reads its keywords from a .csv file, building a search query for each one. Between queries I sleep the thread for a random 2-20 seconds.

When a proxy gets a 503, the program brings up the Google captcha so I can solve it by hand; it then takes the resulting cookies and stores them with that proxy, so further requests through that proxy pull down the data I need without hitting another 503. I also capture the extra parameters Google sometimes tacks onto the end of the URL. Here's an example: "&gbv=1&sei=GXg-T5adE9PU4QTSv5CbCA". I append that to any subsequent URLs in future requests.

However, I've now hit a point where I can't get ANY data. It makes maybe 6-7 total requests (about 2 per proxy) before every request just times out, and I've set the timeout to 2 minutes. I've analyzed the HTTP headers to be sure they're identical to my browser's. Should I not be appending that extra string more than once? Any information that could help would be appreciated.
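For context, here is a minimal stdlib-only sketch (in Python, since I haven't shown my actual code) of the bookkeeping I described: per-proxy state holding cookies plus those extra parameters, and a query builder that merges the extras in as a dict so they can never be appended twice. The proxy addresses, class, and function names here are all hypothetical placeholders, not my real implementation.

```python
import random
import time
from itertools import cycle
from urllib.parse import urlencode

# Hypothetical proxy list; real code would load these from config.
PROXIES = ["proxy1:8080", "proxy2:8080", "proxy3:8080"]

class ProxyState:
    """Per-proxy state: captcha cookies plus any extra query params
    Google handed back (e.g. gbv/sei) for this proxy."""
    def __init__(self, address):
        self.address = address
        self.cookies = {}        # filled in after a captcha is solved
        self.extra_params = {}   # e.g. {"gbv": "1", "sei": "..."}

def build_query(keyword, state):
    """Build the search URL; because extras are merged into a dict,
    they are appended at most once no matter how often this is called."""
    params = {"q": keyword}
    params.update(state.extra_params)
    return "https://www.google.com/search?" + urlencode(params)

def polite_delay():
    """Random 2-20 s sleep between queries, as described above."""
    time.sleep(random.uniform(2, 20))

states = {p: ProxyState(p) for p in PROXIES}
rotation = cycle(PROXIES)   # round-robin proxy rotation

proxy = next(rotation)
state = states[proxy]
# Suppose an earlier response returned "&gbv=1&sei=GXg-T5adE9PU4QTSv5CbCA":
state.extra_params = {"gbv": "1", "sei": "GXg-T5adE9PU4QTSv5CbCA"}
url = build_query("example keyword", state)
print(url)
```

Storing the extras as a dict rather than string-concatenating onto the URL is exactly what guards against the duplicate-append problem I'm asking about; if my real code concatenates the raw "&gbv=1&sei=..." string on every request, the parameter would stack up.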