Hi, I am trying to scrape with GScraper using 10 semi-dedicated proxies from BuyProxies. I am a bit new to this, and my main goal is gathering expired Web 2.0 domains, so I mostly use the footprint 'site:.domain '. My problem is that the proxies are dying really fast: all of them get banned within 5 minutes of scraping, at a maximum scraping speed of 25k per minute. Are there any settings I might be missing in GScraper that would let me scrape Google a bit longer? Also, is 25k per minute fine, or could I be more efficient than this?