Hello there everyone,

Yesterday I started a harvesting operation on my VPS running ScrapeBox. It's the first time I've harvested over 1 million URLs. The operation is still running and has now crossed 16 million. How did I do it? I harvested over 200,000 competitor URLs and then used the custom "link:" operator on each of them.

Now to the questions:

1. I was using around 200 public proxies with latency under 800 ms, and the harvesting has now slowed to a crawl, probably because the proxies are burnt. What can I do about it? As far as I know, I can't pause, load new proxies, and then resume. What do you suggest?

2. I noticed that harvesting 500,000 URLs from Google used 7 GB of bandwidth, while I've now crossed the 16 million mark with Yahoo and used only 15.5 GB so far. Why is there such a big difference between the number of URLs harvested and the bandwidth used?

3. What happens when ScrapeBox finishes? I've heard it can't harvest more than 1 million URLs per session, so how have I already crossed 16 million? How do you manage lists this large?

Thanks a lot!
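For anyone wondering how I built the keyword list: I just prefixed each harvested competitor URL with the "link:" operator and fed the result back into the harvester as custom footprints. A minimal sketch of that step (the helper name and filenames are just my own examples, not anything from ScrapeBox itself):

```python
def build_link_queries(urls):
    """Prefix each URL with the link: operator, skipping blanks and duplicates."""
    seen = set()
    queries = []
    for url in urls:
        url = url.strip()
        if url and url not in seen:
            seen.add(url)
            queries.append("link:" + url)
    return queries


# Example: a few harvested competitor URLs -> one "link:" query per unique URL.
sample = ["http://example.com/page", "http://example.com/page", "", "http://example.org/"]
queries = build_link_queries(sample)
print(queries)  # ['link:http://example.com/page', 'link:http://example.org/']
```

In practice you'd read the 200,000 harvested URLs from a text file, write one query per line to another file, and import that file as your keyword list.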