I'm trying to index-check 3k URLs against Google in ScrapeBox, but at around 1,200 URLs it starts throwing a ton of errors; before that point there are none. I'm using 10 private proxies with the index-check thread count set to 50. Has anyone found settings that work well for index-checking thousands of URLs at a time with minimal errors, e.g. a ratio of proxies to threads? No matter what I try, I can't avoid massive errors.
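To put rough numbers on the load (this is just plain Python to illustrate the arithmetic, nothing to do with ScrapeBox internals; the proxy and URL names are made up):

```python
import itertools
from collections import Counter

PROXIES = [f"proxy{i}.example:8080" for i in range(10)]  # hypothetical addresses
THREADS = 50  # the connection setting from the post

# Each proxy ends up carrying THREADS / len(PROXIES) simultaneous
# Google queries -- here 50 / 10 = 5 concurrent requests per proxy,
# which is likely aggressive enough to trip Google's rate limiting.
concurrent_per_proxy = THREADS / len(PROXIES)

# Round-robin 3,000 URLs across the proxies, the way a rotating checker would.
urls = [f"http://site{i}.example/page" for i in range(3000)]
rotation = itertools.cycle(PROXIES)
load = Counter(next(rotation) for _ in urls)

# Every proxy handles 300 of the 3,000 checks back to back, so the
# sustained per-proxy query rate, not the total URL count, is what
# Google is probably reacting to around the 1,200-URL mark.
print(concurrent_per_proxy, load.most_common(1)[0][1])
```

If that's roughly right, dropping the thread count so each proxy only carries one or two concurrent queries (e.g. 10-20 threads for 10 proxies) should spread the same 3k URLs over a rate Google tolerates, at the cost of a longer run.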