My usual process is:

1. Use Scrapebox to scrape a list of sites for one of my SEO tools.
2. Trim to root.
3. Remove duplicate domains.
4. Remove domains with bad keywords in them. (Steps 2-4 are roughly what the first sketch below does.)
5. I'd normally run an alive check here, but lately I haven't even done that. I've noticed the alive check only works with high-quality proxies in large numbers, which I don't have. When I run it without proxies, or with only a handful, I get thousands of "DEAD" results for sites that are actually "ALIVE", and it's killing my lists, so I stopped doing alive checks. (The second sketch below shows what I mean by an alive check.)
6. Plug the list into my software, do a run, and keep the successful submissions as my final list.

The problem is that I'm scraping lists of 100k+ URLs, and it takes my bots and tools forever to run through the whole list. Are there better ways to trim down the results before doing the test run? Am I using the alive check wrong, or does it actually work? Any help is greatly appreciated, thanks guys.
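For reference, here's roughly what I mean by steps 2-4, as a minimal Python sketch. The filenames and the bad-keyword list are just placeholders for whatever you actually use, and this is my own stand-in, not what Scrapebox does internally:

```python
# Rough equivalent of steps 2-4: trim to root, dedupe domains, keyword filter.
from urllib.parse import urlparse

BAD_KEYWORDS = {"casino", "pills", "warez"}  # example filters, swap in your own

def trim_to_root(url: str) -> str:
    """Reduce a full URL to scheme://domain/ (step 2)."""
    parsed = urlparse(url if "://" in url else "http://" + url)
    return f"{parsed.scheme}://{parsed.netloc}/"

seen = set()
kept = []
with open("scraped_urls.txt") as f:              # placeholder input file
    for line in f:
        url = line.strip()
        if not url:
            continue
        root = trim_to_root(url)
        domain = urlparse(root).netloc.lower()
        if domain in seen:                       # step 3: drop duplicate domains
            continue
        seen.add(domain)
        if any(kw in domain for kw in BAD_KEYWORDS):  # step 4: bad-keyword filter
            continue
        kept.append(root)

with open("filtered_urls.txt", "w") as f:        # placeholder output file
    f.write("\n".join(kept))
```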
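And for step 5, this is the kind of bare-bones alive check I have in mind when I ask if I'm doing it wrong: generous timeout, one retry, and a GET fallback since some servers reject HEAD. Again, just a sketch under my own assumptions (no proxies, 50 threads), not Scrapebox's actual checker:

```python
# Naive alive check: mark a root alive if it answers with a non-error status.
import concurrent.futures
import requests

def is_alive(root: str, timeout: float = 15.0) -> bool:
    for attempt in range(2):                 # one retry before calling it dead
        try:
            r = requests.head(root, timeout=timeout, allow_redirects=True)
            if r.status_code >= 400:         # some hosts 405 on HEAD; fall back to GET
                r = requests.get(root, timeout=timeout,
                                 allow_redirects=True, stream=True)
            return r.status_code < 400
        except requests.RequestException:
            continue                         # timeout / connection error: retry once
    return False

with open("filtered_urls.txt") as f:
    roots = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    results = dict(zip(roots, pool.map(is_alive, roots)))

alive = [u for u, ok in results.items() if ok]
print(f"{len(alive)}/{len(roots)} alive")
```

Even something this simple flags some slow-but-live sites as dead when the timeout is too tight, which is why I suspect my proxy-less Scrapebox runs were producing all those false "DEAD" results.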