Hi guys, I need advice on which software is best for checking external links across a huge domain database. Let's say I have a list of 2,000,000 domains. I need software to:

1. Crawl all of those domains.
2. Find all external links.
3. Save the data to a file.

For example, if one of the domains in the list is example.com, the software needs to:

1. Crawl example.com and all of its internal pages, 2 levels deep, e.g. example.com/contact, example.com/about-us ...
2. Find all external links on those pages, e.g. wordpress.com/how-to, google.com/news ...
3. Save all of these external link URLs to a file.
4. Move on to the next domain in the database.

(There's a rough sketch of the per-domain logic I have in mind at the end of this post.)

Can you recommend software for this job? I'm planning to buy ScrapeBox + the expired domain finder plugin, but I'm not sure whether that's what I actually need. Also, what time and resources would be needed to finish such a big task?
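To make the requirements concrete, here is a minimal Python sketch of the per-domain logic I'm describing. It's only an illustration, not tied to ScrapeBox or any particular tool; it assumes requests + BeautifulSoup, and the file names domains.txt / external_links.txt are just placeholders:

```python
# Rough sketch: crawl one domain 2 levels deep and collect its external links.
# Assumes requests + BeautifulSoup; file names are placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def external_links_for_domain(domain, max_depth=2):
    """Crawl `domain` up to `max_depth` levels and return the set of external link URLs."""
    seen_pages = set()
    external = set()
    frontier = [(f"http://{domain}/", 0)]

    while frontier:
        url, depth = frontier.pop()
        if url in seen_pages or depth > max_depth:
            continue
        seen_pages.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip dead or unreachable pages

        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            host = urlparse(link).netloc.lower()
            if host == domain or host.endswith("." + domain):
                # Internal page: follow it one level deeper.
                frontier.append((link, depth + 1))
            elif host:
                # External link: this is what I want saved to the file.
                external.add(link)
    return external


if __name__ == "__main__":
    # Loop over the domain list and write "domain <tab> external link" per line.
    with open("domains.txt") as src, open("external_links.txt", "w") as out:
        for line in src:
            domain = line.strip().lower()
            if not domain:
                continue
            for link in external_links_for_domain(domain):
                out.write(f"{domain}\t{link}\n")
```

At 2,000,000 domains, even a few pages per domain means many millions of requests, so whatever tool I end up with would need to run this kind of loop heavily in parallel, which is part of why I'm asking about time and resources.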