Hi, I need some help. I want to automatically check webpages and collect any invalid (broken) links from them, without any manual copy-and-paste. Say I have many URLs inside a .txt file:

http://www.pressreleasesite1.com/tech/0023432421
http://www.pressreleasesite1.com/tech/0023432422
http://www.pressreleasesite1.com/tech/0023432423
http://www.pressreleasesite1.com/tech/0023432424
http://www.pressreleasesite1.com/tech/0023432425
http://www.pressreleasesite1.com/tech/0023432426

etc., around 10,000 URLs or more in one .txt file.

The script/tool must check all the links inside those webpages (each page may have 5 outbound links (OBL) or more) and collect all the invalid links from those pages (only the invalid ones). The script should then automatically generate a new .txt file containing all the invalid links.

Any help is appreciated. I'm even willing to pay if someone can get this done. Or does anyone know a good tool that can do this task? I searched Google with no luck.

A big THANK YOU!
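A minimal sketch of what such a script could look like, using only the Python standard library. It is an assumption, not a finished tool: the file names `urls.txt` and `invalid_links.txt` are placeholders, it treats any 4xx/5xx status, timeout, or connection error as "invalid", and for 10,000+ pages you would want to add threading and rate limiting so you don't hammer the target sites.

```python
# Hypothetical link checker (sketch): reads page URLs from urls.txt,
# fetches each page, checks every outbound link on it, and writes the
# broken ones to invalid_links.txt. File names are assumptions.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def extract_links(html, base_url):
    """Return the page's outbound links as absolute http(s) URLs."""
    parser = LinkExtractor()
    parser.feed(html)
    # Resolve relative hrefs against the page URL; drop mailto:, javascript:, etc.
    absolute = (urljoin(base_url, href) for href in parser.hrefs)
    return [u for u in absolute if u.startswith(("http://", "https://"))]


def is_invalid(url, timeout=10):
    """True if the link returns a 4xx/5xx status, times out, or errors out."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-checker/0.1"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except Exception:
        return True


def main(infile="urls.txt", outfile="invalid_links.txt"):
    invalid = set()
    with open(infile) as f:
        pages = [line.strip() for line in f if line.strip()]
    for page in pages:
        try:
            with urllib.request.urlopen(page, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            invalid.add(page)  # the page itself is unreachable
            continue
        for link in extract_links(html, page):
            if is_invalid(link):
                invalid.add(link)
    with open(outfile, "w") as f:
        f.write("\n".join(sorted(invalid)))


if __name__ == "__main__":
    main()
```

Note that some servers reject HEAD requests, so a more robust version would retry with GET before declaring a link dead. Ready-made alternatives worth searching for include the open-source LinkChecker tool and the `wget --spider` option.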