This may seem like a stupid question, but I'm really tired. After using the ScrapeBox dup remove addon to merge my text files, I ended up with a 1.21 GB text file (full of duplicates, of course) and I can't do anything with it. The addon won't complete the dup remove process. It says "reading file", then "writing unique URLs", and stays like that forever. I noticed a 10 MB file it created on the desktop, but that's it; the file stays at 10 MB and never grows. I left ScrapeBox running for over 18 hours and it's the same situation: "writing unique urls..." and a 10 MB file. Is it supposed to finish like that or what?

Can anyone recommend software that can handle massive files and delete duplicate lines? TxtCollector combines multiple files but doesn't remove dups. I don't mind if the app takes a long time, as long as it gets the job done! Thanks!
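Edit: in case it helps anyone answer, I was told a short script could do this line by line without loading the whole file into memory. Something like this Python sketch (untested by me on the full 1.21 GB file; the file names are just placeholders):

```python
# Deduplicate lines from a large file by streaming it line by line.
# Only the set of unique lines is held in memory, not the whole file,
# so a 1.21 GB file full of duplicates should be manageable as long
# as the unique URLs themselves fit in RAM.
def dedupe_lines(src_path, dst_path):
    seen = set()
    with open(src_path, "r", encoding="utf-8", errors="replace") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            url = line.rstrip("\n")
            if url and url not in seen:   # skip blanks and repeats
                seen.add(url)
                dst.write(url + "\n")
    return len(seen)  # number of unique lines written

# Placeholder paths, not real files from my setup:
# dedupe_lines("merged_urls.txt", "unique_urls.txt")
```

No idea if this is the best way, but apparently it keeps the first occurrence of each URL and just skips the rest.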