I've got a 25 million line list I harvested using gScraper and I'm trying to delete the duplicate domains. Easier said than done with a 4GB file. I attempted to delete the duplicates with gScraper and it crashes after taking up 14GB of RAM lol. Tried Xrumer and it gives an error: too big. It's too big for Notepad++ to open fully, and I tried splitting it with gSplit but it's too big for that too. I need an easy way to split big files, so if anyone could recommend anything that would be great.
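In case it helps anyone with the same problem, here's a rough sketch of how this could be done from the command line instead of a GUI tool (the filename `urls.txt` is just a placeholder for your scraped list). `split` and `sort` both stream from disk, so they don't need to hold the whole 4GB in RAM the way gScraper does:

```shell
# Split the big list into 500MB chunks named chunk_aa, chunk_ab, ...
# (streams the file, so RAM usage stays tiny)
split -b 500m urls.txt chunk_

# Or skip splitting and dedupe by domain directly:
# awk pulls the host part out of each URL (field 3 when split on "/"),
# and sort -u keeps one copy of each domain, spilling to temp files
# on disk instead of RAM when the input is huge.
awk -F/ '{print $3}' urls.txt | sort -u > unique_domains.txt
```

On Windows you'd need something like Cygwin, Git Bash, or WSL to get these tools, but they handle multi-GB files without breaking a sweat.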