Say, for instance, I have a huge auto-approve list (100k+ URLs), and there are a lot of duplicate domains/URLs all in the same list. For example:

domaina.com/blue-widgets
domaina.com/red-widgets
domaina.com/green-widgets

Is it best practice to use ScrapeBox's "split duplicate domains" function to break these duplicates out into individual lists for posting? Or can I just keep these different URLs with a duplicate domain in the same list and post to them without regard?

I'm thinking it might be better to split the list, but then you run into the problem of really big lists with lots of duplicates (40+) of the same domain, and splitting those down gets really time-consuming. Any advice on this?
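To be clear about what I mean by the split: I'm assuming the feature works roughly like this sketch, where a domain that appears k times ends up contributing one URL to each of k sublists, so no sublist posts to the same domain twice. (This is just my mental model of it, not ScrapeBox's actual implementation.)

```python
from collections import defaultdict
from urllib.parse import urlparse

def split_duplicate_domains(urls):
    """Split a URL list into sublists so no sublist repeats a domain.

    Assumed behavior: a domain appearing k times contributes one URL
    to each of the first k sublists.
    """
    by_domain = defaultdict(list)
    for url in urls:
        # Handle bare domains without a scheme (e.g. "domaina.com/path")
        netloc = urlparse(url if "//" in url else "//" + url).netloc.lower()
        by_domain[netloc].append(url)

    sublists = []
    for domain_urls in by_domain.values():
        # Grow the number of sublists to cover the duplicate count
        while len(sublists) < len(domain_urls):
            sublists.append([])
        # Deal one URL per sublist, round-robin style
        for i, url in enumerate(domain_urls):
            sublists[i].append(url)
    return sublists
```

So a list with domaina.com appearing three times plus one domainb.com URL would come out as three sublists, with domainb.com tacked onto the first one, and every sublist domain-unique.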