Hi, I have a question regarding SB's remove-duplicate-domains functionality. If the scraper returns 10 URLs from the same domain and I remove duplicate domains, 9 of those URLs get deleted, which is a shame. Would it be possible to pick the 9 deleted links at random, so that next time (in 1-2 months) I can retry posting to a different URL on the same domain?

Example:
abc.com/1
abc.com/2
abc.com/3

I want to be able to post to all 3, one at a time, over a 3-month period. Thanks.
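To illustrate what I mean, here's a rough Python sketch of the behaviour I'd like (my own illustration, not anything SB does today): keep one randomly chosen URL per domain and save the rest in a retry pool for the next run. The URLs need a scheme like http:// for the domain parsing to work, and the file/function names are just placeholders.

```python
import random
from collections import defaultdict
from urllib.parse import urlparse

def split_by_domain(urls):
    """Group scraped URLs by their domain."""
    groups = defaultdict(list)
    for url in urls:
        groups[urlparse(url).netloc.lower()].append(url)
    return groups

def dedupe_with_retry_pool(urls):
    """Keep one random URL per domain; return (post_now, retry_later)."""
    post_now, retry_later = [], []
    for domain, group in split_by_domain(urls).items():
        random.shuffle(group)            # a different survivor each run
        post_now.append(group[0])        # post to this one now
        retry_later.extend(group[1:])    # save the rest for next month
    return post_now, retry_later

if __name__ == "__main__":
    scraped = ["http://abc.com/1", "http://abc.com/2", "http://abc.com/3"]
    post_now, retry_later = dedupe_with_retry_pool(scraped)
    print("post now:", post_now)
    print("retry later:", retry_later)
```

Run again next month on the retry pool and a different URL per domain survives each time, which is exactly the rotation I'm after.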