So one piece of advice I have heard here over and over is to not use GSA SER's own footprints. You should scrape your own list with ScrapeBox instead. I think this is valid advice, because you don't want sites that have been spammed to death by everybody else.

Now here's the problem: when I scrape my own list, how do I know GSA SER wouldn't also scrape the same sites? How do I exclude those?

To clarify, an example. Let's say I use ScrapeBox and I get this list:

site555.com/wordpress
black-shoes.com/express-engine
chiioueu.net

How do I know that the above list is not something GSA SER would scrape? How do I know the above list is not spammed to death by everybody else?
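The only way I could think to check this myself is to compare the harvest against footprint-style patterns. Just a sketch: the footprint list below is made up for illustration, it is NOT GSA SER's real list (that lives in its engine files and you'd have to export it yourself first).

```python
# Sketch: split a ScrapeBox harvest into URLs that match common
# platform footprints (the kind of patterns GSA SER also searches for)
# and URLs that don't. The footprint substrings here are illustrative
# examples only, not SER's actual footprint list.

SER_STYLE_FOOTPRINTS = [
    "/wordpress",        # URL-level hint for a common platform
    "/member/register",  # typical registration-page path
]

def overlaps_footprints(url, footprints):
    """True if the URL contains any known footprint substring."""
    u = url.lower()
    return any(fp in u for fp in footprints)

def split_harvest(urls, footprints):
    """Partition a harvest into (likely shared with everyone, probably unique)."""
    overlap, unique = [], []
    for url in urls:
        (overlap if overlaps_footprints(url, footprints) else unique).append(url)
    return overlap, unique

harvest = [
    "http://site555.com/wordpress",
    "http://black-shoes.com/express-engine",
    "http://chiioueu.net",
]
overlap, unique = split_harvest(harvest, SER_STYLE_FOOTPRINTS)
print("likely overlap:", overlap)
print("probably unique:", unique)
```

Of course this only catches URL-level footprints; a page can still match SER's text footprints ("Powered by WordPress" etc.) without showing anything in the URL, which is part of why I'm asking.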