Can anyone familiar with GSA SER think of a solution to this problem? Yesterday I created 30k+ links across 10 projects, but only got about 700 unique domains. Even after I changed the settings to not allow posting to the same URL more than once, I still didn't get many uniques. I have all search engines selected and 15,000 keywords per project to search for links. Thanks in advance!
There is an option "post to same domain more than once" - or something to that effect. Untick it and the job's a good 'un...
I suggest you keep using that option if you want more links. If you look at the results of most GSA blasts, you'll find there are far more links than unique domains. It's just more effective that way.
If you're not using any kind of list but rather posting directly as you scrape, it's quite possible that all 10 of your projects were scraping the same URLs. Do you have similar keywords in each project? You should probably look into getting a verified list rather than scraping, or at the very least have a few projects scraping / testing links and dumping verifieds into a single list, which you then feed to your blasting projects (see the sketch below).
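If you do merge verified URLs from several projects into one list, a quick Python sketch like this will dedupe by domain and show how many uniques you actually have. The file names here are just placeholders, not anything SER exports by default:

```python
from urllib.parse import urlparse

# Hypothetical input: one verified URL per line, merged from all projects.
with open("verified_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Reduce each URL to its host so repeat posts on the same
# domain collapse into a single entry.
domains = {urlparse(u).netloc.lower().removeprefix("www.") for u in urls}

print(f"{len(urls)} links -> {len(domains)} unique domains")

with open("unique_domains.txt", "w") as f:
    f.write("\n".join(sorted(domains)))
```

Running that over yesterday's 30k links would tell you straight away whether the 700-uniques number is a posting problem or a scraping problem.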
I second this. 15k keywords is not that many either. Make sure your keyword list is varied as well; I would say this is a no:
1. dog training
2. dog training in Boston
3. dog training guide
Those will bring a lot of repeating results, and hence little to no uniqueness in your scrape. Be creative and work on your keyword list so it draws a bigger batch of unique domains. Oh yeah, footprints too.
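A rough way to spot this kind of overlap before you scrape: group your keywords by their opening words and flag any oversized clusters. This is just a sketch with a made-up file name and an arbitrary threshold:

```python
from collections import Counter

# Hypothetical keyword file: one keyword phrase per line.
with open("keywords.txt") as f:
    keywords = [line.strip().lower() for line in f if line.strip()]

# Group by the first two words: "dog training", "dog training in boston",
# and "dog training guide" all share the head "dog training" and will
# pull largely the same search results.
heads = Counter(" ".join(k.split()[:2]) for k in keywords)

for head, count in heads.most_common(10):
    if count > 5:  # arbitrary threshold, tune to taste
        print(f"{count:4d} keywords share the head '{head}'")
```

If a handful of heads dominate the list, most of your 15k keywords are effectively duplicates as far as the search engines are concerned.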
Use a verified list for better results. How are your proxies for scraping, and do you use global lists as well?
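On the proxy question: if you want to sanity-check your scraping proxies outside of SER itself, something like this works, assuming a plain ip:port list in a file (the file name and target URL here are just examples):

```python
import requests

# Hypothetical proxy list: one "ip:port" per line.
with open("proxies.txt") as f:
    proxies = [line.strip() for line in f if line.strip()]

for p in proxies:
    try:
        r = requests.get(
            "https://www.google.com/search?q=test",
            proxies={"http": f"http://{p}", "https": f"http://{p}"},
            timeout=10,
        )
        status = "OK" if r.status_code == 200 else f"HTTP {r.status_code}"
    except requests.RequestException as e:
        status = f"dead ({type(e).__name__})"
    print(f"{p}: {status}")
```

Dead or banned proxies quietly kill your scrape results, which would also explain low unique-domain counts even with a big keyword list.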