Scrapebox: Add harvested URLs to blacklist so they are skipped next time?

Discussion in 'Black Hat SEO Tools' started by muchacho, May 27, 2011.

  1. muchacho

    muchacho Supreme Member

    Joined:
    May 14, 2009
    Messages:
    1,293
    Likes Received:
    187
    Location:
    Lancashire, England.
    I've been thinking of maybe adding all my harvested URLs to the blacklist to save time in the future, as SB won't harvest the same URL twice.

    Has anybody tried this and were there any drawbacks, apart from if you happen to lose your master AA list?
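The idea above boils down to merging each harvest into a plain-text blacklist (one URL per line) without duplicates. A minimal sketch of that merge step, assuming placeholder file contents rather than ScrapeBox's actual blacklist path or format:

```python
# Hypothetical sketch: fold freshly harvested URLs into an existing
# blacklist, skipping anything already present. The URL values here
# are made-up examples, not real harvest output.

def merge_into_blacklist(harvested, blacklist):
    """Return blacklist with new harvested URLs appended, deduplicated."""
    seen = set(blacklist)          # fast membership checks
    merged = list(blacklist)
    for url in harvested:
        if url not in seen:
            seen.add(url)
            merged.append(url)
    return merged

harvested = ["http://a.example/post1", "http://b.example/", "http://a.example/post1"]
blacklist = ["http://b.example/"]
print(merge_into_blacklist(harvested, blacklist))
```

In practice you would read and write the blacklist file around this; the point is just that dedup against a set keeps each merge linear in the size of the new harvest.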
     
  2. Flurbuff

    Flurbuff Regular Member

    Joined:
    Jun 17, 2010
    Messages:
    227
    Likes Received:
    94
    You'd maybe still want to periodically scrape previously scraped sites for new pages. Not sure what else beyond that.
     
  3. muchacho

    muchacho Supreme Member

    Joined:
    May 14, 2009
    Messages:
    1,293
    Likes Received:
    187
    Location:
    Lancashire, England.
    Yeah. I'm wondering whether memory/speed would be affected too, with SB having to quickly sift through potentially millions of URLs each time it harvests.
     
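    For what it's worth, the lookup itself shouldn't be the bottleneck if the list is held as a hash set: membership tests stay roughly constant-time regardless of size, and the real cost is RAM. A back-of-the-envelope sketch (this is a generic illustration, not how ScrapeBox is actually implemented internally):

    ```python
    # Assumption: the blacklist is an in-memory hash set of URL strings.
    # A million synthetic URLs, then 100k membership checks against them.
    import time

    blacklist = {f"http://site{i}.example/page" for i in range(1_000_000)}

    start = time.perf_counter()
    hits = sum(1 for i in range(100_000)
               if f"http://site{i}.example/page" in blacklist)
    elapsed = time.perf_counter() - start

    print(hits, f"{elapsed:.3f}s")
    ```

    On typical hardware those 100k lookups finish in a fraction of a second; a million stored URLs costs on the order of a few hundred MB in Python. So speed is likely fine, but memory is the thing to watch as the master list grows.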