
How To Keep Track Of Total Scraped List For GSA!

Discussion in 'Black Hat SEO' started by keith88, Dec 7, 2013.

  1. keith88

    keith88 Regular Member

    Joined:
    Sep 14, 2010
    Messages:
    287
    Likes Received:
    23
    Occupation:
    Internet Marketer
    Location:
    Home
    I'm sure this question is not really relevant to GSA, but maybe someone can help!

    (Important) Is there a way to only import new URLs into GSA when you scrape?

    For Instance....

    Say I have 100 URLs that I scraped and uploaded to GSA last week. I want to scrape again this week and only give GSA the new domains. So let's say I scraped 20 new domains; I'd now have 120 total, but I need GScraper to pull out just those 20 new ones and keep the master list of 120 to compare against my next scrape.

    That way GSA won't run through duplicate domains, wasting resources!

    Is there a way I can do that?
     
  2. keith88

    keith88 Regular Member

    Joined:
    Sep 14, 2010
    Messages:
    287
    Likes Received:
    23
    Occupation:
    Internet Marketer
    Location:
    Home
    Any input on this?
     
  3. Winternacht

    Winternacht Junior Member

    Joined:
    Jan 7, 2011
    Messages:
    113
    Likes Received:
    46
    Make a .txt of all URLs that are loaded in SER and use Scrapebox to compare your new scrapes against that list on the domain level. If the list is bigger than 1 million URLs, use the DupeRemove addon to merge all site lists from SER first.
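
    If you'd rather script the compare yourself instead of using Scrapebox, here is a minimal sketch in Python of the same idea (filter a new scrape on the domain level against a master list). The filenames and the naive domain extraction are just placeholder assumptions, not anything GSA or Scrapebox expects:

```python
# Minimal sketch: keep only URLs whose root domain is not already in a master list.
# Filenames are hypothetical examples; adjust to your own setup.
from urllib.parse import urlparse

MASTER_FILE = "master_domains.txt"   # one domain per line, grows with every run
NEW_SCRAPE = "new_scrape.txt"        # raw URLs from this week's scrape
OUTPUT_FILE = "urls_for_ser.txt"     # only new-domain URLs, ready to import into SER

def root_domain(url: str) -> str:
    """Naive domain extraction: hostname minus a leading 'www.'.
    URLs without a scheme (no http://) will come back empty and get skipped."""
    host = urlparse(url.strip()).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Load domains already imported into SER (empty set on the first run).
try:
    with open(MASTER_FILE) as f:
        seen = {line.strip().lower() for line in f if line.strip()}
except FileNotFoundError:
    seen = set()

new_urls = []
with open(NEW_SCRAPE) as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        domain = root_domain(url)
        if domain and domain not in seen:
            seen.add(domain)          # also dedupes within the new scrape itself
            new_urls.append(url)

# Write only the never-seen-before URLs for SER, then update the master list.
with open(OUTPUT_FILE, "w") as f:
    f.write("\n".join(new_urls) + ("\n" if new_urls else ""))
with open(MASTER_FILE, "w") as f:
    f.write("\n".join(sorted(seen)) + ("\n" if seen else ""))

print(f"{len(new_urls)} new-domain URLs written to {OUTPUT_FILE}")
```

    Run it after each scrape: anything whose domain is already in the master file gets filtered out, and the master file is updated so the next week's scrape is compared against the full 120 (or however many) domains.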
     
    • Thanks x 1