Question about using GSA to scrape links itself

Discussion in 'Black Hat SEO' started by mynameis940, Jul 7, 2014.

  1. mynameis940

    mynameis940 Power Member

    May 1, 2011
    I have purchased a big list of keywords and I'd like to use it in GSA to scrape for links. I have imported 300,000 keywords for scraping, and I was wondering: is this a good idea? Will it slow GSA down, or are there any other downsides? What is the recommended number of keywords to use for scraping with GSA?

    Thanks in advance.
  2. bartosimpsonio

    bartosimpsonio Jr. VIP Premium Member

    Mar 21, 2013
    In my experience, the more keywords you have, the longer your project will run before it goes dry. GSA caps the number of simultaneous threads, so the number of keywords won't slow you down.

    Now, if you add only irrelevant and unrelated keywords, that may slow things down in the sense that GSA will be busy scraping useless data.
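    Pruning the list before importing it helps with that. A minimal sketch in Python (none of this is GSA-specific; the filename and the lowercasing choice are assumptions, not anything from the thread):

```python
def dedupe_keywords(lines):
    """Lowercase, strip, and deduplicate keywords, preserving order."""
    seen = set()
    out = []
    for line in lines:
        kw = line.strip().lower()
        if kw and kw not in seen:  # skip blank lines and repeats
            out.append(kw)
            seen.add(kw)
    return out

# Usage sketch ("keywords.txt" is a placeholder filename):
# with open("keywords.txt", encoding="utf-8", errors="ignore") as f:
#     cleaned = dedupe_keywords(f)
```

    Purchased lists often overlap heavily, so this alone can shrink a 300k-line file considerably before it ever reaches GSA.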
  3. divok

    divok Senior Member

    Jul 21, 2010
    SER mostly scrapes Google when it has no more targets to post to. We have no control whatsoever over this. So by the time SER starts scraping, all your public proxies might already be dead, making the scraping useless. It would be like firing blanks.

    You could also start a dedicated scraper in GSA via Options > Advanced > Tools, but again we don't know how many threads it will use; if too few threads are allocated to it, we may be wasting precious proxies.

    You can test this yourself by writing the logs to a file; you will find "proxy banned by Google" messages very often (I forget the exact message, I haven't seen it in a long time).
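    If you do write the log to a file, a rough way to check is to count proxy-failure lines. A minimal sketch in Python; since the exact message text is unknown (as noted above), the pattern here is an assumption, not GSA's actual wording:

```python
import re

# Assumed pattern for proxy-ban log lines; GSA's real message may differ.
BAN_PATTERN = re.compile(r"proxy.*(banned|blocked)", re.IGNORECASE)

def count_ban_lines(lines):
    """Return how many log lines look like a proxy-ban message."""
    return sum(1 for line in lines if BAN_PATTERN.search(line))

# Usage sketch ("ser_log.txt" is a placeholder path):
# with open("ser_log.txt", encoding="utf-8", errors="ignore") as f:
#     print(count_ban_lines(f))
```

    A high count early in a run is a sign the public proxies were burned before the scraping even started.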