Need help: blocked by Google while SERP scraping.

Discussion in 'Black Hat SEO' started by Nemirion, Mar 24, 2014.

  1. Nemirion

    Nemirion Newbie

    Joined:
    Mar 24, 2014
    Messages:
    1
    Likes Received:
    0
    Hello everyone!

    I have a question concerning Google SERP-scraping.

    I have multiple class-D networks. Over the last three months, I was able to fetch the SERPs without being blocked by Google a single time, because the scraping was very slow.
    A few days ago, all class-D networks were blocked simultaneously within a few minutes, and I was served a CAPTCHA page instead of the results. Even after waiting and doing nothing for 24 hours, I can now only fetch a single page per IP before being blocked. When I try to fetch a second page, I am asked to enter a CAPTCHA again.
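
    For context, my throttling logic looks roughly like this (a simplified sketch; the proxy addresses, the 60-second delay, and the requests-based fetch are stand-ins for my actual setup):

    Code:
    import time
    import requests

    # Stand-ins for my real gateway IPs
    PROXIES = [
        "http://10.0.0.1:8080",
        "http://10.0.0.2:8080",
    ]

    DELAY_PER_IP = 60  # seconds between two fetches through the same IP (illustrative)

    last_used = {p: 0.0 for p in PROXIES}

    def fetch_serp(query, proxy):
        resp = requests.get(
            "https://www.google.com/search",
            params={"q": query},
            proxies={"http": proxy, "https": proxy},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        )
        # Google redirects flagged IPs to its /sorry/ CAPTCHA page
        if "/sorry/" in resp.url or resp.status_code == 503:
            raise RuntimeError("Blocked on " + proxy)
        return resp.text

    def fetch_throttled(query):
        # Use the IP that has rested the longest and wait out its cooldown
        proxy = min(PROXIES, key=lambda p: last_used[p])
        wait = DELAY_PER_IP - (time.time() - last_used[proxy])
        if wait > 0:
            time.sleep(wait)
        last_used[proxy] = time.time()
        return fetch_serp(query, proxy)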

    1. Do you have any clue why this happened, or suggestions on how to solve this problem?

    2. What waiting time per IP do you use between fetching two Google pages in order not to get blocked?

    3. Do you think it helps to use HTTPS to connect to Google?

    Thank you very much in advance!
     
    Last edited: Mar 24, 2014
  2. JasonS

    JasonS Jr. VIP

    Joined:
    Sep 15, 2012
    Messages:
    2,988
    Likes Received:
    914
    I'm not sure about the timing, as I always use public proxies for scraping and semi-dedicated proxies for posting.
    The benefit of using public proxies for scraping is that you can swap them out anytime they get blocked by Google.
    Use one of the proxy scraping and testing tools that are available for free on the web.
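    Something like this is enough to filter a scraped list down to the working ones (a rough sketch; the thread count and timeout are just examples):

    Code:
    import concurrent.futures
    import requests

    def is_alive(proxy, timeout=5):
        # A proxy counts as usable if it reaches Google without an immediate block
        try:
            resp = requests.get(
                "https://www.google.com/search",
                params={"q": "test"},
                proxies={"http": "http://" + proxy, "https": "http://" + proxy},
                timeout=timeout,
            )
            return resp.status_code == 200 and "/sorry/" not in resp.url
        except requests.RequestException:
            return False

    def filter_proxies(candidates):
        # Test candidates in parallel and keep only the live ones
        with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
            results = list(pool.map(is_alive, candidates))
        return [p for p, ok in zip(candidates, results) if ok]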
     
  3. davids355

    davids355 Jr. VIP Premium Member

    Joined:
    Apr 25, 2011
    Messages:
    8,802
    Likes Received:
    6,371
    Just use proxies.
     
  4. olfactorylobbez

    olfactorylobbez Newbie

    Joined:
    Mar 13, 2014
    Messages:
    12
    Likes Received:
    0
    What are you using to scrape? I'm not sure I fully understand your setup, but have you tried using semi-dedicated or dedicated private proxies?