Google detects footprints and bans proxies. How do you scrape these days?

Penumbra

Senior Member
Apr 23, 2014
I tried first with GSA SER footprints, then tried some custom ones, but ScrapeBox won't scrape at all; Google detects it and blocks it. I tried elite Google-passed proxies, then 250 backconnect proxies that rotate every 10 minutes, but it's still the same. What are you doing to avoid this? The problem is without doubt the footprints.
 
You may try lowering your scraping threads or increasing the number of proxies per thread. I haven't used private proxies for scraping in a while, but I've seen people recommend several proxies per thread. Some proxy providers even state that their proxies aren't for scraping anymore, so check the small print on their ToS page.
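For what it's worth, here is a rough Python sketch of the "several proxies per thread" idea. It's not a ScrapeBox setting, just plain Python with the requests library; the proxy addresses, user agent, and block check are placeholders/assumptions:

```python
# Rotate through a proxy pool so each request goes out through a different IP.
import itertools

import requests

PROXIES = [
    "http://user:pass@203.0.113.10:8080",  # placeholder proxy addresses
    "http://user:pass@203.0.113.11:8080",
    "http://user:pass@203.0.113.12:8080",
]
proxy_pool = itertools.cycle(PROXIES)  # each request moves to the next proxy

HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

def fetch(url):
    """Fetch one URL through the next proxy in the pool."""
    proxy = next(proxy_pool)
    resp = requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        headers=HEADERS,
        timeout=30,
    )
    # A temporary Google block typically shows up as a 429 or a /sorry/ redirect.
    if resp.status_code == 429 or "/sorry/" in resp.url:
        return None  # treat this proxy as burned for now and skip the result
    return resp.text
```

The more proxies you cycle per thread, the fewer requests any single IP makes per minute, which is the whole point of the ratio.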
 
Scraping sure is tougher than it used to be, so I'd love to hear feedback on what others are doing that works.

Personally, I've got a few different VPSes with my delays set very high, and I run them nearly around the clock without problems. The main issue is that it doesn't produce quick results.
 
Google blocks IPs temporarily, and only if it detects spammy activity, so maybe you can try virgin, fully dedicated proxies.
 
That is really bad news, as I wanted to ask about good proxy footprints for Ahrefs for 2015 :/ (I have to start doing some SEO again after a 3-year break).

What kind of waiting time are we talking about (between loading the next result page for an IP)? 10-90 seconds?
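Something like this is what I have in mind, as a rough Python sketch rather than an actual ScrapeBox config; the query, page count, user agent, and wait range are made-up examples:

```python
# Randomized 10-90 second pause before loading each new result page.
import random
import time
import urllib.parse

import requests

HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

def scrape_serp_pages(query, pages=5, min_wait=10, max_wait=90):
    pages_html = []
    for page in range(pages):
        url = "https://www.google.com/search?" + urllib.parse.urlencode(
            {"q": query, "start": page * 10}  # 10 results per SERP page
        )
        resp = requests.get(url, headers=HEADERS, timeout=30)
        pages_html.append(resp.text)
        # Long randomized pause before requesting the next result page
        time.sleep(random.uniform(min_wait, max_wait))
    return pages_html
```

Is that roughly the range people are using per IP, or do you go even slower?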
 