I've noticed that Google has been more aggressive with bans lately. Am I being paranoid, or have you seen it too?
Well, you should use private proxies instead; public ones usually get banned fast. However, getting a "Google clean" proxy is hard these days... :/
lol, mazgalici is offering proxies so he speaks from experience
and yes mate, they are getting banned faster indeed, it sucks!
I've been port scanning public proxies for 6 years,
3 of them on BHW, and I can still get approx 3k Google-passed
proxies a day to support 600 subs on my service. My point?
Public scraped proxies suck, because everyone else can scrape the same
proxies - that's why they get banned so fast: you're all pulling from the same pool.
Port scanning for public proxies means searching out piles of unlisted public proxies
that Google isn't tracking. If it didn't work, explain how, on a service of 600+ users
running the public proxies I port scan, I've never issued a single refund on BHW since
I came here in 2008. That's because it's all about your sources.
I'll put it another way: go grab 5k forum-listed public proxies and test them.
You might get 500 or so Google passed. Test them again in about 2-3 hours and
most will be dead, because the same proxies you scraped, tested, and used
are being scraped, tested, and used by everyone else - any list you can scrape, anyone else can,
which is why they die and get banned so quickly.
I can post my 600 subs a list of, say, 1.5k Google-passed port-scanned public proxies,
and 10 hours later 300-400 will still be live. Why? Because you can't scrape port-scanned public proxies off a list.
So port-scanned public proxies will outlast URL-scraped public proxies 3-to-1 every time.
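For anyone who wants to reproduce that decay test, here's a rough sketch of checking a list for "Google passed" proxies; run it once, wait a couple of hours, and run it again on the survivors. The test URL, timeout, and pass criteria here are my assumptions, not proxygo's actual setup.

```python
# Hedged sketch: a proxy counts as "Google passed" if a search request
# through it returns a normal 200 page rather than a 403/captcha page.
import concurrent.futures
import requests

TEST_URL = "https://www.google.com/search?q=test"  # assumed test query

def is_google_passed(proxy, timeout=10):
    """proxy is an 'ip:port' string; returns True if Google answers."""
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(TEST_URL, proxies=proxies, timeout=timeout,
                         headers={"User-Agent": "Mozilla/5.0"})
        # a redirect to /sorry/ is Google's captcha wall for flagged IPs
        return r.status_code == 200 and "/sorry/" not in r.url
    except requests.RequestException:
        return False

def filter_google_passed(proxy_list, workers=50):
    """Test a list concurrently, keep only live unbanned proxies."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        ok = list(ex.map(is_google_passed, proxy_list))
    return [p for p, passed in zip(proxy_list, ok) if passed]
```

Running `filter_google_passed` on the same 5k list at hour 0 and again at hour 3 gives you exactly the survival ratio being described.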
It affects lots of proxy providers. I even got an email from one of our proxy providers today:

"A number of users have been reporting a higher than normal rate of 403 errors from Google. A 403 error means Google thinks you are a bot. This is a normal error to receive, but lately our proxies have been receiving it much more often. The issue is widespread across our IPs. We are aware of it and are taking measures to fix it. Please trust that we are doing the best we can (ordering new servers and IPs takes time).

It seems that Google recently increased its enforcement against scraping and changed the limits somehow, so behavior that previously caused no problems is now getting individual IP addresses, sometimes entire blocks of IP addresses, banned. It may be wise to cut down your rate of scraping and increase the velocity with which you switch IPs. It may also be worth pinging Scrapebox support about this and asking them to change the request rates.

Unfortunately, this problem has affected multiple proxy providers, so it definitely seems like a change Google made. We are doing as much as we can on our end: we are working to decrease the number of users per proxy, limit overuse by individual users, and increase speed.

Until then, thank you very much for your patience. Please rest assured we are aware of the issue and working on it."

Any ideas about new request patterns that reduce detection?
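For what it's worth, the "cut your rate and rotate faster" advice from that email could look something like this in practice. It's a minimal sketch; the 60-second per-IP gap is an arbitrary guess, not a limit anyone has confirmed.

```python
# Hedged sketch: round-robin a proxy pool while capping how often any
# single IP is allowed to hit Google.
import itertools
import time

class RotatingThrottle:
    def __init__(self, proxies, min_gap=60.0):
        self.pool = itertools.cycle(proxies)
        self.min_gap = min_gap    # seconds between uses of one IP (assumed)
        self.last_used = {}       # proxy -> timestamp of last request

    def next_proxy(self):
        proxy = next(self.pool)
        wait = self.min_gap - (time.time() - self.last_used.get(proxy, 0.0))
        if wait > 0:
            time.sleep(wait)      # back off instead of burning the IP
        self.last_used[proxy] = time.time()
        return proxy
```

The bigger the pool, the less `next_proxy` ever has to sleep, which is the whole point of switching IPs with more velocity.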
I'm working with "YoTub" and I only use private proxies from different providers, and they started to ban some of my accounts. At first I thought it was the accounts but after I've used a different ones same thing happened. Then I've found a pattern, when I was using the proxies and I reached the 4th IP from the same class all the accounts were banned, if I only use 3 IP's from the same ip class the accounts are ok. Did anyone else experienced this ?
Exactly, I saw it as well.
Are you by chance using inurl: in your queries? The ban rate depends on the keyword you are scraping. Sometimes my proxies get me nothing in Scrapebox, so I change the keyword and voila, it works.
I've noticed it too (more 403 error messages), from my specialized SEO tools as well as the public proxies. It means we need to spread the load across proxies (private/public) more aggressively to minimize the 403 errors.
No problems here, still getting the same amount of Google-passed proxies.
I always get 1k in the morning and 1.5k in the evening; no change here.
The real issue we are having RIGHT NOW is that Google is banning IPs much quicker than before, as stated by the OP, which I saw in the testing we ran with my specialized tools.
We are studying ways to get around the Google 403 ban by experimenting with different access methods: user agents, referrers, frequent timed IP rotations, etc.
Until then, proxy users will need far more proxies (10-100x more) in order to access Google. That means more $$ for users to shell out to proxy providers.
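As a rough illustration of that kind of experiment, here's what varying the user agent and referrer per request might look like on top of IP rotation. The header values are illustrative placeholders; none of them is known to actually avoid the 403.

```python
# Hedged sketch: randomize User-Agent and Referer on each request.
import random
import requests

USER_AGENTS = [  # illustrative strings, not a vetted list
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_4) AppleWebKit/536.25",
    "Mozilla/5.0 (X11; Linux x86_64; rv:14.0) Gecko/20100101 Firefox/14.0",
]
REFERRERS = ["https://www.google.com/", "https://www.bing.com/", ""]

def fetch_via(url, proxy, timeout=10):
    """Fetch url through proxy with randomized browser-like headers."""
    headers = {
        "User-Agent": random.choice(USER_AGENTS),
        "Referer": random.choice(REFERRERS),
    }
    return requests.get(url, headers=headers, timeout=timeout,
                        proxies={"http": f"http://{proxy}",
                                 "https": f"http://{proxy}"})
```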
Also, maybe consider mixing things up a little: try getting SOCKS proxies
for Scrapebox as well as HTTP ones - they're less used. I did a little test on that
this morning. Even without port scanning - because I don't port scan SOCKS,
only HTTP proxies - I was able to throw in and get working, without really trying,
around 200 SOCKS proxies in Scrapebox; approx 190 were anonymous
and around 50 Google passed.
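If you want to test SOCKS proxies outside Scrapebox, the same Google-passed check works with a `socks5://` proxy URL. A sketch, assuming the PySocks extra is installed (`pip install requests[socks]`):

```python
# Hedged sketch: the HTTP check from earlier, pointed at a SOCKS proxy.
import requests

def is_socks_google_passed(proxy, timeout=10):
    """proxy is an 'ip:port' string for a SOCKS5 proxy."""
    proxies = {"http": f"socks5://{proxy}", "https": f"socks5://{proxy}"}
    try:
        r = requests.get("https://www.google.com/search?q=test",
                         proxies=proxies, timeout=timeout,
                         headers={"User-Agent": "Mozilla/5.0"})
        return r.status_code == 200 and "/sorry/" not in r.url
    except requests.RequestException:
        return False
```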
Since last week Google has been aggressively monitoring for proxy scraping. You will certainly need a lot more subnets + IPs in order to scrape from here on...
Well, since I provide proxies to 600+ BHW users
(donors/VIPs/execs/3 mods), not one person has contacted
me to say their results have fallen or that the proxies
don't last as long... so nothing to report here.
I'm sorry to see you're having a few issues with Google and your
proxies, and I hope you can resolve it, but as of yet, until any of
my subs say otherwise, I can only assume everything is OK.
To find out, I posted a message to my subs, and I'll let you know
what they say, as my service is also targeted at scraping.
[3:41 PM] tony: Quick question, guys. I see a few proxy sellers on BHW
saying that Google is becoming more aggressive at blocking the proxies
they provide for users to scrape with. Can anyone confirm whether the proxies
I'm providing are working better/worse, or is everything working the same as usual?
I have been reading your various explanations and I believe you lol - port scanning to dig up unlisted proxies. I have been wasting more time on finding proxies than on my URL research lol. I would absolutely purchase your service and use NiX, but hell, I can't even afford a single domain as of yet :/
About the Google banning, I did notice something, yes. Forum-posted scrapes from fellow members also seem to drain lightning quick. Proxies are the basis of SEO research.
I have been getting about 40% success from my custom lists where it used to be 60%. But they do still work for a few hours.
We ran over 400 tests in the last 48 hours to determine the pattern. Yes, G00gle has stronger bot/proxy detection right now based on various access types (search, special commands, etc.).
Results (Cliff Notes version):
- My "test" private proxies are PERM-blocked by G00gle after using the special commands 3 times or more. When it happens (failed captcha entry), that IP is BLOCKED for good, with only a link to contact G00gle if you want to request an unblock (which will be very difficult). It happens if you use the same subnet IP block for the test. Fortunately, these test IPs I have are junk anyway.
- My test "G00gle" proxies from proxylist.co/webproxylist.com have less chance of being blocked when running the G00gle commands/basic searches, due to the real mix of subnets (almost all of them are unique). In other words, better results, thanks to the large number of tested G00gle proxies available from my system.
It looks like the private proxy providers will need to raise the $$$ they charge customers for private proxies, require customers to order a large number of proxies, and change the way they operate in order to protect their IPv4 private proxies. If a customer causes these IPs to be banned by G00gle, the customer (not the private proxy provider) should pay the replacement costs. If the private proxy providers don't do this, they will be digging their own grave big time due to dead/useless IPs.
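If the 3-strikes observation holds, one defensive move is to budget advanced-operator queries per IP and rotate before hitting the limit. A sketch built on the numbers from these tests (the threshold is the poster's finding, not an official limit):

```python
# Hedged sketch: cap "special command" queries per proxy at 2, staying
# below the observed perm-block threshold of 3.
from collections import Counter

SPECIAL_OPERATORS = ("inurl:", "intitle:", "intext:", "site:")

class OperatorBudget:
    def __init__(self, max_special=2):
        self.max_special = max_special
        self.counts = Counter()          # proxy -> special queries sent

    def allow(self, proxy, query):
        """True if this query may be sent through this proxy."""
        if not any(op in query for op in SPECIAL_OPERATORS):
            return True                  # plain searches seem less risky
        if self.counts[proxy] >= self.max_special:
            return False                 # retire this IP, pick a fresh one
        self.counts[proxy] += 1
        return True
```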
Google will always change things - it's just up to us
sellers to keep pace with them and adapt.
I guess this was just a start.
I will testify that yes, it is much more aggressive now. I use the freshest proxies (not port-scanned premiums) and hell, within minutes they're damn gone. It has gotten so bad it's a joke lol. It's depressing, because this is what drives our research and makes it effective; without it we'll be bogged down.
Well, I spoke to some of my 500 subs - bear in mind I base my reply
on scraping purposes - and the answers I got back, rounded up, were as follows.
Bearing in mind that I promise my subs a minimum of
400 Google-passed proxies per user per day, plus a separate list of proxies
for scraping Yahoo, this is the answer I got:
wombat: it IS affecting the amount I get... but the amount is still great.
Clarke: you're doing a great job mate
loopline: yeah I get a good proxy pass rate still, I even grab proxies
from your list a day old or so and test them. Still get good ones from those.
Put simply, they're still getting their minimum 400 Google-passed proxies per day,
every day, but most average more. Even with the 403s, I haven't missed a target
in 3 years. As of now I'm still averaging 2.5k Google-passed proxies a day
and around 1-1.5k Google-failed but anonymous ones that will scrape Yahoo.
If it stays like this, I can cope.
private proxies are the best option to avoid this
Yes, I've noticed the 403 (forbidden) errors for a few days already; all I can do is move on to the next good proxy. And if a subnet gets banned because of only 3-4 used proxies, this is going to be hard.
All I know is, yes, the 403s have risen,
but the results I get haven't changed that much.
I still get just under 1k Google-passed proxies in the morning
and approx 1.5k in the evening, every day. But these aren't public
scraped proxies which everyone can use; mine are port-scanned
public Google-passed proxies, so fewer people have them,
so the change I've noticed is minimal.
That also doesn't include over 1k Google-failed but anonymous-passed
proxies that scrape Yahoo fine.
We currently use well over 1,000 private proxies, but with the recent changes I would be happy to try your public proxies as well to get more requests through.
My feeling is that this is related to subnet-wide blocking, so with private proxies a lot also depends on the other people using the same subnet.
Please send me more information.
Of the 3 adverts in my sig, only 1 is mine - the proxy part.
The other 2 services in my sig belong to 2 other donor members.
I have no affiliation with the word-list advert or the 500-followers
advert, other than that I allow them to advertise in my sig.
I need to scrape Google search results: 20,000 keywords and the first 50 results (5 pages) for each.
A question for the experts (like proxygo): how many public/private proxies do I need to achieve this?
It should be done within 1 day, so I have to make about 1 URL request per second.
It would be awesome if this could keep running 24/7 at the same rate (100,000 requests per day).
Does anybody have a recommendation on how many private/public proxies I have to use,
and which strategy to take so that no proxies get banned? I think 100,000 requests per day isn't
that much, or am I wrong about this?
Thanks in advance
I've had a bunch of google bans recently too =(
So for 100,000 requests in 1 day, you would need about 70 proxies from several different class C nets.
However, not all proxies are equal, and Google has increased blocking based on requests from the same class C net. If you use shared proxies that have a lot of other users, then obviously you will need more to prevent blocking. If you have IPs from a class C where a lot of other users get banned, then you need more too.
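The 70-proxy figure checks out as back-of-envelope arithmetic if you assume each IP can safely make about one Google request per minute (an assumed safe rate, not a published limit):

```python
# 20,000 keywords x 5 result pages = 100,000 requests per day.
# At an assumed safe rate of 1 request/minute, one proxy can handle
# 24 * 60 = 1,440 requests per day.
import math

requests_per_day = 20_000 * 5
per_proxy_per_day = 24 * 60
print(math.ceil(requests_per_day / per_proxy_per_day))  # -> 70
```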
We recently had the same experience; Google is fiercely identifying and blocking them these days...
Well, if this is what they call aggressive Google banning,
long may it continue. Onwards and upwards.
Google Security in place lol
I have come to learn that they have all the HideMyAss proxies and IPs, and those are not safe any more. I am using free proxy lists that are fresh and updated often.
The 403 Google problem is widespread across every proxy
service at the moment - all you can do is pay your money
and make your choice.
I noticed the same. It's clear we all have the same issues.
Who has the solution?
I'm still in love with this "aggressive Google block", man, it's so bad
I can hardly work - what will I do, rofl
[proxy scan results posted as screenshots: 10 PM GMT and 3 PM GMT]
man, this Google banning is getting worse...
not too shabby for 3am