Hmm, I wonder why everyone says you can't get many public proxies for ScrapeBox. Today, for the first time in a month or so, I picked up 560 Google-passed proxies. Looks like things are on the up.
I am getting proxies from some so-called "private" sources that are quite hidden; you have to search a lot to find them, and some aren't even in Google's index (search other SEs like Yandex, Bing and Yahoo).
By the way, Russian sources are gems.
Here is how I usually find other sources: I take the good URLs, then search each URL in quotes on the most-used search engines. This way I end up finding more URLs (even some private ones).
Then I test the URLs one by one; if a URL gives me fewer than 10 proxies, I delete it from the list.
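If you wanted to script that last check outside ScrapeBox, a rough Python sketch could look like the one below. It assumes the candidate source URLs are already collected (e.g. from the quoted searches); the example URL and the 10-proxy cutoff are just placeholders.

```python
# Rough sketch: keep only source URLs that yield at least 10 ip:port proxies.
# Candidate URLs are assumed to be gathered already (e.g. from quoted searches).
import re
import urllib.request

PROXY_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}:\d{2,5}\b")

def count_proxies(url: str) -> int:
    """Fetch a page and count how many distinct ip:port strings appear on it."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="ignore")
    except Exception:
        return 0  # an unreachable source counts as zero proxies
    return len(set(PROXY_RE.findall(html)))

candidate_sources = ["http://example.com/proxy-list"]  # placeholder URLs
good_sources = [u for u in candidate_sources if count_proxies(u) >= 10]
print(good_sources)
```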
Me, I take one working proxy and load it into a tool. The tool then googles for all the sites that list that one proxy, and leeches proxies from all the URLs it found. Then I do the same with 100 proxies.
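A minimal sketch of that "leech through one working proxy" idea is below. It only shows routing requests through a single known-good proxy and scraping ip:port pairs from result pages; the proxy and URLs are placeholders, and the actual Google search step is left out (doing it programmatically would need an API).

```python
# Rough sketch: fetch a set of result pages through one working proxy
# and leech every ip:port pair found on them.
import re
import urllib.request

PROXY_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}:\d{2,5}\b")

working_proxy = "1.2.3.4:8080"          # placeholder: one proxy known to work
result_urls = ["http://example.com/a"]  # placeholder: pages found via search

# Route both HTTP and HTTPS traffic through the working proxy.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": f"http://{working_proxy}",
                                 "https": f"http://{working_proxy}"})
)

leeched = set()
for url in result_urls:
    try:
        html = opener.open(url, timeout=10).read().decode("utf-8", errors="ignore")
    except Exception:
        continue  # skip pages the proxy can't reach
    leeched.update(PROXY_RE.findall(html))

print(len(leeched), "proxies leeched")
```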
I won't complain; 3 months ago I was getting 1,500-2k per scrape, but only for subscribers.
symss
I checked that list from that link. Some advice if you use it: remove everything on port 9415. 17k of the port 9415 proxies were all dead, and most of the proxies that did work (560) were on ports 80, 8080 and 3128. Do that and you will save 30-40 minutes on the test.
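For anyone who wants to pre-filter the file before testing, here is a small sketch. It assumes a plain text list with one ip:port per line; the file names are placeholders, and the kept ports are just the ones reported above.

```python
# Rough sketch: pre-filter a plain ip:port list before testing,
# dropping port 9415 and keeping the ports that actually worked
# in that list (80, 8080, 3128). File names are placeholders.
KEEP_PORTS = {"80", "8080", "3128"}

with open("proxies.txt") as src, open("proxies_filtered.txt", "w") as dst:
    for line in src:
        line = line.strip()
        if ":" not in line:
            continue  # skip blank or malformed lines
        port = line.rsplit(":", 1)[1]
        if port in KEEP_PORTS:
            dst.write(line + "\n")
```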
Totally incorrect, which shows your lack of ScrapeBox knowledge.
Public proxies are used in ScrapeBox for harvesting for two reasons:
1 - They can be replenished several times a day in their hundreds.
2 - Public proxies are used for URL scraping because if you take 10 private proxies and harvest with them, Google will block them after a few hours. Then what do you do? You can't harvest, and now you can't post until Google releases them. With public proxies you can collect fresh ones several times a day, while your private proxies only post.
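Purely as an illustration of that split (not how ScrapeBox does it internally), a sketch might keep a disposable, refreshable pool of public proxies for harvesting and a fixed set of private proxies reserved for posting. Everything here is a placeholder.

```python
# Rough sketch of the split described above: refreshable public proxies
# for harvesting, fixed private proxies for posting only.
import random

def load_fresh_public_proxies() -> list:
    """Placeholder: re-scrape and re-test public proxies several times a day."""
    return ["1.2.3.4:8080", "5.6.7.8:3128"]

private_proxies = ["10.0.0.1:8000"]   # placeholder: paid proxies, posting only
public_pool = load_fresh_public_proxies()

def proxy_for(task: str) -> str:
    # Harvesting burns through public proxies; posting sticks to privates.
    if task == "harvest":
        return random.choice(public_pool)
    return random.choice(private_proxies)
```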