SaaS / Cloud-Based Google Footprint Scraper: 288,000 URLs / Hour

DDuce

Hi everyone,

Building web-based applications is what I do, and auto-scaling scrapers that operate in the cloud are a bit of my thing :) Through this forum I was introduced to the concept of footprint scraping to feed into software like Xrumer and GSA, and maybe others.

Stuff like this: http://www.blackhatworld.com/blackh...sa-ud-senuke-scrapebox-footprints-1300-a.html

You pipe in the footprints and keywords, and off the scraper goes. I've already got a working Google scraper, and it wouldn't be too much work to adapt it into a cloud-based footprint scraper. By my early estimates it should be able to hit 288,000 URLs / hour. As a user you'd enter keyword lists and choose from a set of predefined footprints, or input your own. The list of URLs would be available for download some time afterwards.
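To make the idea concrete, here's a minimal sketch of how I picture the input side working: every footprint gets paired with every keyword to form one Google query, and the 288,000 URLs / hour figure assumes each query page returns up to 100 results. The footprints, keywords, and the 100-results-per-query assumption below are just illustrative, not final numbers.

```python
# Sketch only: footprint x keyword query generation plus a back-of-envelope
# throughput check. The example footprints/keywords are placeholders.
from itertools import product

footprints = ['"Powered by vBulletin"', 'inurl:guestbook.php']  # example footprints
keywords = ['dog training', 'web hosting']                       # example keywords

# 1. Build the query list: every footprint combined with every keyword.
queries = [f'{fp} {kw}' for fp, kw in product(footprints, keywords)]

# 2. Rough throughput math, assuming ~100 result URLs per query:
#    288,000 URLs/hour = 2,880 queries/hour = 0.8 queries/second,
#    spread across the cloud workers (proxies and captchas handled service-side).
RESULTS_PER_QUERY = 100
queries_per_hour = 288_000 / RESULTS_PER_QUERY

print(f'{len(queries)} queries generated, target rate ~{queries_per_hour:.0f} queries/hour')
```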

The thing is though:

Would you be interested?

Please reply and let me know what kind of features you'd like, and what you'd be willing to pay for a subscription. There would be no additional costs (proxies, captchas, etc.) and no software to download. What would such a service be worth to you? Given enough feedback, this thing just might see the light of day!
 
So which one do you recommend for scraping, CloudScraper or another one?
 
Yes, I was using BanditScraper and loved it, but it died. Take my money!
 
Thanks! BanditScraper was indeed one I came across and read the thread about. It's weird that they just died like that. Not a good sign if you ask me ;-) Hence this thread! If anything, their lead is probably one not to follow in terms of pricing :p
 