Hi everyone,

Building web-based applications is what I do, and auto-scaling scrapers that run in the cloud are a bit of my thing. Through this forum I was introduced to the concept of footprint scraping to feed into software like Xrumer and GSA - maybe more. Stuff like this: http://www.blackhatworld.com/blackh...sa-ud-senuke-scrapebox-footprints-1300-a.html

You pipe in the footprints and keywords, and off the scraper goes. I've already got a working Google scraper, and it wouldn't be too much work to adapt it into a cloud-based footprint scraper. By my early estimates it should be able to hit 288,000 URLs/hour.

As a user, you'd be able to enter keyword lists and choose from a set of predefined footprints, or input your own. The list of URLs would be available for download some time later.

The thing is, though: would you be interested? Please reply and let me know what kind of features you'd like and what you'd be willing to pay for a subscription. You'd have no additional costs (proxies, captchas, etc.) and no software to download. What would such a service be worth to you?

Given enough feedback, this thing just might see the light of day!
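For anyone curious how the footprint-and-keyword step works under the hood, here's a rough sketch of the query-generation part. This is just an illustration, not the actual service code; the footprints, keywords, and function name are placeholder examples:

```python
from itertools import product

def build_queries(footprints, keywords):
    """Combine every footprint with every keyword into a search query string."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

# Placeholder example inputs -- real footprint lists run into the thousands.
footprints = ['"Powered by WordPress"', 'inurl:guestbook']
keywords = ['dog training', 'seo tools']

for query in build_queries(footprints, keywords):
    print(query)
```

Each generated query then gets sent to the scraper, and the result URLs are deduplicated and collected for download. With 1,300 footprints and a 100-keyword list, that's 130,000 queries per run, which is why the cloud auto-scaling matters.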