
Which is the best Free tool for this task?

Discussion in 'Black Hat SEO Tools' started by neo3187, Aug 1, 2013.

  1. neo3187

    neo3187 Senior Member

    Joined:
    Sep 19, 2010
    Messages:
    868
    Likes Received:
    360
    Can anybody suggest a free tool where I can load a list of URLs and it visits each URL (multithreaded or single-threaded), lets me pin down a specific area of the page to scrape content from (for example, a list or text below a specific heading), and stores all the scraped content in an Excel sheet or a simple text file?

    Thanks in advance.
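
    For reference, a rough Python sketch of that workflow, assuming the requests and beautifulsoup4 packages; the heading text, URL list and output file names are placeholders, not anything from the post:

    Code:
    import csv
    import requests
    from bs4 import BeautifulSoup

    HEADING_TEXT = "Some Heading"   # placeholder: the heading to scrape under
    URL_FILE = "urls.txt"           # placeholder: one URL per line
    OUT_FILE = "scraped.csv"        # placeholder: CSV opens fine in Excel

    with open(URL_FILE) as f:
        urls = [line.strip() for line in f if line.strip()]

    rows = []
    for url in urls:
        try:
            html = requests.get(url, timeout=15).text
        except requests.RequestException:
            continue                # skip URLs that fail to load
        soup = BeautifulSoup(html, "html.parser")
        # find the heading, then take the text of the element right after it
        heading = soup.find(lambda tag: tag.name in ("h1", "h2", "h3", "h4")
                            and HEADING_TEXT in tag.get_text())
        if heading is None:
            continue
        section = heading.find_next_sibling(True)
        if section is not None:
            rows.append((url, section.get_text(" ", strip=True)))

    with open(OUT_FILE, "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)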
     
  2. Daxda

    Daxda Newbie

    Joined:
    May 8, 2013
    Messages:
    10
    Likes Received:
    4
    Occupation:
    Developer
    Location:
    In the deeps of the web
    Home Page:
    Hey there, mind telling me which site or what kind of text it should scrape?
    I might be able to help you and create a quick tool for you :)
     
  3. Skyebug77

    Skyebug77 Jr. VIP Premium Member

    Joined:
    Mar 22, 2012
    Messages:
    1,471
    Likes Received:
    1,027
    Occupation:
    Marketing
    Location:
    Portland,Or
    You will have to script for different types of sites depending on what content you are after. If it is just business info you are looking for, there are all kinds of scrape tools in this forum - just take a look down at my sig.
     
  4. neo3187

    neo3187 Senior Member

    Joined:
    Sep 19, 2010
    Messages:
    868
    Likes Received:
    360
    No, the style and format will be the same... say there is a certain footprint that I am looking for, so even the HTML will be the same, if that helps you guys suggest a good piece of software.

    For example,

    Scrape the <ul> items (in text format, separated by new lines) under the heading Recent Posts - the format will be the same on every page, since that is how I'll be getting the list of URLs using SB.
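
    Something like this would match that footprint with BeautifulSoup in Python (just a sketch; the heading text is taken from the example above):

    Code:
    from bs4 import BeautifulSoup

    def recent_posts_items(html):
        """Return the <li> texts under the 'Recent Posts' heading, one per line."""
        soup = BeautifulSoup(html, "html.parser")
        # locate the text node containing the heading, then the first <ul> after it
        heading = soup.find(string=lambda s: s and "Recent Posts" in s)
        if heading is None:
            return ""
        ul = heading.find_next("ul")
        if ul is None:
            return ""
        return "\n".join(li.get_text(strip=True) for li in ul.find_all("li"))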
     
  5. Mnet17

    Mnet17 Junior Member

    Joined:
    Feb 28, 2013
    Messages:
    192
    Likes Received:
    135
    Location:
    Philadelphia
    Go to iMacros.com and learn how to scrape the content yourself. It's a free tool.
     
  6. tuannguyen

    tuannguyen Junior Member

    Joined:
    Apr 25, 2012
    Messages:
    149
    Likes Received:
    26
    AutoIt can do that; the easy way is to read the HTML code and extract the information between two strings :)
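
    In Python, that "between two strings" idea looks something like this (just an illustration, not AutoIt; the markers are placeholders for whatever HTML surrounds the block on the real pages):

    Code:
    def between(html, start_marker, end_marker):
        """Return the text between the first start_marker and the next end_marker."""
        start = html.find(start_marker)
        if start == -1:
            return ""
        start += len(start_marker)
        end = html.find(end_marker, start)
        return html[start:end] if end != -1 else ""

    # e.g. between(page_source, "<h2>Recent Posts</h2>", "</ul>")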
     
    • Thanks x 1
  7. neo3187

    neo3187 Senior Member

    Joined:
    Sep 19, 2010
    Messages:
    868
    Likes Received:
    360
    AutoIt is a scripting language, right? Or do they have such tools as well?
     
  8. qlwik

    qlwik Newbie

    Joined:
    Aug 18, 2011
    Messages:
    48
    Likes Received:
    9
    ZennoPoster - not free, but cheap.
     
  9. Conor

    Conor Jr. VIP

    Joined:
    Nov 7, 2012
    Messages:
    3,363
    Likes Received:
    5,425
    Gender:
    Male
    Location:
    South Africa
    Home Page:
  10. hpv222

    hpv222 Power Member

    Joined:
    Feb 8, 2010
    Messages:
    736
    Likes Received:
    274
    Hmm, I seriously doubt that you will find off-the-shelf software (free or paid) that can do this for you... unless you build or buy a custom one. A possible workaround would be to use HTTrack to download all the pages (only the pages, no images, etc.) and then use an advanced find-and-replace tool to extract the text (data) that you are after.
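
    The second half of that workaround could also be done with a short Python script instead of a find-and-replace tool; rough sketch, assuming the pages sit in a local folder produced by HTTrack (the folder name and the <h2>/</ul> markers are placeholders):

    Code:
    import pathlib
    import re

    MIRROR_DIR = pathlib.Path("mirrored_site")   # placeholder: HTTrack output folder
    PATTERN = re.compile(r"<h2>\s*Recent Posts\s*</h2>(.*?)</ul>", re.S | re.I)

    with open("extracted.txt", "w", encoding="utf-8") as out:
        for page in MIRROR_DIR.rglob("*.html"):
            html = page.read_text(encoding="utf-8", errors="ignore")
            match = PATTERN.search(html)
            if match:
                out.write(f"{page}\n{match.group(1).strip()}\n\n")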
     
  11. neo3187

    neo3187 Senior Member

    Joined:
    Sep 19, 2010
    Messages:
    868
    Likes Received:
    360
    I am not looking to scrape Google; this is what I'll do:

    Scrape for URLs using SB --> load those URLs into some tool --> extract a fixed pattern of content from each URL (each webpage).