
How to extract all pages from website

Discussion in 'Black Hat SEO' started by Adimof, Dec 10, 2010.

  1. Adimof

    Adimof Junior Member

    Joined:
    Feb 13, 2009
    Messages:
    165
    Likes Received:
    11
Hey there, I want to extract all pages (not just the links, the pages themselves) from a specific URL.

    Can someone please help me?
     
  2. xrfanatic

    xrfanatic Jr. VIP

    Joined:
    Aug 28, 2010
    Messages:
    387
    Likes Received:
    172
    Location:
    http://bit.ly/slb64
    Home Page:
    You want to save all the pages to your hard disk? Then use WinHTTrack, free software for scraping websites. Hope it helps.
     
  3. Adimof

    Adimof Junior Member

    Joined:
    Feb 13, 2009
    Messages:
    165
    Likes Received:
    11
    No, not saving them. Just export the pages to a .txt file.
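    If a full mirroring tool is overkill, a crawl-and-dump can be scripted directly. Below is a minimal, stdlib-only Python sketch of the idea: crawl same-site links breadth-first and append each page's HTML to a single .txt file. All names here (`crawl_to_txt`, `same_site_links`, the example URL) are illustrative, not from any tool mentioned in this thread, and a real crawler should also respect robots.txt and rate limits.

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collects href values from <a> tags."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def same_site_links(html, base_url):
        """Return absolute URLs in html that live on the same host as base_url."""
        parser = LinkExtractor()
        parser.feed(html)
        host = urlparse(base_url).netloc
        urls = set()
        for href in parser.links:
            absolute = urljoin(base_url, href)
            if urlparse(absolute).netloc == host:
                urls.add(absolute.split("#")[0])  # drop fragments
        return urls


    def crawl_to_txt(start_url, out_path, max_pages=50):
        """Breadth-first crawl from start_url; write each page into one .txt file."""
        seen, queue = set(), [start_url]
        with open(out_path, "w", encoding="utf-8") as out:
            while queue and len(seen) < max_pages:
                url = queue.pop(0)
                if url in seen:
                    continue
                seen.add(url)
                try:
                    html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
                except OSError:
                    continue  # skip unreachable pages
                out.write(f"=== {url} ===\n{html}\n\n")
                queue.extend(same_site_links(html, url) - seen)
    ```

    Usage would be something like `crawl_to_txt("https://example.com/", "pages.txt")`; every crawled page ends up in one text file, separated by `=== url ===` headers.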
     
  4. pintonbd

    pintonbd Power Member

    Joined:
    Feb 12, 2008
    Messages:
    619
    Likes Received:
    186
    Occupation:
    Banker
    WinHTTrack is a good choice. You can also use IDM; there is a grabber option included with that download manager.
     
  5. ulijonroth

    ulijonroth Regular Member

    Joined:
    May 6, 2010
    Messages:
    205
    Likes Received:
    769
    Occupation:
    If you work for a living, why do you kill yourself
    Location:
    localhost
    I use Offline Explorer Enterprise and prefer it to the others, but it is not free.
     
  6. Sonat

    Sonat Junior Member

    Joined:
    Sep 15, 2009
    Messages:
    116
    Likes Received:
    553
    Occupation:
    Trying to be a good IM
    Location:
    International Area
    HTTrack is the best, but be careful to remove footprints from the source code if you want to reuse the scraped pages.
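    Two footprints that commonly survive a scrape are HTML comments (mirroring tools and CMSs often inject them) and `<meta name="generator">` tags that reveal the source platform. A small regex-based cleanup pass could look like the sketch below; `strip_footprints` is an illustrative name, and a real cleanup would likely cover more markers than these two.

    ```python
    import re


    def strip_footprints(html):
        """Remove common scraper footprints: HTML comments and
        <meta name="generator"> tags that reveal the source CMS."""
        # Drop all HTML comments (HTTrack, WordPress caches, etc. leave these).
        html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)
        # Drop generator meta tags such as <meta name="generator" content="...">.
        html = re.sub(
            r'<meta[^>]*name=["\']generator["\'][^>]*>\s*',
            "",
            html,
            flags=re.IGNORECASE,
        )
        return html
    ```

    Run it over each scraped file before publishing; anything the regexes miss (inline script fingerprints, default theme class names) still needs a manual check.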