
[SB] How to harvest all the pages of a site?

Discussion in 'Black Hat SEO Tools' started by omoxx, Dec 29, 2010.

  1. omoxx

    omoxx Regular Member

    Joined:
    Feb 21, 2008
    Messages:
    346
    Likes Received:
    18
  2. manudevil20

    manudevil20 Power Member

    Joined:
    Mar 28, 2008
    Messages:
    695
    Likes Received:
    278
    Location:
    Idaho
For the footprint just use site:domain.com in the harvester (see the footprint sketch after the thread).
     
• Thanks x 2
  3. loopline

loopline Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,527
    Likes Received:
    1,878
    Gender:
    Male
Use it for all the engines you can, as each will give different results, but if the site has several thousand pages or more you're only going to get all of them with an actual crawler (see the crawler sketch after the thread).
     
  4. carlito

carlito BANNED

    Joined:
    Aug 22, 2010
    Messages:
    1,153
    Likes Received:
    326
    with the command "site" you wil only be able to scrape indexed pages.
    To scrape all the pages, use the sitemap addon, it is very powerful (if the siteweb has a sitemap)
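
As a minimal illustration of manudevil20's tip, here is a sketch in Python that turns a list of domains into site: footprints ready to paste into the harvester's keyword list. The function name and the scheme/path stripping are my own assumptions, not anything ScrapeBox itself does:

Code:
# Hypothetical helper: build site: footprints from a list of domains.
def site_footprints(domains):
    for domain in domains:
        # Strip any scheme and path so the operator gets a bare host.
        host = domain.replace("https://", "").replace("http://", "").split("/")[0]
        yield "site:" + host

for query in site_footprints(["domain.com", "http://example.org/some/page"]):
    print(query)  # -> site:domain.com, site:example.org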
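To illustrate loopline's point that large sites need an actual crawler rather than search-engine results, here is a rough breadth-first crawler sketch in Python using only the standard library. The regex-based link extraction, the User-Agent string, and the max_pages cap are simplifying assumptions; this is not how ScrapeBox's crawler works internally:

Code:
import re
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

def crawl(start_url, max_pages=500):
    """Breadth-first crawl of a single host; returns every URL discovered."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
            html = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # unreachable or non-text page: skip it
        # Crude href extraction; a real crawler would use an HTML parser.
        for href in re.findall(r'href=["\'](.*?)["\']', html, flags=re.I):
            link = urljoin(url, href).split("#")[0]
            # Stay on the same host and avoid revisiting pages.
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(len(crawl("http://example.com/")))  # hypothetical start URL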
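And here is a sketch of what carlito's sitemap approach boils down to: fetch sitemap.xml, collect every <loc> entry, and recurse into sitemap index files. The URL and function name are hypothetical, and gzipped sitemaps are not handled; this illustrates the technique, not the addon's internals:

Code:
import xml.etree.ElementTree as ET
from urllib.request import urlopen

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url):
    """Yield every <loc> in a sitemap, recursing into sitemap index files."""
    root = ET.fromstring(urlopen(sitemap_url, timeout=10).read())
    if root.tag == NS + "sitemapindex":
        # An index file points at child sitemaps; walk each of them in turn.
        for loc in root.iter(NS + "loc"):
            yield from sitemap_urls(loc.text.strip())
    else:
        for loc in root.iter(NS + "loc"):
            yield loc.text.strip()

for url in sitemap_urls("https://example.com/sitemap.xml"):  # hypothetical URL
    print(url)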