
Custom footprint to harvest all the pages of a given website?

Discussion in 'White Hat SEO' started by omoxx, Sep 4, 2010.

  1. omoxx

    omoxx Regular Member

    Joined:
    Feb 21, 2008
    Messages:
    346
    Likes Received:
    18
    Is there a custom footprint to harvest all the pages of a given website?
    Thank you
     
  2. bezopravin

    bezopravin BANNED

    Joined:
    May 11, 2010
    Messages:
    461
    Likes Received:
    3,471
    Try the Sitemap Scraper add-on to harvest all the pages of a given website.
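    For context, an add-on like this typically just downloads the site's sitemap.xml and pulls out every <loc> URL. A minimal Python sketch of that idea (the sitemap URL is a placeholder):

    Code:
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder

    # Download the sitemap and parse it as XML.
    with urllib.request.urlopen(SITEMAP_URL) as resp:
        root = ET.fromstring(resp.read())

    # Every page (or nested sitemap) is listed in a <loc> element.
    loc_tag = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"
    for loc in root.iter(loc_tag):
        print(loc.text.strip())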
     
  3. omoxx

    omoxx Regular Member

    Joined:
    Feb 21, 2008
    Messages:
    346
    Likes Received:
    18
    An add-on to what?
    Thanxxx
     
  4. royalmice

    royalmice BANNED

    Joined:
    Aug 23, 2007
    Messages:
    1,186
    Likes Received:
    983
    Not sure if it is what you are looking for, but I recently posted an extensive list of footprints over here: http://www.blackhatworld.com/blackhat-seo/black-hat-seo-tools/232492-usefull-lost-footprints-scrapping-urls.html
     
  5. bakxos

    bakxos Regular Member

    Joined:
    Aug 8, 2010
    Messages:
    498
    Likes Received:
    292
    Location:
    Scotland
    Code:
    site:www.site.com
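    The same operator can be narrowed down if you only want part of the site; these variants are just illustrative:

    Code:
    site:www.site.com inurl:blog
    site:www.site.com "some keyword"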
     
  6. bezopravin

    bezopravin BANNED

    Joined:
    May 11, 2010
    Messages:
    461
    Likes Received:
    3,471
     
    • Thanks x 1
  7. omoxx

    omoxx Regular Member

    Joined:
    Feb 21, 2008
    Messages:
    346
    Likes Received:
    18
    I enter "site:mysite.com" in the harvester browser, > Custom footprint>Yahoo .

    Then I enter "mysite.com" in the KWs window>Start Harvesting.



    "Automatically Remove Duplicate Domains" checked? If so, uncheck it.
     
  8. trakker

    trakker Newbie

    Joined:
    Jan 5, 2009
    Messages:
    42
    Likes Received:
    2
    Sorry for a dumb question, but this ScrapeBox add-on just harvests the URLs of a given site, correct?

    Is there anything that scrapes a site's or a blog's post content?
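    Scraping post content is usually done with a small custom script rather than an add-on. A minimal sketch using the third-party requests and BeautifulSoup libraries (the URL is a placeholder, and the right selector varies per site):

    Code:
    import requests
    from bs4 import BeautifulSoup

    def scrape_post_text(url):
        """Fetch a page and return its visible paragraph text."""
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Most blog themes wrap post bodies in <p> tags; adjust
        # the selection for the specific site being scraped.
        return "\n".join(p.get_text(strip=True) for p in soup.find_all("p"))

    print(scrape_post_text("http://www.example.com/a-blog-post"))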