[HELP] How to harvest all URLs from a domain using SB?

Discussion in 'Black Hat SEO' started by rodnice1, Mar 6, 2012.

  1. rodnice1

    rodnice1 Regular Member

    Joined:
    Oct 1, 2011
    Messages:
    283
    Likes Received:
    25
    Occupation:
    business student/business man
    Location:
    East Coast USA
    Can anyone tell me how to use Scrapebox to scrape all the URLs from a single domain? For example, how would I get each and every URL on my doctor's website into a list?
     
  2. rodnice1

    rodnice1 Regular Member

    anybody...?
     
  3. davids355

    davids355 Jr. VIP Jr. VIP

    Joined:
    Apr 25, 2011
    Messages:
    10,183
    Likes Received:
    7,827
    Home Page:
    I think you can use the footprint site:yourdomain.com, which will give you all indexed pages. I also believe there is a sitemap addon for Scrapebox which should do what you're after.

    If you get stuck you can always use XML-sitemaps.com, which will produce a plain text URL list :)
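    If you want to do this outside Scrapebox, a sitemap is just XML, so pulling the URL list out of one is easy to script. Here is a minimal Python sketch using only the standard library; the sample sitemap content is made up for illustration:

    ```python
    # Extract every URL from a sitemap.xml document -- the same plain
    # URL list the Scrapebox sitemap addon or XML-sitemaps.com gives you.
    import xml.etree.ElementTree as ET

    def sitemap_urls(xml_text):
        """Return every <loc> URL found in a sitemap document."""
        # Sitemaps use this standard namespace on every element.
        ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
        root = ET.fromstring(xml_text)
        return [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

    # Hypothetical sitemap content; in practice you would fetch
    # https://yourdomain.com/sitemap.xml and pass its body in here.
    sample = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://example.com/</loc></url>
      <url><loc>https://example.com/contact</loc></url>
    </urlset>"""

    print(sitemap_urls(sample))
    ```

    Note this only covers pages listed in the sitemap; pages the site owner left out of it won't appear, which is where the site: footprint approach is still useful.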
     
    • Thanks Thanks x 1