
How do you get all the links from one domain? (in SB?)

Discussion in 'Black Hat SEO' started by onething1, Jan 9, 2011.

  1. onething1

    onething1 Junior Member

    Joined:
    Sep 10, 2010
    Messages:
    116
    Likes Received:
    2
    Say you trim to root in ScrapeBox. The Link Extractor addon only gives the links that point outward from, or inward to, that URL, so that's not what I'm looking for. I just want all the URLs that live under that domain. How, dear Lord, how does one do this? Thanks in advance for any feedback.
     
  2. kaidoristm

    kaidoristm Power Member

    Joined:
    Feb 13, 2009
    Messages:
    561
    Likes Received:
    726
    Occupation:
    Freelancer
    Location:
    Estonia
    You have two options. My favorite is to trim the domains to root and append a sitemap path to them: for WordPress and Movable Type append "/sitemap.xml", for BlogEngine append "/sitemap.axd", then run the Sitemap Scraper addon on the resulting URLs.
    The other option is to harvest inbound links with a deep crawl, but that is less effective.
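    A minimal sketch of that sitemap route in plain Python, for anyone doing it outside ScrapeBox: trim to root, append "/sitemap.xml", fetch it, and pull out every <loc> entry. "example.com" is a placeholder domain, and the code assumes a plain urlset rather than a sitemap index.

    ```python
    # Minimal sketch: pull every URL listed in a site's XML sitemap.
    # "example.com" is a placeholder; swap in the root domain you trimmed to.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    def sitemap_urls(root_domain):
        """Fetch /sitemap.xml for a domain and return its <loc> entries."""
        with urllib.request.urlopen(f"https://{root_domain}/sitemap.xml") as resp:
            tree = ET.fromstring(resp.read())
        # Works for a plain urlset; a sitemap index lists further sitemaps
        # in its own <loc> tags, which you could fetch the same way.
        return [loc.text.strip() for loc in tree.iter(SITEMAP_NS + "loc")]

    if __name__ == "__main__":
        for url in sitemap_urls("example.com"):
            print(url)
    ```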
     
  3. Bryan

    Bryan Power Member

    Joined:
    Aug 25, 2009
    Messages:
    565
    Likes Received:
    292
    You could try harvesting site:domain.com and grab all the links from there. But doesn't the Link Extractor do internal links too? (Select internal instead of external.) I'm not sure though, I've never tried it.
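
    For comparison, a minimal sketch of that "internal links" idea in Python: a small breadth-first crawl that keeps only URLs on the starting domain. This is just an illustration with assumed placeholders ("example.com", a max_pages cap), not ScrapeBox's own Link Extractor or harvester.

    ```python
    # Minimal sketch: crawl a domain and collect every same-domain URL it links to.
    # "example.com" is a placeholder; this is a plain breadth-first crawl.
    import urllib.parse
    import urllib.request
    from collections import deque
    from html.parser import HTMLParser

    class LinkParser(HTMLParser):
        """Collects href values from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def internal_urls(start_url, max_pages=50):
        """Breadth-first crawl that keeps only URLs on the starting domain."""
        domain = urllib.parse.urlparse(start_url).netloc
        seen, queue = {start_url}, deque([start_url])
        while queue and len(seen) < max_pages:
            page = queue.popleft()
            try:
                with urllib.request.urlopen(page, timeout=10) as resp:
                    html = resp.read().decode("utf-8", errors="ignore")
            except Exception:
                continue  # skip pages that fail to load or aren't HTML
            parser = LinkParser()
            parser.feed(html)
            for href in parser.links:
                url = urllib.parse.urljoin(page, href).split("#")[0]
                if urllib.parse.urlparse(url).netloc == domain and url not in seen:
                    seen.add(url)
                    queue.append(url)
        return sorted(seen)

    if __name__ == "__main__":
        for url in internal_urls("https://example.com/"):
            print(url)
    ```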