Scrapebox Monster Lists

Discussion in 'Black Hat SEO' started by rush7, May 6, 2011.

  1. rush7

    rush7 Junior Member

    Joined:
    Apr 1, 2010
    Messages:
    117
    Likes Received:
    46
    Ok, so I have been building auto-approve lists for a few months, some of my own stuff, some from personal shares and forums. I've got it filtered down to 800k unique domains. I run the link checker after all posts and only save the ones where the links have been found, thus creating a true auto-approve list.

    So here's my question. I have a nice big list of auto-approve blogs that have actually been successful for me personally. They are all unique domains. It's reasonable to assume that if I could scrape other post pages from these domains, I would end up with a truly enormous monster list of auto-approve pages to post to. The question is, how do I do that?

    How do I put in a big list of unique domain URLs and get Scrapebox to find more pages to post to on these domains?

    I know there are some real scrapebox pros here, and I am certain that I'm not the first person to think of this so if you know how this is done, help a brother out.
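
    For anyone who wants to do the domain-level dedupe outside of Scrapebox, here is a minimal Python sketch of the same idea. It is only an illustration, and the file names are placeholders, not anything Scrapebox produces:

    Code:
    # Minimal sketch: collapse a raw URL list down to unique root domains.
    # File names are placeholders; point them at wherever your lists live.
    from urllib.parse import urlparse

    def unique_domains(path_in, path_out):
        seen = set()
        with open(path_in, encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                host = urlparse(url).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]
                if host:
                    seen.add(host)
        with open(path_out, "w", encoding="utf-8") as out:
            for host in sorted(seen):
                out.write("http://" + host + "/\n")

    if __name__ == "__main__":
        unique_domains("raw_urls.txt", "unique_domains.txt")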
     
  2. uditbhansali

    uditbhansali Regular Member

    Joined:
    Aug 16, 2010
    Messages:
    486
    Likes Received:
    283
    Occupation:
    Ask your mom
    Use a Scrapebox addon.
    Click on Addons, download the Scrapebox Link Extractor, and use it.
    It will scrape all the links from the blog.
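
    If you are curious what that looks like for a single page, here is a rough sketch of the same idea in Python: fetch a page and collect every href on it. This is only an illustration, not the addon itself, and the example URL is a placeholder for one of your AA blog pages:

    Code:
    # Rough single-page sketch of link extraction: fetch a page and
    # collect every href on it. The example URL is a placeholder.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = set()

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.add(urljoin(self.base_url, value))

    def extract_links(url):
        html = urlopen(url, timeout=15).read().decode("utf-8", errors="ignore")
        collector = LinkCollector(url)
        collector.feed(html)
        return sorted(collector.links)

    if __name__ == "__main__":
        for link in extract_links("http://example.com/"):
            print(link)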
     
  3. rush7

    rush7 Junior Member

    Joined:
    Apr 1, 2010
    Messages:
    117
    Likes Received:
    46
    Sweet! Thanks! I knew there had to be an easy way.
     
  4. xthoms

    xthoms Regular Member

    Joined:
    Sep 14, 2010
    Messages:
    280
    Likes Received:
    99
    uditbhansali... you're wrong.

    If you share the list with me, I'll tell you exactly how to do it. No BS. Shoot me a PM. It is dead simple and you'll smack yourself in the face saying, "Why did I not think of that?"
     
  5. Mokodoki

    Mokodoki Regular Member

    Joined:
    Feb 26, 2011
    Messages:
    217
    Likes Received:
    354
    Occupation:
    Graphic Artist | Fulltime Student
    Use the site: operator. Load it as a footprint, merge it with your AA list (you use this as your keywords), and harvest!
     
    • Thanks Thanks x 1
    Last edited: May 6, 2011
  6. FredDuggan

    FredDuggan Junior Member

    Joined:
    Mar 11, 2010
    Messages:
    140
    Likes Received:
    19
    From Scrapebox Forum

    zimsabre
    Newbie

    Posts: 5
    Joined: Nov 2010
    Reputation: 0
    RE: [SB] Difference between Backlink checker and Link extractor addons?
    Use the Link Extractor to extract links from a heavily spammed blog, then run the Backlink Checker against the extracted links to find where else the spammers are posting. You might then want to run the Blog Checker and malware filter addons.


    Wanted to give credit to the original source.
     
    • Thanks Thanks x 1
  7. syedtaha

    syedtaha Registered Member

    Joined:
    Apr 19, 2009
    Messages:
    95
    Likes Received:
    16
    Just did that and found 3k links from a single URL. Nice stuff.
     
  8. vasilicaciortan

    vasilicaciortan Power Member

    Joined:
    Mar 17, 2010
    Messages:
    759
    Likes Received:
    433
    1. Load your list into the Harvested URLs box.
    2. Trim to root.
    3. Remove duplicates.
    4. Save the list.
    5. Go to Tools -> Scrapebox Text Editor.
    6. Go to File -> Load from File and load your previously saved list into the text editor.
    7. Hit Ctrl+H.
    8. Type http:// into the "Find what" field and site: into the "Replace with" field. Click Replace All.
    9. Go to File -> Save to keyword list.
    10. Close the Scrapebox Text Editor. Now click Start Harvesting.

    This way you will have all the URLs from your auto-approve domains (a scripted version of steps 1-9 is sketched below). Now filter them, post to them, check links, and save the ones that are auto-approve. You should end up with a really nice list. GL.
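
    For reference, the same steps 1-9 can be done outside of Scrapebox with a short script. A minimal sketch, assuming one URL per line in a plain text file (the file names are placeholders):

    Code:
    # Minimal scripted version of steps 1-9 above: trim each URL to its root,
    # drop duplicates, and swap http:// for site: so the output can be loaded
    # straight in as a keyword list. File names are placeholders.
    from urllib.parse import urlparse

    def urls_to_site_keywords(path_in, path_out):
        seen = set()
        keywords = []
        with open(path_in, encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                root = urlparse(url).netloc.lower()   # trim to root (step 2)
                if not root or root in seen:          # remove duplicates (step 3)
                    continue
                seen.add(root)
                keywords.append("site:" + root)       # http:// becomes site: (steps 7-8)
        with open(path_out, "w", encoding="utf-8") as out:
            out.write("\n".join(keywords) + "\n")

    if __name__ == "__main__":
        urls_to_site_keywords("aa_list.txt", "site_keywords.txt")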
     
    • Thanks Thanks x 1
  9. DamnedFreak

    DamnedFreak Junior Member

    Joined:
    Feb 12, 2010
    Messages:
    129
    Likes Received:
    14
    That's the way to go.

    What Fred is advising is good for finding more sources too, but it's not what the OP asked.
     
    • Thanks Thanks x 1
  10. xthoms

    xthoms Regular Member

    Joined:
    Sep 14, 2010
    Messages:
    280
    Likes Received:
    99
    Easier to just trim them to root, import them into the keyword field, and in the footprint field write
    site:%keyword%
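
    For clarity, here is what that merge ends up producing, sketched in Python (the domains are made up):

    Code:
    # Tiny sketch of the footprint/keyword merge described above: each keyword
    # (a trimmed AA domain) is substituted into the site:%keyword% footprint.
    footprint = "site:%keyword%"
    keywords = ["exampleblog1.com", "exampleblog2.net"]   # made-up domains

    queries = [footprint.replace("%keyword%", kw) for kw in keywords]
    print(queries)   # ['site:exampleblog1.com', 'site:exampleblog2.net']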
     
    • Thanks Thanks x 2
  11. ppyordanov

    ppyordanov Newbie

    Joined:
    Feb 6, 2010
    Messages:
    26
    Likes Received:
    3
    This thread is really useful, thank you. Getting Scrapebox on Monday, so this was helpful.