Scrapebox - sifting and sorting

Discussion in 'Black Hat SEO' started by shudogg, Nov 18, 2010.

  1. shudogg

    shudogg Regular Member

    Joined:
    Sep 23, 2008
    Messages:
    412
    Likes Received:
    153
    Occupation:
    Internet Marketing
    Location:
    Indiana
    I actually have had some pretty good success with Scrapebox. I love it!

    I am trying to figure out one last thing. How to effectively sort and sift through all the damned URLs I have stored.

    I have jacked thousands of backlinks from other sites, analyzed them, and put them in separate text files like wordpress.txt, blogengine.txt, etc. I have even made a few auto-approve lists.

    My concern is that with a list where you can snag 6k backlinks just by loading it into SB, after a while you will be posting your comment many times on the same exact page of each site in your list, increasing the OBL on that page.

    Say for example this URL: site.com/post-name
    It would be awesome to easily grab "another" page off the same site, like site.com/another-post.

    Like if I could take every URL in my list and replace each one with another post from the same site.

    Maybe someone else has figured out a snazzy little routine for doing this?
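    Something like this rough sketch is what I have in mind: it tries each blog's RSS feed and swaps in a different post URL. The urls.txt / swapped.txt names are only examples, the /feed/ path is a WordPress convention that not every platform has, and it assumes the list holds full URLs with http://.

    Code:
    # swap each URL in urls.txt for a different post from the same blog (sketch)
    import urllib.request
    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse

    def another_post(url):
        """Try the blog's RSS feed and return the first post link that isn't `url`."""
        root = "{0}://{1}".format(*urlparse(url)[:2])  # assumes full URLs with a scheme
        try:
            with urllib.request.urlopen(root + "/feed/", timeout=10) as resp:
                tree = ET.parse(resp)
            for link in tree.iterfind(".//item/link"):
                if link.text and link.text.rstrip("/") != url.rstrip("/"):
                    return link.text
        except Exception:
            pass
        return url  # no feed or no other post found: keep the original URL

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    with open("swapped.txt", "w") as out:
        for u in urls:
            out.write(another_post(u) + "\n")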
     
  2. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,816
    Likes Received:
    2,912
    Why don't you just scrape a larger list, then split it into 6k chunks?

    You really don't want to keep commenting on the same sites with the same URL; you want to try and get IP diversity.
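    A minimal sketch of the splitting part, assuming the scraped URLs sit in a big_list.txt (the file names are just examples):

    Code:
    # split one big scraped list into 6,000-line chunks
    CHUNK = 6000

    with open("big_list.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for i in range(0, len(urls), CHUNK):
        with open("chunk_{}.txt".format(i // CHUNK + 1), "w") as out:
            out.write("\n".join(urls[i:i + CHUNK]) + "\n")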
     
  3. shudogg

    shudogg Regular Member

    Joined:
    Sep 23, 2008
    Messages:
    412
    Likes Received:
    153
    Occupation:
    Internet Marketing
    Location:
    Indiana
    I imagine people who do large blasts don't scrape every time before they run. I am working on thinking up more keywords to scrape with so that I can find more and more sites. I mix up using keywords, custom footprints, and stealing backlinks from builders.

    I was thinking of a way to build a negative list to scrub with. If I have 13k sites I have already used, I could compare the two text files and remove the sites I have already run (a rough sketch of that scrub is at the end of this post).

    And I have way more than 6k splits. I have one list with about 60k URLs in it. I am just trying to build a smooth, repeatable method.

    I used to crack porn sites years ago. There was a piece of software I could use to compare text files: if duplicates were detected, it would remove all instances of them so I had a clean virgin list. However, I cannot recall the name of the software, and I doubt it would work for URLs since it was built for password combinations.

    I am just looking for a way to keep myself from spamming the same sites again and again.
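    Here is the rough negative-list scrub I have in mind; used.txt / fresh.txt / scrubbed.txt are just example names, matching on the domain rather than the full URL is my assumption of what "same site" means, and it expects full URLs with http://.

    Code:
    # drop anything from fresh.txt whose domain already appears in used.txt
    from urllib.parse import urlparse

    def domains(path):
        with open(path) as f:
            return {urlparse(line.strip()).netloc.lower()
                    for line in f if line.strip()}

    used = domains("used.txt")

    with open("fresh.txt") as f, open("scrubbed.txt", "w") as out:
        for line in f:
            url = line.strip()
            if url and urlparse(url).netloc.lower() not in used:
                out.write(url + "\n")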
     
  4. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,816
    Likes Received:
    2,912
    Try going to Google and using this search operator; you can get thousands of random words, then do a massive scrape of millions of blogs.

    Code:
    inurl:.txt words
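    One way those scraped words could then be fed back into Scrapebox is to merge them with a footprint into a keyword list. A small sketch, where words.txt, keywords.txt, and the footprint string are only examples:

    Code:
    # combine a comment footprint with every scraped word to build keyword queries
    FOOTPRINT = '"powered by wordpress" "leave a comment"'  # example footprint

    with open("words.txt") as f:
        words = {w.lower() for w in f.read().split()}

    with open("keywords.txt", "w") as out:
        for word in sorted(words):
            out.write("{} {}\n".format(FOOTPRINT, word))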