What is the best Mass PR Checking Software & Mass Pinging Software?

Discussion in 'BlackHat Lounge' started by agentk007, Dec 11, 2009.

  1. agentk007

    agentk007 Junior Member

    Hi,

    I am in need of a software (or two) and am wondering what you would recommend. Here is what I need:

    #1.) I am an Xrumer service provider and I submit my customers' websites every day for 30, and I want to be able to ping all of the forum accounts created. I only create 25 - 50 backlinks per customer site per day, so I am not too worried about appearing unnatural to the search engines. The URLs are all stored in a .txt file. Is there a way to import that file into something that will ping them for me (preferably through proxies), or somewhere I can copy-paste the URLs and have the software ping every one?

    In short, what would be my best approach to get thousands upon thousands of URLs pinged from different IP addresses?
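    For reference, the standard "ping" in this context is the weblogUpdates.ping XML-RPC call that blog/forum ping services accept. Below is a minimal Python sketch of that loop; the endpoint URL and site name are placeholders, and routing each request through a different proxy would need a custom XML-RPC transport, which is left out here:

```python
import xmlrpc.client

def build_ping_request(site_name, site_url):
    """Build the XML body of a standard weblogUpdates.ping call."""
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

def ping_all(url_file, ping_endpoint):
    """Read one URL per line from url_file and ping each one."""
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    server = xmlrpc.client.ServerProxy(ping_endpoint)
    for url in urls:
        # weblogUpdates.ping(name, url); we only have the URL,
        # so it is reused as the "name" argument here.
        server.weblogUpdates.ping(url, url)
```

    Usage would be something like `ping_all("accounts.txt", "http://ping.example.com/RPC2")`, where the endpoint is whichever ping service you use.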

    #2.) I have .txt files with millions of forum URLs and I want to sort all of those URLs by PR (PageRank). What would you recommend?
    These URLs are in a .txt file as well!
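    The sorting side of this is trivial once you have any PR lookup at all; `check_pr` below is a stand-in for whatever toolbar-query client or API wrapper ends up doing the actual lookups:

```python
from collections import defaultdict

def sort_urls_by_pr(urls, check_pr):
    """Group URLs into buckets keyed by PageRank, highest PR first.

    check_pr takes a URL and returns an int 0-10 (or -1 on a
    failed lookup); it is whatever PR checker you plug in.
    """
    buckets = defaultdict(list)
    for url in urls:
        buckets[check_pr(url)].append(url)
    # Sort the buckets so the highest-PR group comes first.
    return dict(sorted(buckets.items(), reverse=True))
```

    With millions of URLs the lookups, not the sorting, are the bottleneck, so the lookup function is the part worth spreading across proxies and threads.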

    Any thoughts or ideas?
     
  2. thachcaodotcom

    thachcaodotcom Junior Member

  3. agentk007

    agentk007 Junior Member

    Good advice!

    I tried Traffic Travis, but it did not work out for me. Is there any other software I could use to find the PageRank of each site? I tried a few paid ones and they kept crashing on me because the link file is too big. It has 1.5 million freshly harvested URLs, and I tried breaking it up into smaller .txt files, but had no luck.
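    If the tools choke on a 1.5-million-line file, the splitting itself is only a few lines of Python; the 50,000-line chunk size below is arbitrary and can be tuned to whatever the checker tolerates:

```python
import itertools

def split_lines(lines, chunk_size):
    """Yield successive lists of at most chunk_size items."""
    it = iter(lines)
    while True:
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            return
        yield chunk

def split_file(path, chunk_size=50000):
    """Split one huge URL list into numbered part files."""
    with open(path) as f:
        for i, chunk in enumerate(split_lines(f, chunk_size), 1):
            with open(f"{path}.part{i:03d}", "w") as out:
                out.writelines(chunk)
```

    Running `split_file("urls.txt")` would turn one 1.5M-line file into thirty 50,000-line part files that can be fed to the checker one at a time.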

    Any other ideas?