
[GET] script for sorting your fresh (Huge) backlinklists for priority

Discussion in 'Black Hat SEO Tools' started by load8, Dec 15, 2009.

  1. load8

    load8 Newbie

    Joined:
    Oct 21, 2009
    Messages:
    25
    Likes Received:
    7
    Keep on reading if you use Scrapebox or a PageRank checker with similar output ;)

    So I had the following problem:

    For example, you Xrumered your site or linkwheel, or you created thousands of comments (or scraped a bunch of blog URLs to comment on), and now you have a list of a few thousand URLs like this:

    pagelist.txt:
    Code:
    www.someurl1.com/somepage/somepage...html
    www.someurl2.com/somepage/somepage...html
    www.someurl3.com/somepage/somepage...html
    www.someurl4.com/somepage/somepage...html
    ...
    
    Now you'll have to get Google to index those links by pinging/bookmarking/commenting etc. But where to start?
    Of course you'll want to start with the URLs that will have the biggest effect.

    So you'll do a PageRank check... but most of the pages (especially Xrumered profile pages) won't have any PageRank. So you trim the URLs down to their root domains and check the PageRank of those to see which domains are the strongest.
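The trimming step can be sketched like this. This isn't the author's code, just a minimal Python sketch of "trim to root domain", using the post's example-style URLs:

```python
# Hypothetical helper (not from the attached script): trim each deeplink
# to its root domain so you can PR-check domains instead of pages.
def to_root(url):
    url = url.replace("http://", "").replace("https://", "")
    return url.split("/")[0]

urls = ["www.someurl1.com/somepage/a.html",
        "www.someurl1.com/somepage/b.html",
        "www.someurl2.com/c.html"]

# Deduplicate, since many deeplinks share one domain.
roots = sorted(set(to_root(u) for u in urls))
# roots -> ["www.someurl1.com", "www.someurl2.com"]
```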

    I'm not sure how many other PageRank checkers use this output format.
    I use Scrapebox (awesome tool, thx sweetfunny ;)), which gives me an output like this:

    Domainlist.txt:
    Code:
    www.someurl1.com|5
    www.someurl2.com|6
    www.someurl3.com|0
    www.someurl4.com|2
    ...
    
    So now I know the PageRank of each domain I have a deeplink backlink on.
    If you are working with a small list, you don't need a script: just take the high-PR domains and search for them manually in your profile/blog/page list to find out which of your backlinks are the strongest.
    But if you are looking at a list with thousands of links, you need to automate that task;
    which is why I wrote this little VBS script:

    You use it like this:

    PRsort.vbs <domainoutput> <deeplinkoutput> <minimum-pagerank> <pageinput> <domaininput>

    <domainoutput> All domains with the PageRanks you selected will go here
    <deeplinkoutput> All pages whose root domains have the PageRanks you selected will go here

    <minimum-pagerank>
    This is the minimum PageRank you want the output pages' root domains to have. The cutoff is exclusive: if you put "2", your output will only contain pages whose root domains have PR 3 - 9.

    <pageinput> pagelist.txt
    <domaininput> Domainlist.txt

    So if I wanted to run it on the example .txt's above, I would go to the command line and type:

    >PRsort.vbs domainout pageout 4 pagelist domainlist

    This will create 2 files:

    pageout.txt
    Code:
    www.someurl1.com/somepage/somepage...html
    www.someurl2.com/somepage/somepage...html
    ...
    
    domainout.txt
    Code:
    www.someurl1.com|5
    www.someurl2.com|6
    ...
    
    Never use the ".txt" suffix on the command line with this tool
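The matching logic described above can be sketched as follows. The original is an attached VBS script; this is just a rough Python equivalent I'd reach for, with made-up names, operating on in-memory lists instead of the .txt files. Per the post, the PR cutoff is strict (a minimum of "2" keeps only PR 3 - 9):

```python
def pr_sort(domain_lines, page_lines, min_pr):
    """Keep domains with PR strictly above min_pr, plus the deeplinks on them.

    domain_lines use the pipe format shown above ("www.someurl1.com|5").
    """
    strong = {}
    for line in domain_lines:
        domain, _, pr = line.strip().partition("|")
        if pr.isdigit() and int(pr) > min_pr:
            strong[domain] = int(pr)
    # A page qualifies if its root domain (everything before the first "/")
    # is one of the strong domains.
    pages = [p.strip() for p in page_lines
             if p.strip().split("/")[0] in strong]
    return strong, pages

domains = ["www.someurl1.com|5", "www.someurl2.com|6",
           "www.someurl3.com|0", "www.someurl4.com|2"]
pages = ["www.someurl1.com/somepage/a.html",
         "www.someurl3.com/somepage/b.html"]

strong, kept = pr_sort(domains, pages, 4)
# strong -> {"www.someurl1.com": 5, "www.someurl2.com": 6}
# kept   -> ["www.someurl1.com/somepage/a.html"]
```

A real version would read <pageinput>.txt and <domaininput>.txt and write the two output files, but the filter itself is this small.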

    I'm pretty sure there is a better way to solve this.
    This is the quick & dirty solution I coded for myself, though, and I thought I'd share it due to the immense awesomeness of this forum ;D
     

    Attached Files:

  2. johney

    johney Registered Member

    Joined:
    Oct 18, 2009
    Messages:
    81
    Likes Received:
    6
    Nice, but I just use Excel's sort function :D
     
  3. load8

    load8 Newbie

    Joined:
    Oct 21, 2009
    Messages:
    25
    Likes Received:
    7
    Will it take your list of checked domains, scan for each of them in your deeplink list, and create a list of pages that sit on high-PR domains only?