Xrumer Report Scanner

Discussion in 'Black Hat SEO Tools' started by Reefer, Jan 9, 2010.

  1. Reefer

    Reefer Junior Member

    Joined:
    Jan 3, 2010
    Messages:
    197
    Likes Received:
    72
    Occupation:
    BALLIN
    Location:
    no lie
    If you bought a Xrumer blast from someone and want to check how many of the profile links in the report are actually SEO friendly, I made a few PHP scripts for exactly that.

    What will happen:
    Remove all duplicate entries from the report text file
    Check each link in the report file for a specific text (usually your domain name)
    Save all found links into a separate text file

    What You Need:
    Hosting with no timeout limit
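    If you're not sure whether your host enforces a script timeout, you can try lifting the limit from inside PHP before the scan starts (a sketch; many shared hosts ignore or forbid this override, so hosting with no timeout limit is still the safe bet):

    ```php
    <?php
    // Try to lift PHP's execution time limit for long-running scans.
    set_time_limit(0);           // 0 means no time limit
    ignore_user_abort(true);     // keep running even if the browser disconnects
    echo ini_get('max_execution_time');
    ```

    If the override took effect, the echo prints 0.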

    NOTE: Backup your report, just in case.

    1 - save the code below as removedupes.php and replace PROFILESLIST.txt with the name of your report file (3rd to last line)
    PHP:
    <?php
    /**
     * RemoveDuplicatedLines
     * This function removes all duplicated lines of the given text file.
     *
     * @param     string
     * @param     bool
     * @return    string
     */
    function RemoveDuplicatedLines($Filepath, $IgnoreCase = false, $NewLine = "\n"){

        if (!file_exists($Filepath)){
            $ErrorMsg  = 'RemoveDuplicatedLines error: ';
            $ErrorMsg .= 'The given file ' . $Filepath . ' does not exist!';
            die($ErrorMsg);
        }

        $Content = file_get_contents($Filepath);

        $Content = RemoveDuplicatedLinesByString($Content, $IgnoreCase, $NewLine);

        // Is the file writeable?
        if (!is_writeable($Filepath)){
            $ErrorMsg  = 'RemoveDuplicatedLines error: ';
            $ErrorMsg .= 'The given file ' . $Filepath . ' is not writeable!';
            die($ErrorMsg);
        }

        // Write the new file
        $FileResource = fopen($Filepath, 'w+');
        fwrite($FileResource, $Content);
        fclose($FileResource);
    }


    /**
     * RemoveDuplicatedLinesByString
     * This function removes all duplicated lines of the given string.
     *
     * @param     string
     * @param     bool
     * @return    string
     */
    function RemoveDuplicatedLinesByString($Lines, $IgnoreCase = false, $NewLine = "\n"){

        if (is_array($Lines))
            $Lines = implode($NewLine, $Lines);

        $Lines = explode($NewLine, $Lines);

        $LineArray = array();
        $Duplicates = 0;

        // Go through all lines of the given file
        for ($Line = 0; $Line < count($Lines); $Line++){

            // Trim whitespace for the current line
            $CurrentLine = trim($Lines[$Line]);

            // Skip empty lines
            if ($CurrentLine == '')
                continue;

            // Use the line contents as array key
            $LineKey = $CurrentLine;

            if ($IgnoreCase)
                $LineKey = strtolower($LineKey);

            // Check if the array key already exists;
            // if not, add it, otherwise increase the duplicate counter
            if (!isset($LineArray[$LineKey]))
                $LineArray[$LineKey] = $CurrentLine;
            else
                $Duplicates++;
        }

        // Sort the array
        asort($LineArray);

        // Return the deduplicated lines as a single string
        return implode($NewLine, array_values($LineArray));
    }

    RemoveDuplicatedLines('PROFILESLIST.txt', true);
    echo "Removed duplicates";
    ?>
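    For reference, the same dedup-and-sort can be sketched with a small helper built from PHP's array functions. This is just an illustration of what the script above does (`dedupe_lines` is my own name, not part of the script):

    ```php
    <?php
    // Dedup a list of lines case-insensitively and sort the survivors,
    // mirroring RemoveDuplicatedLinesByString with $IgnoreCase = true.
    function dedupe_lines(array $lines){
        $unique = array();
        foreach ($lines as $line){
            $key = strtolower(trim($line));          // case-insensitive key
            if ($key !== '' && !isset($unique[$key]))
                $unique[$key] = trim($line);         // keep first occurrence
        }
        asort($unique);
        return array_values($unique);
    }

    // Example with a few made-up profile URLs:
    print_r(dedupe_lines(array(
        "http://example.com/user/1",
        "http://EXAMPLE.com/user/1",   // duplicate except for case
        "",
        "http://example.org/user/2",
    )));
    ```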
    2 - save the code below as xrumerscan.php
    PHP:
    <?php
    if(isset($_GET['u'])){
        $u = $_GET['u'];
        $f = $_GET['f'];
        $t = $_GET['t'];

        // get contents of the report file into a string
        $filename = "$f.txt";
        $handle = fopen($filename, "r");
        $contents = fread($handle, filesize($filename));
        fclose($handle);

        $links = explode("\n", $contents);
        $list = "";
        foreach($links as $link){
            // fetch each profile page and look for the search text
            $html = @file_get_contents($link);
            if ($html !== false && strpos($html, $u) !== false){
                echo $link . "<br>";
                $list .= $link . "\n";
            }
        }

        // save all matching links to the output file
        $ourFileName = "$t.txt";
        $ourFileHandle = fopen($ourFileName, 'w') or die("can't open file");
        fwrite($ourFileHandle, $list);
        fclose($ourFileHandle);
    }else{
    ?>
    Url format: xrumerscan.php?u=texttosearch&f=profilefile&t=profileswithlinks<br>
    Example:<br>
    Report = profiles.txt (place in the same folder as xrumerscan.php)<br>
    Website = chuckycheese.com<br>
    Text File = name of the file to be created with all the SEO-approved links<br><br>

    Enter into the address bar: xrumerscan.php?u=chuckycheese&f=profiles&t=seolinks<br><br>
    Everything is case sensitive. Can be run as a cron job.
    <?php
    }
    ?>
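    One weak spot in the scan loop: a dead profile host can stall file_get_contents for minutes. A hedged variant (function names are my own) that fetches each URL with a per-request timeout via a stream context:

    ```php
    <?php
    // Fetch a URL with a short timeout so dead hosts don't stall the scan.
    function fetch_with_timeout($url, $seconds = 10){
        $ctx = stream_context_create(array(
            'http' => array('timeout' => $seconds),
        ));
        return @file_get_contents($url, false, $ctx);
    }

    // True if the page at $url was fetched and contains $needle.
    function link_contains($url, $needle){
        $html = fetch_with_timeout($url);
        return $html !== false && strpos($html, $needle) !== false;
    }
    ```

    Inside the foreach loop, `if (link_contains($link, $u))` would replace the file_get_contents/strpos pair.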
    3 - upload xrumerscan.php, removedupes.php and the Xrumer report txt file to the same folder
    4 - make sure the folder is writable (chmod it)
    5 - run removedupes.php and check the report file to make sure it has changed (fewer links, alphabetical order)
    6 - run xrumerscan.php and read the instructions
    6b - you can also make it a cron job (after you read the instructions), which means you won't have to keep the page open.
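    On the cron point: as written, xrumerscan.php reads its parameters from $_GET, so cron has to call it through the web server. If you'd rather run it straight from the PHP command line, a small addition at the top of the script would map CLI arguments onto $_GET (a sketch; `get_from_argv` is my own name):

    ```php
    <?php
    // Pull the u/f/t parameters from command-line arguments, e.g.:
    //   php xrumerscan.php chuckycheese profiles seolinks
    function get_from_argv(array $argv){
        if (count($argv) < 4)
            return null;                      // not enough arguments
        return array(
            'u' => $argv[1],                  // text to search for
            'f' => $argv[2],                  // report file name (without .txt)
            't' => $argv[3],                  // output file name (without .txt)
        );
    }

    // At the top of xrumerscan.php, before the isset($_GET['u']) check:
    if (PHP_SAPI === 'cli' && isset($argv)){
        $params = get_from_argv($argv);
        if ($params !== null)
            $_GET = $params;
    }
    ```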

    My example: I ordered a 200k blast from here and here are the numbers at each stage:
    total blast - 200,000
    original Xrumer report - 26,964 links
    after removing duplicates - 9,821 links
    after the SEO check (final) - 3,074 links

    I don't have Xrumer myself, but I believe you can reuse the successful list in future blasts to save time and resources.

    Enjoy.
     
    • Thanks x 5
  2. d3t0x

    d3t0x Jr. VIP

    Joined:
    Oct 28, 2008
    Messages:
    2,123
    Likes Received:
    812
    Location:
    Vancouver, BC
    hey nice script bro, thanks given
     
  3. iglow

    iglow Elite Member

    Joined:
    Feb 20, 2009
    Messages:
    2,079
    Likes Received:
    861
    Home Page:
    i got such stuff but still i give u thanx ;)
     
  4. black_hat_newbie

    black_hat_newbie Newbie

    Joined:
    Jan 12, 2009
    Messages:
    28
    Likes Received:
    1
    Occupation:
    IM & SEO
    Location:
    Internet
    Thanks for sharing the script. removedupes.php executes properly and gives the message "Removed duplicates", but when I run xrumerscan.php on my server, the script runs for a while and then I get a 500 internal error. Does this mean my hosting has a timeout limit?
     
    Last edited: Jan 31, 2010
  5. SEODEMON

    SEODEMON Regular Member

    Joined:
    Sep 15, 2010
    Messages:
    244
    Likes Received:
    76
    I need this so bad but I can not get it to work! Please help.

    Can not PM yet!