If you bought a Xrumer blast from someone and want to check how many profile links in the report are actually SEO friendly, I made a few PHP scripts for that.

What will happen:
- Get rid of all duplicate entries in the report text file
- Check each link in the report file for a specific text (usually your domain name)
- Save all found links into a separate text file

What You Need:
- Hosting with no timeout limit

NOTE: Back up your report, just in case.

1 - Save the code below as removedupes.php and change PROFILESLIST.txt to the name of your report file (3rd-to-last line).

PHP:
<?php
/**
 * RemoveDuplicatedLines
 * Removes all duplicated lines of the given text file.
 *
 * @param string $Filepath
 * @param bool   $IgnoreCase
 * @param string $NewLine
 * @return string
 */
function RemoveDuplicatedLines($Filepath, $IgnoreCase = false, $NewLine = "\n"){
    if (!file_exists($Filepath)){
        $ErrorMsg  = 'RemoveDuplicatedLines error: ';
        $ErrorMsg .= 'The given file ' . $Filepath . ' does not exist!';
        die($ErrorMsg);
    }
    $Content = file_get_contents($Filepath);
    $Content = RemoveDuplicatedLinesByString($Content, $IgnoreCase, $NewLine);

    // Is the file writeable?
    if (!is_writeable($Filepath)){
        $ErrorMsg  = 'RemoveDuplicatedLines error: ';
        $ErrorMsg .= 'The given file ' . $Filepath . ' is not writeable!';
        die($ErrorMsg);
    }

    // Write the deduplicated content back to the file
    $FileResource = fopen($Filepath, 'w+');
    fwrite($FileResource, $Content);
    fclose($FileResource);

    return $Content;
}

/**
 * RemoveDuplicatedLinesByString
 * Removes all duplicated lines of the given string.
 *
 * @param string|array $Lines
 * @param bool         $IgnoreCase
 * @param string       $NewLine
 * @return string
 */
function RemoveDuplicatedLinesByString($Lines, $IgnoreCase = false, $NewLine = "\n"){
    if (is_array($Lines))
        $Lines = implode($NewLine, $Lines);

    $Lines = explode($NewLine, $Lines);
    $LineArray = array();
    $Duplicates = 0;

    // Go through all lines of the given file
    for ($Line = 0; $Line < count($Lines); $Line++){
        // Trim whitespace for the current line
        $CurrentLine = trim($Lines[$Line]);

        // Skip empty lines
        if ($CurrentLine == '')
            continue;

        // Use the line contents as array key
        $LineKey = $CurrentLine;
        if ($IgnoreCase)
            $LineKey = strtolower($LineKey);

        // If the key already exists it's a duplicate, otherwise keep the line
        if (!isset($LineArray[$LineKey]))
            $LineArray[$LineKey] = $CurrentLine;
        else
            $Duplicates++;
    }

    // Sort the array (this is why the cleaned report ends up in ABC order)
    asort($LineArray);

    // Return the unique lines as a single string
    return implode($NewLine, array_values($LineArray));
}

$lns = RemoveDuplicatedLines('PROFILESLIST.txt', true);
echo "Removed duplicates";
?>

2 - Save the code below as xrumerscan.php.

PHP:
<?php
if (isset($_GET['u'])){
    $u = $_GET['u']; // text to search for (usually your domain)
    $f = $_GET['f']; // report file name, without .txt
    $t = $_GET['t']; // output file name, without .txt

    // Get the contents of the report file into a string
    $filename = "$f.txt";
    $handle   = fopen($filename, "r") or die("can't open $filename");
    $contents = fread($handle, filesize($filename));
    fclose($handle);

    $links = explode("\n", $contents);
    $list  = "";
    foreach ($links as $link){
        $link = trim($link);
        if ($link == '')
            continue;
        // Fetch the profile page; @ suppresses warnings on dead links
        $html = @file_get_contents($link);
        // Keep the link only if the page actually contains the search text
        if (strpos($html, $u) !== false){
            echo $link . "<br>";
            $list .= $link . "\n";
        }
    }

    // Save all found links into a separate text file
    $ourFileName   = "$t.txt";
    $ourFileHandle = fopen($ourFileName, 'w') or die("can't open file");
    fwrite($ourFileHandle, $list);
    fclose($ourFileHandle);
} else {
?>
Url format: xrumerscan.php?u=texttosearch&f=profilefile&t=profileswithlinks<br>
Example:<br>
Report = profiles.txt (place in same folder as xrumerscan.php)<br>
Website = chuckycheese.com<br>
Text File = name of the file to be created with all the SEO-approved links<br><br>
Enter into address bar: xrumerscan.php?u=chuckycheese&f=profiles&t=seolinks<br><br>
Everything is case sensitive. Can be run as cron.
<?php } ?>
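Since the script reads its parameters from the query string, the easiest way to cron it is to have cron fetch the URL itself. A sketch, assuming wget is available on your host; the domain, folder, and parameters below are placeholders taken from the example:

Code:
# Hypothetical crontab entry: run the scan every night at 3am
0 3 * * * wget -q -O /dev/null "http://yourdomain.com/folder/xrumerscan.php?u=chuckycheese&f=profiles&t=seolinks"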
3 - Upload xrumerscan.php, removedupes.php, and the Xrumer report txt file to the same folder.
4 - Make sure the folder is writable (chmod it).
5 - Run removedupes.php and check the report file to make sure it has been changed (fewer links, ABC order).
6 - Run xrumerscan.php and read the instructions.
6b - You can also make it a cron job (after you read the instructions), which means you won't have to sit idle on the page; see the cron sketch under step 2.

My example: I ordered a 200k blast from here, and here are the numbers:
total blast - 200,000
original Xrumer report - 26,964 links
after removing duplicates - 9,821 links
after the SEO check (final) - 3,074 links
(A quick sketch for re-counting these files is below.)

I don't have Xrumer, but I believe you can use your successful list in the future to save time and resources. Enjoy.
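If you want to double-check the counts at each stage, here is a quick sketch that tallies the non-empty lines in each file (profiles.txt and seolinks.txt are just the names from my example; swap in your own):

PHP:
<?php
// Count non-empty lines per file (filenames are from the example above)
foreach (array('profiles.txt', 'seolinks.txt') as $file){
    if (!file_exists($file)){
        echo $file . " not found\n";
        continue;
    }
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    echo $file . ': ' . count($lines) . " links\n";
}
?>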
Thanks for sharing the scripts. The removedupes.php script executes properly and gives the message "Removed duplicates", but when I run xrumerscan.php on my server, the script runs for a while and then I get a 500 internal server error. Does this mean my hosting has a timeout limit?
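Would adding something like this right after the opening <?php tag of xrumerscan.php help, or is the limit enforced at the server level? Just a guess on my part:

PHP:
// Guess: try to lift PHP's own limits; some shared hosts ignore
// these or kill long requests at the web-server level anyway.
set_time_limit(0);
ignore_user_abort(true); // keep running if the browser disconnects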