SpamHat
Junior Member
- Apr 27, 2009
(Disclaimer: This is a quick script written in 5 minutes, not a work of art. It's a quick and dirty link checker - no more, no less.)
What It Does:
A few days ago I wrote a quick PHP script to check a list of URLs for a string (my URL).
I wanted to check some of the link services here and on other forums, but didn't feel like manually checking 300 URLs to see if my links were there.
The script visits each URL in turn, checks the page for a unique string, and reports back on what it found and how many URLs fell into each category.
How To Use It:
- Change the $checkFor variable to the string to check for. This should be your URL, or any unique word or phrase you're looking for.
- Paste the URLs you want to check into the $urlsToCheck variable, one per line. Extra whitespace at the beginning and end doesn't matter.
- Upload the script to your server and run it in your web browser. A local LAMP/XAMPP install works too.
The Script:
It's posted here for a quick look and a copy and paste. I thought about uploading it as a file too, but keeping it inline makes it easier to update if I make any changes.
It just uses plain cURL to grab the URLs, not curl_multi. If anyone likes this one I might do a curl_multi version, which would be much faster - there's a rough sketch of the idea after the script.
PHP:
<?php
set_time_limit(0);
// Quick link checker script
// http://www.blackhatworld.com/blackhat-seo/black-hat-seo-tools/133787-get-bulk-link-checker-great-check-effectivness-link-bundles-you-buy.html
// by SpamHat
// How to use:
// 1. Change the $checkFor variable to the string to check for. This should be your url, or just any unique word/phrase you're looking for.
// 2. Paste in all the urls you want to check into the variable $urlsToCheck. It doesn't matter about extra space at the beginning and end. They should be 1 per line, like the example below.
// 3. Run the script in your web browser.
$checkFor = "google";
$urlsToCheck = "
http://www.google.com
http://www.yahoo.com
http://www.dmoz.org
";
/////////////////////////////////////////////////////////////////////////////////
// No need to edit below here unless you feel like messing around
/////////////////////////////////////////////////////////////////////////////////
// Strip \r, trim surrounding whitespace, then split the pasted block into one URL per line
$urls = explode("\n", trim(str_replace("\r", "", $urlsToCheck)));
$goodNum = 0;
$goodUrls = array();
$badNum = 0;
$badUrls = array();
$timeoutNum = 0;
$timeoutUrls = array();
$total = count($urls);
echo "<style>* {font-family:Verdana;} p,span {font-size:10px;} .r,.r a {color:red;} .b,.b a {color:blue;} .g,.g a {color:green;}</style>";
echo "<hr><h1>Checked URLs</h1>";
$i=0;
foreach($urls as $url) {
    $html = get($url);
    if(empty($html)) {
        // No response (curl_exec returned false or an empty body) - count it as a timeout
        $col = "b";
        ++$timeoutNum;
        $timeoutUrls[] = $url;
    } elseif(strstr($html, $checkFor)) {
        // The page contains the string we're looking for
        $col = "g";
        ++$goodNum;
        $goodUrls[] = $url;
    } else {
        // The page loaded but the string wasn't found
        $col = "r";
        ++$badNum;
        $badUrls[] = $url;
    }
    echo "<span class='$col'><a href='$url'>$url</a></span><br/>";
    ++$i;
    // if($i >= 10) break; // uncomment to limit a test run to the first 10 URLs
}
echo "<hr><h1>Results</h1>";
echo "<p class='g' style='font-size:20px;'>Good: <b>$goodNum</b> / $total</p>";
echo "<p class='r' style='font-size:20px;'>Bad: <b>$badNum</b> / $total</p>";
echo "<p class='b' style='font-size:20px;'>TimedOut: <b>$timeoutNum</b> / $total</p>";
echo "<hr><h1>Good Urls ($goodNum)</h1><span>";
foreach($goodUrls as $url) echo "<a class='g' href='$url'>$url</a><br/>";
echo "</span>";
echo "<hr><h1>Bad Urls ($badNum)</h1><span>";
foreach($badUrls as $url) echo "<a class='r' href='$url'>$url</a><br/>";
echo "</span>";
echo "<hr><h1>Timed-Out Urls ($timeoutNum)</h1><span>";
foreach($timeoutUrls as $url) echo "<a class='b' href='$url'>$url</a><br/>";
echo "</span>";
function get($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)");
    curl_setopt($curl, CURLOPT_HTTPHEADER, array(
        "Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5",
        "Cache-Control: max-age=0",
        "Connection: keep-alive",
        "Keep-Alive: 300",
        "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7",
        "Accept-Language: en-us,en;q=0.5",
        "Pragma: "
    ));
    curl_setopt($curl, CURLOPT_ENCODING, "gzip,deflate");
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_TIMEOUT, 10);
    curl_setopt($curl, CURLOPT_AUTOREFERER, 1);
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, 1);
    $html = curl_exec($curl);
    curl_close($curl);
    // curl_exec() returns false on a timeout or error, which empty() catches in the main loop
    return $html;
}
?>
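If anyone does want the curl multi version, here's a rough, untested sketch of what the fetch side could look like (the get_all() name and the cut-down option set are just for illustration - they're not part of the script above). It grabs all the URLs in parallel; you'd then loop over the returned array and run the same strstr() check on each result.
PHP:
<?php
// Sketch only: fetch a list of URLs concurrently with curl_multi.
// Returns an array of url => html (false/empty where the fetch failed).
function get_all(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach($urls as $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)");
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }
    // Run all the transfers until every handle has finished
    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        if($running) {
            // Wait for activity on any handle instead of busy-looping
            if(curl_multi_select($mh) === -1) usleep(100000);
        }
    } while($running > 0);
    // Collect the results and clean up
    $results = array();
    foreach($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
?>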
It may help you get an overview of how effective some of the link batches you buy actually are.
I'd advise checking once your link work is done, then again after a few days, and comparing the results.
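One quick way to do the comparison (just a sketch, not part of the script above - the filenames are made up) is to dump the good URLs to a dated file at the end of each run, then diff the two lists later:
PHP:
<?php
// Sketch: tack this on after the results section of the checker.
// $goodUrls is the array built by the main script above.
$file = "good-urls-" . date("Y-m-d") . ".txt";
file_put_contents($file, implode("\n", $goodUrls) . "\n");

// A few days later, load an earlier run and see which links have disappeared since then
$old = file("good-urls-2009-04-27.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$new = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$dropped = array_diff($old, $new);
echo "<hr><h1>Dropped since last check (" . count($dropped) . ")</h1>";
foreach($dropped as $url) echo "<a href='$url'>$url</a><br/>";
?>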
If you have any problems with it, post here and I'll probably be able to help.