Instantly Create Do-Follow Backlinks from 1,099,181 Unique Domains

You can also do it easily with Excel: put the million domains in column A, then build the full link in column B by appending /__media__/js/netsoltrademark.php?d=mydomain.com to each domain.

Then copy that column and paste it into a text editor, and the links are ready.
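For example, assuming the domains sit in column A starting at A1, a formula like this in B1 (filled down) would build each link; mydomain.com is just a placeholder for your own domain:

Code:
="http://"&A1&"/__media__/js/netsoltrademark.php?d=mydomain.com"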
 
Well, I just realized that appending those things in Notepad for 1 million websites that have dozens of different extensions and start with every letter of the alphabet is excruciating. So I'm not gonna do it, but still, I appreciate the tip :)



May I get a foolproof step-by-step tutorial on how to do this exactly? I'm not a tech-savvy dude, and I don't even know what a shell is, much less how to use one :p
Erm, nope, sorry. You can Google how to execute PHP scripts; there are already many tutorials. Some basic knowledge of servers is required in SEO :)
 
GSA SER doesn't have a database. You would import your huge list into a project, select the Fast Indexer engine and let it rip!

Sorry about that, I meant that I've tested with a lot of the databases that people give out for free to GSA users.
 
They are normally sold or given away in private Slavic groups I'm in sometimes.

For example, I'm making my own mini one to do that sort of thing, but I'm not going to sell it.
What would you use these kinds of links for? Maybe on churn and burn or parasites?
 
You can also easily add your own URL with PHP; I just made a quick script:

Code:
<?php
// Appends your domain to each line of urls.txt and prints the finished links.
// Assumes each line already ends with the netsoltrademark path, e.g.
// http://example.com/__media__/js/netsoltrademark.php?d=
$file = "urls.txt";
$url  = "yourdomain.tld"; // your own domain, without http:// or https://
$fh = fopen($file, 'r');
if ($fh) {
   // read line by line so a million-line file doesn't exhaust memory
   while (($l = fgets($fh)) !== false) {
      print trim($l).$url.PHP_EOL;
   }
   fclose($fh);
} else {
   print "Error: couldn't open file!".PHP_EOL;
}
?>

Create a file with the URL list of those sites (urls.txt, each URL on a new line) and change $url to your own domain.
Save the file as append.php, then you can run it in the shell like:

php append.php > output.txt

Now you have all the URLs in "output.txt"; you just have to get them indexed ;)
I think some sites are dead and we should remove them before indexing. You should've implemented testing & removing 404 links.

It'll take years to do this in Notepad. You need a more powerful text editor like Notepad++. Open the Replace dialog, choose the "Regular expression" search mode, and replace "^" with "http://" to add it to the beginning of every line. Then replace "$" with "/__media__/js/netsoltrademark.php?d=mydomain.com" to add it to the end. All your 1 million links will be created in just 2 steps.
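For reference, the two replacements look like this in Notepad++'s Replace dialog (mydomain.com is a placeholder for your own domain):

Code:
Search Mode: Regular expression
1) Find what: ^    Replace with: http://
2) Find what: $    Replace with: /__media__/js/netsoltrademark.php?d=mydomain.com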

If you're new to these tools, it's better to copy 10-20 links to a new file and test on that first. Processing may take a few minutes depending on your computer, and you don't want to learn these things on 1 million links at once.
 
solid guide, thanks!
 
Thanks, definitely gonna test it on parasites!
 
This is a known method and I'm not going to make a huge post about how to use the backlinks.

I'm simply providing a list of domains that will be useful for anyone interested in using the method. All of them are do-follow.

Here's a list of the domains (over a million):

Code:
https://file.io/WrNdyCGDg3va
https://www.mediafire.com/file/kqlnc2wjz7wok6a/bhw-million-domains.txt/file
https://gofile.io/d/MHPfMY

INSTRUCTIONS

I'll use the first domain on the list as an example - glowiththeflow.com

Let's say you want to create a backlink to blackhatworld.com from that domain.

You'll be able to do that automatically by simply loading this URL:

Code:
http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=blackhatworld.com

You can also link to inner pages like this:

Code:
http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=www.blackhatworld.com/forums/black-hat-seo.28/

You should always leave off http:// and https:// from the d= parameter; including them will cause the backlink to break.
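If you're building these links with a script, it's safest to strip the scheme from the target automatically. A minimal PHP sketch (the build_backlink helper is just an illustration, not part of the method itself):

Code:
<?php
// Builds a netsoltrademark backlink for a parked domain and a target page.
// The scheme is stripped from the target, since http:// or https:// in the
// d= parameter breaks the backlink.
function build_backlink($domain, $target) {
   $target = preg_replace('#^https?://#i', '', trim($target));
   return 'http://'.$domain.'/__media__/js/netsoltrademark.php?d='.$target;
}

print build_backlink('glowiththeflow.com', 'https://www.blackhatworld.com/forums/black-hat-seo.28/').PHP_EOL;
// -> http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=www.blackhatworld.com/forums/black-hat-seo.28/
?>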

This same method works across all of the domains in my list. It's caused by the way Network Solutions (and other Web.com brands) generate their trademark notice page on parked domains. If a domain expires, is transferred elsewhere, etc., the page will stop working. But with the huge number of domains, I'd guess that hundreds of thousands of them should remain active for years into the future.

QUICK NOTES

These backlinks are probably best for tiered link building, pyramids or churn & burn sites.

I probably wouldn't recommend them for money sites unless you can come up with a unique way to do it.
Thank you
 
OK, this is the spoon-fed version with dead-link checking :p

Code:
<?php
// Reads urls.txt, builds each backlink, and prints only the ones
// that respond with HTTP 200 - dead/404 domains are filtered out.
$file = "urls.txt";
$url  = "yourdomain.tld"; // your own domain, without http:// or https://
$fh = fopen($file, 'r');

// fetches a URL and returns its HTTP status code (0 on connection failure)
function get_httpcode($url) {
   $curl = curl_init();
   curl_setopt($curl, CURLOPT_URL, $url);
   curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla');
   curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
   curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, 2);
   curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
   curl_setopt($curl, CURLOPT_HEADER, true);
   curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5s
   curl_setopt($curl, CURLOPT_TIMEOUT, 10);       // abort the whole request after 10s
   curl_exec($curl);
   $code = curl_getinfo($curl, CURLINFO_HTTP_CODE);
   curl_close($curl);
   return $code;
}

if ($fh) {
   while (($l = fgets($fh)) !== false) {
      $link = trim($l).$url;
      // or use the following if the list only contains domains:
      // $link = 'http://'.trim($l).'/__media__/js/netsoltrademark.php?d='.$url;
      $httpcode = get_httpcode($link);
      if ($httpcode == 200) {
         print $link.PHP_EOL;
      }
   }
   fclose($fh);
} else {
   print "Error: couldn't open file!".PHP_EOL;
}
?>
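As before, run it from the shell and redirect the output to a file; the filenames here are just examples:

Code:
php check.php > live-links.txt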
 