Finding PBNs (strategy)

Jonaz86

Junior Member
Joined
Sep 16, 2015
Messages
140
Reaction score
5
Hello guys,

I have a server running 24/7 that extracts links from high-authority websites (CNN etc.). Despite extracting links 10-11 levels deep, I come up with only 30-40 domains, which really sucks. The highest TF/CF is around 20/30, and those are spammed; all the others are below 15.

Can someone please give me any advice here? I want to master scraping, but I can't seem to get anything to work using this method. I'm desperate... I know most guys don't care to help, but the ocean is big enough for all the fish to live and prosper together ;)

Thanks in advance
 
Everyone and their cat is scraping the big authority sites (CNN, BBC etc.). Go for lower-authority sites, very high quality directories, and so on.
 
What software are you using for scraping? Your own bot?
 
Scraping individual sites would be too tedious and time-consuming. Try it this way:
1. Find a list of domains that are about to be deleted/expire (check those pre-release lists etc. to get them by date).
2. Run bulk metrics for DA/TF etc.; now you have a metrics-filtered list of domains.
3. Match them against a list of authority sites (CNN etc.) using the Google index or any backlink analyzer to see whether your domains have any backlinks on those sites.
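A sketch of steps 2-3 above: once you have bulk metrics and the set of domains that appear in the authority sites' backlink exports, the final shortlist is just an intersection plus a threshold filter. The function name and thresholds below are my own, not from any particular tool:

```python
def shortlist(metrics, linked_from_authority, min_tf=15, min_cf=15):
    """metrics: {domain: (TF, CF)}; linked_from_authority: the set of
    domains seen in the authority sites' backlink exports.
    Keep only domains that pass both the metric filter and the match."""
    return sorted(
        domain
        for domain, (tf, cf) in metrics.items()
        if domain in linked_from_authority and tf >= min_tf and cf >= min_cf
    )


metrics = {"a.com": (20, 25), "b.com": (10, 30), "c.com": (18, 18)}
linked = {"a.com", "b.com"}
print(shortlist(metrics, linked))  # only a.com passes both checks
```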
 
I start with the sitemap rather than try to scrape x levels deep.

1. extract list of all urls from target site
2. go through list with software to get all the external links
3. trim results to root and remove dups
4. send resulting domain list to a bulk availability checker
5. get the metrics for the available ones and filter it down some

Still finding good old domains, but as mentioned above, everyone's been through the obvious ones.
You need a good starting point.
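Steps 2-3 above (get the external links, trim to root, remove dups) can be sketched like this; the function name is mine, and a real run would feed it the link list your crawler or Scrapebox extracts:

```python
from urllib.parse import urlparse

def external_root_domains(page_url, links):
    """Return the unique root domains that page_url links out to,
    skipping internal links (same host as the page itself)."""
    own = urlparse(page_url).netloc.lower().removeprefix("www.")
    roots = set()
    for link in links:
        host = urlparse(link).netloc.lower().removeprefix("www.")
        if host and host != own:
            roots.add(host)
    return sorted(roots)  # trimmed to root, duplicates removed
```

The resulting list is what you would hand to a bulk availability checker in step 4.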
 
Well im basically using Scrapebox, so your saying the method is good its only that I need to target lesser known websites that still has strong authority?
 
This is what I like to do; you will need paid accounts for Ahrefs and/or Majestic SEO.


  1. Search Google for your broad niche keyword, with the search date range set between 2001 and 2005
  2. Now scrape, or manually copy/paste, the authority sites
  3. Copy each URL into Ahrefs and/or Majestic SEO
  4. Download the list of backlinks
  5. Clean the backlinks down to domain.com
  6. Run them through a bulk availability checker and see what is available
  7. Check the metrics/backlink profile and register the domains for $1.99 (if you use fresh GoDaddy accounts with coupons)

I have found a lot of great domains that way; it is manual labor though....
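For step 1, Google's custom date range can be set straight from the URL via the tbs=cdr parameter. A small helper to build such a search URL might look like this; the parameter format is my best understanding and worth spot-checking in a browser:

```python
from urllib.parse import urlencode

def dated_google_search(keyword, start="1/1/2001", end="12/31/2005"):
    """Build a Google search URL restricted to a custom date range
    (cdr = custom date range, cd_min/cd_max = the bounds)."""
    params = {"q": keyword, "tbs": f"cdr:1,cd_min:{start},cd_max:{end}"}
    return "https://www.google.com/search?" + urlencode(params)
```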
 
Hi there,

I'm basically using this alternative method as well, but with Scrapebox for the entire process. I've heard going back that far in time isn't good, since it turns up far more potentially spammed domains and a lot of the links have expired, but I'm not quite sure about that myself.

So you only take the top 10-20 sites that come up for each niche instead of scraping through all of them? Makes sense!

Thanks
 

Good advice right there! Manual almost always means most people are too lazy to do it, which means a higher chance of success.

You can take it one step further and also search site:domain.com for each authority domain to get more related pages on those authority sites, then rinse and repeat.
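One way to automate that rinse-and-repeat: generate the site: queries for your whole authority-domain list and feed them back into your scraper. A trivial helper (names are mine):

```python
def site_queries(domains, keyword=""):
    """Build 'site:domain keyword' Google queries, one per authority domain."""
    return [f"site:{d} {keyword}".strip() for d in domains]


print(site_queries(["cnn.com", "bbc.com"], "fishing"))
```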
 
Haven't you tried Scrapebox? It is the perfect tool to find the perfect domain with high TF/CF. Try it once.
 
1. Scrape Google by country, keyword, or both, using generic or your own target keyword lists
2. You now have a list of sites in your niche (topical); sort the list for scraping
3. To get the equivalent of a homepage link, scrape every single home page for broken links, then check a domain registrar for availability. If a domain is available, buy it, rebuild it via archive.org, or do whatever
4. Want a shit ton of PBN domains? Scrape to the 5th click level. This may take weeks, but you will have a database of domains that will last you years; then do whatever you like with the results
5. Oops, forgot to mention: check all metrics before purchasing a domain, and check archive.org for all the nasties

This is oversimplified, and I do not want to piss off the shit sellers here by spilling how simple this all is, but anyhoo, as requested, that's how you do it.
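A rough sketch of the availability pre-check in step 3. Note that a failed DNS lookup is only a weak hint that a domain might be free; you would still confirm with a registrar or bulk availability checker before buying. The resolve hook below is my own addition so the check can be swapped out or tested offline:

```python
import socket

def possibly_free(domains, resolve=None):
    """Return the subset of domains that fail to resolve via DNS.
    This is a cheap pre-filter, not proof of availability."""
    resolve = resolve or socket.gethostbyname
    free = []
    for domain in domains:
        try:
            resolve(domain)
        except OSError:  # DNS lookup failed (gaierror is a subclass)
            free.append(domain)
    return free
```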
 

This is what I do when I am searching for HQ domains. Manual always beats automation here; I don't even outsource it, as I am very specific about my domains. And because, like you said, bots leave a lot on the table and people are too lazy to go and search manually, you will find some real gems with great backlinks and TF 30/40+ in the less popular niches.
 
thanks, appreciated
 
I would say you are doing well already; it is not an easy thing to do, as there are heavyweights all around focused on these sites. Maybe look for less popular authority sites to extract expired domains from.
 
Thanks. I'm wondering how to make it go faster; I assume more proxies is the way to go. Waiting 2-3 days for 1 domain is... hmm, well, not efficient. I'm using around 30-60 proxies with 100-200 threads, going several levels deep, up to 10 on the biggest sites.
 
I can't find even one domain with good metrics (above 15 TF) that's not spammed.

I'm starting to wonder if finding domains yourself is even possible anymore. Despite using all the methods, and despite investing in equipment that lets me process a huge amount of information with Scrapebox running 24/7 for several days, I still come up empty-handed. I simply don't get it.

:(
 
I think all of Jonaz86's problems are solved by you. Really practical and helpful information.
 
Thank you for the reply. It's kind of crazy to wake up hoping for a reply on BHW about domains that could turn one's business around.

I read that post, and I must admit I raised this question with a guy I know who is successful in SEO. He said that scraping several levels deep is better, and that the sitemap approach pretty much does the same thing but without being able to get as deep, since many websites don't have sitemaps.

Sorry for asking, but why would this method be better than what I'm doing already? I will try it, of course; I'm just curious, as I want to know the details behind it.
 
Also, finding the sitemap is hard on many websites; not only do they hide it, but I've also noticed they use .gz files, which Scrapebox can't read.

EDIT: They're easily downloaded using download managers etc. and can then be extracted, my bad. My post above still stands; ignore this secondary remark.
 