Finding PBNs (strategy)

Discussion in 'Black Hat SEO' started by Jonaz86, Jan 2, 2016.

  1. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Hello guys,

    I have a server running 24/7 that extracts links from high-authority websites, CNN etc. Despite extracting links 10-11 levels deep, I come up short with only 30-40 domains, which really sucks. The highest TF/CF is around 20/30, and those are spammed; all the others are below 15.
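
    For reference, the crawl logic is roughly this (a simplified Python sketch using requests and BeautifulSoup, not my actual setup; the seed URL and depth limit are just placeholders):

    ```python
    # Depth-limited, breadth-first crawl that collects the external domains a seed site links to.
    import urllib.parse
    from collections import deque

    import requests
    from bs4 import BeautifulSoup

    def crawl_outbound(seed_url, max_depth=3):
        seed_host = urllib.parse.urlparse(seed_url).netloc
        seen_pages = {seed_url}
        found_domains = set()
        queue = deque([(seed_url, 0)])

        while queue:
            url, depth = queue.popleft()
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                link = urllib.parse.urljoin(url, a["href"])
                host = urllib.parse.urlparse(link).netloc
                if not host:
                    continue
                if host == seed_host:
                    # internal link: keep crawling until the depth limit
                    if depth < max_depth and link not in seen_pages:
                        seen_pages.add(link)
                        queue.append((link, depth + 1))
                else:
                    # external link: candidate source of expired domains
                    found_domains.add(host.lower())
        return found_domains
    ```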

    Can someone please give me some advice here? I want to master scraping, but I can't seem to get anything to work using this method. I'm desperate... I know most guys don't care to help, but the ocean is big enough for all the fish to live and prosper together ;)

    Thanks in advance
     
  2. deancow

    deancow Jr. VIP Jr. VIP

    Joined:
    Jul 8, 2009
    Messages:
    682
    Likes Received:
    243
    Everyone and their cat is scraping the big authority sites: CNN, BBC etc. Go for lower-authority sites, very high quality directories, etc.
     
  3. qlithe

    qlithe Power Member

    Joined:
    Feb 14, 2012
    Messages:
    656
    Likes Received:
    96
    What software are you using for scraping? Your own bot?
     
  4. qrazy

    qrazy Senior Member

    Joined:
    Mar 19, 2012
    Messages:
    1,115
    Likes Received:
    1,723
    Location:
    Banana Republic
    Scraping individual sites would be too tedious and time-consuming. Try it this way:
    1. Find a list of domains that are about to be deleted/expired (check the pre-release lists etc. to get these lists by date).
    2. Run bulk metrics for DA/TF etc.; now you have a metrics-filtered set of domains.
    3. Match them against the list of authority sites (CNN etc.) using the Google index or any backlink analyzer to see if your domains have backlinks on those sites (rough sketch of this step below).
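
    A minimal Python sketch of the matching step, assuming you already have two plain-text files: a metrics-filtered list of expiring domains and a dump of URLs that the authority sites link out to (the file names are hypothetical placeholders):

    ```python
    # Cross-reference candidate expiring domains against links exported/scraped from authority sites.
    import urllib.parse

    def load_domains(path):
        # one bare domain per line, e.g. "example.com"
        with open(path) as f:
            return {line.strip().lower() for line in f if line.strip()}

    def load_link_hosts(path):
        # one full URL per line; reduce each to its host
        hosts = set()
        with open(path) as f:
            for line in f:
                host = urllib.parse.urlparse(line.strip()).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]
                if host:
                    hosts.add(host)
        return hosts

    candidates = load_domains("expiring_domains_filtered.txt")       # output of step 2
    authority_links = load_link_hosts("authority_outbound_urls.txt")

    # domains that are both expiring and already linked from the authority sites
    for domain in sorted(candidates & authority_links):
        print(domain)
    ```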
     
    • Thanks x 2
  5. tb303

    tb303 Power Member

    Joined:
    Dec 18, 2011
    Messages:
    734
    Likes Received:
    388
    I start with the sitemap rather than trying to scrape x levels deep.

    1. Extract a list of all URLs from the target site.
    2. Go through the list with software to get all the external links.
    3. Trim the results to root domains and remove dupes (sketch of steps 1-3 below).
    4. Send the resulting domain list to a bulk availability checker.
    5. Get the metrics for the available ones and filter it down some.

    I'm still finding good old domains, but as mentioned above, everyone's been through the obvious ones.
    You need a good starting point.
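
    If it helps, here's a rough Python sketch of steps 1-3, assuming a plain urlset sitemap and using requests and BeautifulSoup (for true root-domain trimming you'd want something like tldextract; this just strips "www."):

    ```python
    # Steps 1-3: pull URLs from a sitemap, collect external links, trim to unique domains.
    import urllib.parse
    import xml.etree.ElementTree as ET

    import requests
    from bs4 import BeautifulSoup

    def sitemap_urls(sitemap_url):
        xml = requests.get(sitemap_url, timeout=10).text
        # <loc> elements hold the page URLs (tag names carry the sitemap namespace)
        return [el.text for el in ET.fromstring(xml).iter() if el.tag.endswith("loc")]

    def external_domains(page_urls, own_host):
        domains = set()
        for url in page_urls:
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                host = urllib.parse.urlparse(urllib.parse.urljoin(url, a["href"])).netloc.lower()
                if host and own_host not in host:
                    domains.add(host[4:] if host.startswith("www.") else host)  # crude trim + dedupe
        return sorted(domains)
    ```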
     
  6. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Well, I'm basically using Scrapebox. So you're saying the method is good, it's just that I need to target lesser-known websites that still have strong authority?
     
  7. fc-dh

    fc-dh Elite Member

    Joined:
    Oct 20, 2012
    Messages:
    1,800
    Likes Received:
    1,393
    Occupation:
    Blackhatting
    Location:
    Den Haag | Netherlands
    This is what I like to do; you will need paid accounts for Ahrefs and/or Majestic SEO.

    1. Search Google for your broad niche keyword, with the search date in Google set between 2001 and 2005.
    2. Now scrape, or manually copy/paste, the authority sites.
    3. Copy each URL into Ahrefs and/or Majestic SEO.
    4. Download the list of backlinks.
    5. Clean the backlinks down to domain.com (rough sketch below).
    6. Go to a bulk availability checker and see what is available.
    7. Check the metrics/backlink profile and register the domains for $1.99 (if you use fresh GoDaddy accounts with coupons).

    I have found a lot of great domains that way; it is manual labor though....
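
    For step 5, a minimal Python sketch of cleaning a backlink export down to unique domain.com entries (the file and column names are hypothetical; adjust them to whatever your Ahrefs/Majestic export actually uses):

    ```python
    # Reduce a backlink export CSV to unique domains, ready for a bulk availability checker.
    import csv
    import urllib.parse

    unique_domains = set()
    with open("backlinks_export.csv", newline="") as f:        # placeholder file name
        for row in csv.DictReader(f):
            url = row.get("Referring Page URL", "")            # hypothetical column name
            host = urllib.parse.urlparse(url).netloc.lower().split(":")[0]
            if host.startswith("www."):
                host = host[4:]
            if host:
                unique_domains.add(host)

    with open("domains_to_check.txt", "w") as out:
        out.write("\n".join(sorted(unique_domains)))
    ```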
     
    • Thanks x 2
  8. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Hi there,

    I'm basically using this alternative method as well, but with Scrapebox for the entire process. I've heard that going back that far in time isn't good, since it turns up far more potentially spammed domains and a lot of the links have expired, but I'm not quite sure about that myself.

    So you only take the top 10-20 sites that come up for each niche instead of scraping through all of them? Makes sense!

    Thanks
     
  9. accelerator_dd

    accelerator_dd Jr. VIP Jr. VIP

    Joined:
    May 14, 2010
    Messages:
    2,448
    Likes Received:
    1,009
    Occupation:
    SEO
    Location:
    IM Wonderland
    Good advice right there! Manual almost always means most people are too lazy to do it, which means a higher chance of success.

    You can take it one step further and also add site:domain.com for each authority domain to get more related pages on those authority sites, then rinse and repeat.
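
    A tiny Python sketch of building those queries, just to make the idea concrete (the domains and keyword are placeholders):

    ```python
    # Turn authority domains into site: queries for whatever Google scraper you use.
    authority_domains = ["cnn.com", "bbc.co.uk"]     # placeholder examples
    keyword = "your niche keyword"                   # placeholder
    queries = [f'site:{domain} "{keyword}"' for domain in authority_domains]
    print("\n".join(queries))
    ```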
     
  10. Ambitious12

    Ambitious12 Elite Member

    Joined:
    Jun 26, 2014
    Messages:
    3,097
    Likes Received:
    608
    Occupation:
    No Occupation
    Location:
    Among the Stars
    Haven't you tried Scrapebox? It is the perfect thing for finding domains with high TF/CF. Try it once.
     
  11. GoTRooT

    GoTRooT Power Member

    Joined:
    Jun 21, 2010
    Messages:
    514
    Likes Received:
    242
    Occupation:
    England
    Location:
    England
    1. Scrape Google by country, keyword or both, using a generic keyword list or your own target keyword list.
    2. You now have a list of sites in your niche (topical); now sort your list for scraping.
    3. To get the equivalent of a homepage link, just scrape every single home page for broken links, then ping a domain registrar for availability. If it's available, buy it, rebuild it via archive.org or do whatever (see the sketch below).
    4. Want a shit ton of PBN domains? Scrape to the 5th click level. This may take weeks, but you will have a database of domains that will last you years; then do whatever you like with the results.
    5. Oops, forgot to mention: check all metrics before any domain purchase, and check archive.org for all the nasties.

    This is oversimplified, and I do not want to piss off the sellers here by spilling how simple this all is, but anyhoo, as requested, that's how you do it.
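
    A rough Python sketch of step 3, assuming requests and BeautifulSoup; note that a failed DNS lookup is only a first-pass hint, so confirm with a real bulk availability checker before buying anything:

    ```python
    # Find dead outbound links on a homepage, then do a crude first-pass
    # availability check via DNS. Not a substitute for a proper availability check.
    import socket
    import urllib.parse

    import requests
    from bs4 import BeautifulSoup

    def possibly_available_domains(homepage_url):
        html = requests.get(homepage_url, timeout=10).text
        own_host = urllib.parse.urlparse(homepage_url).netloc
        candidates = set()
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urllib.parse.urljoin(homepage_url, a["href"])
            host = urllib.parse.urlparse(link).netloc
            if not host or host == own_host:
                continue
            try:
                if requests.head(link, timeout=10, allow_redirects=True).status_code < 400:
                    continue                    # link still works, skip it
            except requests.RequestException:
                pass                            # connection errors count as "broken" too
            try:
                socket.gethostbyname(host)      # resolves: domain is still registered/parked
            except socket.gaierror:
                candidates.add(host.lower())    # no DNS at all: worth checking availability
        return sorted(candidates)
    ```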
     
    • Thanks x 1
  12. fc-dh

    fc-dh Elite Member

    Joined:
    Oct 20, 2012
    Messages:
    1,800
    Likes Received:
    1,393
    Occupation:
    Blackhatting
    Location:
    Den Haag | Netherlands
    This is what I do when I'm searching for HQ domains. Manual always beats automation here; I don't even outsource it, as I'm very specific about my domains. And because, like you said, bots leave a lot on the table and people are too lazy to go and search manually, you will find some real gems with great backlinks and TF 30/40+ in the less popular niches.
     
    • Thanks x 2
  13. Luka19

    Luka19 Power Member

    Joined:
    Jun 23, 2014
    Messages:
    517
    Likes Received:
    125

    Thanks, appreciated.
     
  14. seanbadwin

    seanbadwin Newbie

    Joined:
    Jan 5, 2016
    Messages:
    10
    Likes Received:
    0
    I would say you are doing well already; it is not a very easy thing to do, as there are heavyweights all around focused on these sites. Maybe look for the less popular authority sites to extract expired domains from.
     
  15. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Thanks. I'm wondering how to make it go faster; I assume more proxies is the way to go. Waiting 2-3 days for 1 domain is... hmm, well, not efficient. I'm using around 30-60 proxies with 100-200 threads, going several levels deep, up to 10 on the biggest sites.
     
  16. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    I can't find even 1 domain with good metrics (above 15 TF) that's not spammed.

    I'm starting to wonder if finding domains yourself is even possible anymore. Despite using all the methods, and despite investing in equipment that lets me process huge amounts of information with Scrapebox running 24/7 for several days, I still come away empty-handed. I simply don't get it.

    :(
     
  17. RoXt3R

    RoXt3R Jr. VIP Jr. VIP

    Joined:
    Apr 24, 2011
    Messages:
    370
    Likes Received:
    20
    Gender:
    Male
    Occupation:
    Internet Marketing
    Location:
    UK
    I think all of Jonaz86's problems have been solved by you. Really useful and helpful information.
     
  18. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Thank you for the reply. It's kind of crazy when you wake up hoping for a reply on BHW about domains to turn your business around.

    I read that post, and I must admit I did raise this question with a guy I know who is successful in SEO. He said that scraping several levels deep is better, and that the sitemap approach pretty much does the same thing but can't get as deep, since many websites don't have sitemaps.

    Sorry for asking, but why would this method be better than what I'm doing already? I will try it, of course; I'm just curious, as I want to know the details behind it.
     
  19. Jonaz86

    Jonaz86 Junior Member

    Joined:
    Sep 16, 2015
    Messages:
    140
    Likes Received:
    5
    Also, finding the sitemap is hard on many websites. Not only do they hide it, but I've also noticed they use .gz files, which Scrapebox can't read...

    EDIT: They're easy to download with download managers etc. and can then be extracted, my bad. But my post above still stands; ignore this secondary reply.
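
    For anyone else hitting this, a quick Python sketch of fetching a gzipped sitemap and pulling out the URLs (the sitemap URL is a placeholder; if the server sends it with Content-Encoding: gzip instead of as a .gz file, requests will already have decompressed it):

    ```python
    # Download a .gz sitemap, decompress it, and list the <loc> URLs.
    import gzip
    import xml.etree.ElementTree as ET

    import requests

    resp = requests.get("https://example.com/sitemap.xml.gz", timeout=10)   # placeholder URL
    xml = gzip.decompress(resp.content).decode("utf-8")
    urls = [el.text for el in ET.fromstring(xml).iter() if el.tag.endswith("loc")]
    print(len(urls), "URLs extracted")
    ```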
     
    Last edited: Jan 11, 2016