
[Method] Broken link building and finding dropped domains with good authority

Discussion in 'White Hat SEO' started by skeye, Mar 21, 2018.

  1. skeye

    skeye Junior Member

    Joined:
    Dec 17, 2015
    Messages:
    197
    Likes Received:
    164
    Occupation:
    CM
    I recently dumped a file with 2,600 404 links found on Wikipedia and another one with 190 dropped domains.

    Here's how you can find 404 links or drops relevant to your niche.

    1. Get a list of authoritative websites from your niche and a list of relevant Wikipedia articles.

    2. Download Netpeak Spider web crawler (2 weeks free, no limits, no CC required)

    3. Set it to check the first website from your list and find only external outgoing links

    [IMG: Netpeak Spider settings with only external outgoing links enabled]
    Disable all other parameters to speed up crawling.

    4. Once it's done crawling, export the list of external links and repeat for every website on your list.

    5. From your exported files, remove all links that aren't of the AHREF type (IMG, IMG AHREF, etc.).

    6. Combine all of the external links you scraped from all of your websites into one spreadsheet. The export has several columns (link type, etc.), but you only need to keep the two that show where each link comes from and where it points to; a scripted version of steps 5-11 is also sketched right after this list:
    [IMG: merged spreadsheet showing the From URL and To URL columns]

    7. Copy all links from the To URL column and paste them into Netpeak Spider. You'll have to switch it to crawl the list of URLs instead of an entire website.

    [IMG: Netpeak Spider switched to crawling a list of URLs]

    8. Change the Spider's settings to check only the status codes of these URLs.

    9. Once crawling is complete you can sort URLs by status code to get your 404 broken links.

    10. Paste these 404 links into your main Sheet with To URL and From URL columns.

    11. Use VLOOKUP to see which page links to each 404 page.

    12. Contact website owners and offer to replace a dead link with a link to your (relevant) page
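
    If you'd rather script steps 5-11 than click through a spreadsheet, here's a rough Python sketch of the same pipeline. The folder, file names, and column names (Type, From URL, To URL) are assumptions based on my own export, so adjust them to whatever your Netpeak Spider export actually uses.

    import glob
    import pandas as pd
    import requests

    # 1) Merge all Netpeak exports into one sheet, keeping only AHREF links
    #    and the two columns we care about (assumed column names).
    frames = []
    for path in glob.glob("exports/*.csv"):          # hypothetical folder of per-site exports
        df = pd.read_csv(path)
        df = df[df["Type"] == "AHREF"]               # drop IMG, IMG AHREF, etc.
        frames.append(df[["From URL", "To URL"]])
    links = pd.concat(frames, ignore_index=True).drop_duplicates()

    # 2) Check the status code of every unique target URL (what steps 7-9
    #    do inside Netpeak Spider). HEAD keeps it light; some servers need GET.
    def status_of(url):
        try:
            return requests.head(url, timeout=10, allow_redirects=True).status_code
        except requests.RequestException:
            return None                              # no response at all -> possible drop

    targets = links["To URL"].drop_duplicates()
    codes = {url: status_of(url) for url in targets}
    links["Status"] = links["To URL"].map(codes)

    # 3) The VLOOKUP step: every 404 target together with the page linking to it.
    broken = links[links["Status"] == 404]
    broken.to_csv("broken_links.csv", index=False)

    # Targets with no status at all are the candidates for the dropped-domain check below.
    links[links["Status"].isna()].to_csv("zero_status.csv", index=False)
    print(broken.head())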

    How to find Dropped domains

    Go back to Step 9

    10. After you sort the URLs by status code, you'll see that some URLs don't have any status at all (not 4xx, not 5xx); they just say Error.
    This means the link from your list points to a non-existent website.

    11. Take these zero-status links and check them through GoDaddy's bulk domain checker (a rough DNS pre-filter is sketched right after the screenshot below).


    [IMG: GoDaddy bulk domain search]
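
    If your zero-status list is huge, a rough pre-filter before pasting anything into GoDaddy is to strip each dead link down to its host and see whether it still resolves in DNS. A domain that doesn't resolve isn't automatically available (and some parked drops still resolve), so treat this only as a way to shrink the list. Minimal sketch, assuming the zero_status.csv file from the earlier sketch and Python 3.9+ for removeprefix:

    import socket
    import pandas as pd
    from urllib.parse import urlparse

    zero = pd.read_csv("zero_status.csv")            # From URL / To URL pairs with no status

    # Reduce every dead link to its host name, de-duplicated.
    hosts = sorted({urlparse(u).netloc.lower().removeprefix("www.")
                    for u in zero["To URL"].dropna()})

    candidates = []
    for host in hosts:
        try:
            socket.gethostbyname(host)               # still resolves -> probably not a drop
        except socket.gaierror:
            candidates.append(host)                  # no DNS record -> worth a bulk check

    print(len(candidates), "of", len(hosts), "domains don't resolve")
    with open("godaddy_bulk_input.txt", "w") as f:
        f.write("\n".join(candidates))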

    Alternatively, here's what I prefer doing:

    Download Netpeak Checker (same company, same free trial)

    Paste your zero-status links into the Checker and set it to check the following parameters:

    Moz Domain Authority (you'll have to get a free API key from Moz)
    WHOIS availability

    And you'll get similar results:

    [IMG: Netpeak Checker results with WHOIS availability and Domain Authority]

    You'll see whether each domain is available (TRUE) and whether it's any good (Domain Authority).
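
    If you want a rough WHOIS availability check without leaving the terminal, the sketch below assumes the python-whois package (pip install python-whois). Registry responses vary a lot, so treat an "available" hit only as a hint and confirm it in Netpeak Checker or GoDaddy before buying. I've left Domain Authority out of the sketch because that part needs your Moz API key, which the Checker already handles for you.

    import whois                                     # module installed by python-whois

    def looks_available(domain):
        try:
            record = whois.whois(domain)
        except Exception:
            # python-whois typically raises on "No match" style responses.
            return True
        # Some registries return an empty record instead of raising.
        return not getattr(record, "domain_name", None)

    with open("godaddy_bulk_input.txt") as f:        # list from the previous sketch
        domains = [line.strip() for line in f if line.strip()]

    for domain in domains:
        print(domain, looks_available(domain))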

    What to do next?

    Once you've found a drop, check its backlink profile with any backlink tool.

    [IMG: backlink profile of a dropped domain]
    This is one of the drops I found while writing this post

    If the profile looks good, buy the domain.

    Restore the site structure by checking which of its pages were linked to most often.

    Set up redirects to your own website.
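
    For the "which pages were linked most often" part, you can reuse the From URL / To URL sheet you already built: filter it to the dropped domain and count how many distinct pages still point at each dead URL. That only covers the sites you crawled (a backlink-tool export gives the fuller picture), but the same grouping works on either file. Rough sketch, with a made-up domain name:

    import pandas as pd
    from urllib.parse import urlparse

    dropped = "example-drop.com"                     # hypothetical: the drop you just bought
    links = pd.read_csv("zero_status.csv").dropna(subset=["To URL"])

    on_drop = links[links["To URL"].apply(
        lambda u: urlparse(u).netloc.lower().removeprefix("www.") == dropped)]

    # Old URLs on the drop, ranked by how many distinct pages link to them.
    priority = (on_drop.groupby("To URL")["From URL"]
                       .nunique()
                       .sort_values(ascending=False))
    print(priority.head(20))
    # The top entries are the ones worth recreating or 301-redirecting
    # to the matching page on your own site.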

    ?????

    Profit.
     
    • Thanks Thanks x 27
  2. wisdomkid

    wisdomkid Elite Member

    Joined:
    Jun 20, 2011
    Messages:
    3,288
    Likes Received:
    1,016
    Wonderful share. Thanks, and thanks given too.
     
  3. aka_ab

    aka_ab Regular Member

    Joined:
    Dec 1, 2017
    Messages:
    262
    Likes Received:
    90
    Gender:
    Male
    Nice share. Thanks!
     
  4. darulez

    darulez Elite Member

    Joined:
    Mar 12, 2013
    Messages:
    3,304
    Likes Received:
    1,172
    Gender:
    Male
    Occupation:
    Messing with Clickz
    Location:
    In da Hood
    Nargil is now out of orders
     
  5. spectrejoe

    spectrejoe Elite Member

    Joined:
    Sep 25, 2013
    Messages:
    2,952
    Likes Received:
    978
    I'm doing this right now to find a decent domain to use as money site, thanks!
     
  6. virtualbyron

    virtualbyron Jr. VIP Jr. VIP

    Joined:
    May 11, 2014
    Messages:
    1,664
    Likes Received:
    1,403
    Occupation:
    [email protected]
    I just read a broken link building method on the Ahrefs blog; your method is so much better!

    This thread is level JR VIP!
     
    • Thanks Thanks x 4
  7. seoz87

    seoz87 Senior Member

    Joined:
    Oct 31, 2008
    Messages:
    870
    Likes Received:
    517
    Gender:
    Male
    Occupation:
    Digital Marketing Something
    Location:
    Pakistan
    Just tried this method. Working perfectly. However, it's taking too much time, since you have to enter sites one by one. Is it possible to add the complete list in one go?
     
  8. skeye

    skeye Junior Member

    Joined:
    Dec 17, 2015
    Messages:
    197
    Likes Received:
    164
    Occupation:
    CM
    No, I don't think so. It either crawls an entire website, so one by one, or a list of URLs, but to get that list of URLs you need to crawl the website first.
     
    • Thanks Thanks x 1
  9. Frap

    Frap Regular Member

    Joined:
    Feb 6, 2018
    Messages:
    236
    Likes Received:
    109
    Gender:
    Male
    Occupation:
    Consultant
    Location:
    New York, NY USA
    Wow nice, thanks!
     
  10. Jure321

    Jure321 Regular Member

    Joined:
    Jan 30, 2016
    Messages:
    457
    Likes Received:
    106
    Gender:
    Male
    Occupation:
    Intermediate IM Guy
    Location:
    Croatia
    great share bro! Thank you very much
     
  11. Andy Malloy

    Andy Malloy Newbie

    Joined:
    Dec 7, 2017
    Messages:
    12
    Likes Received:
    5
    Gender:
    Male
    I was reading an article by Backlinko about this topic and got the same impression. It's like, 'Hi, do you want to do it like a pro?'

    THANKS SKEYE! I'll tell our company's SEO fella about it.
     
    • Thanks Thanks x 1
  12. Nova90

    Nova90 Registered Member

    Joined:
    Jul 5, 2017
    Messages:
    82
    Likes Received:
    30
    Great post! Will definitely be using this method :)
     
  13. spectrejoe

    spectrejoe Elite Member

    Joined:
    Sep 25, 2013
    Messages:
    2,952
    Likes Received:
    978
    For the dropped domains: I don't get the "Error" server status only for the dropped domains. I ran ALL the errors through the spider and there were plenty that were available, but having to run all the errors makes the list turn HUGE...
     
  14. skeye

    skeye Junior Member

    Joined:
    Dec 17, 2015
    Messages:
    197
    Likes Received:
    164
    Occupation:
    CM
    You're probably doing something different from what I did. Can you explain a bit more about the problem you're facing? My initial list of external links was too big as well, so I asked their support and they showed me how to use a RegEx filter to remove domains that are 100% not drops, like wikimedia, youtube, bbc, etc. That narrowed the list down from 200K to 60K.

    Other than that I didn't have any problems.
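
    For anyone else hitting the same wall, the filter idea is easy to reproduce outside the tool too. This is just a rough Python version with a few example domains, not the exact pattern their support gave me, and the input file name is whatever you saved your merged From URL / To URL sheet as:

    import pandas as pd

    # Domains that will never be drops; extend the list as you spot more.
    NEVER_DROPS = r"(wikipedia|wikimedia|youtube|bbc|google|facebook)\."

    links = pd.read_csv("all_external_links.csv")    # your merged From URL / To URL sheet
    filtered = links[~links["To URL"].str.contains(NEVER_DROPS, case=False, na=False, regex=True)]
    filtered.to_csv("external_links_trimmed.csv", index=False)
    print(len(links), "->", len(filtered))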
     
  15. spectrejoe

    spectrejoe Elite Member

    Joined:
    Sep 25, 2013
    Messages:
    2,952
    Likes Received:
    978
    I'm 100% doing the same thing you said in your post: I crawl a whole site and export the external links.

    Filter out all NON AHREF links.

    Get the links into Scrapebox, trim to root, remove duplicates, put them back into the program, crawl the list of URLs again FOR STATUS CODES, and export the URLs + status codes.


    Should have "Error" status codes but I don't have them despite having available domains
     
  16. skeye

    skeye Junior Member

    Joined:
    Dec 17, 2015
    Messages:
    197
    Likes Received:
    164
    Occupation:
    CM
    That's weird. Did you try sorting by status before exporting, to see if it works inside the program? I think after I export, the ones with an Error status end up with empty cells in the status code column.

    Try running the list through Checker software and see if it displays status codes correctly.

    I would contact their support about it, they helped me with RegEx and it was fast and painless.
     
  17. phoenix9

    phoenix9 Power Member

    Joined:
    May 30, 2012
    Messages:
    639
    Likes Received:
    234
    Wow thanks man this is GOLD
     
  18. terrycody

    terrycody Elite Member

    Joined:
    Sep 29, 2012
    Messages:
    3,032
    Likes Received:
    1,033
    Occupation:
    marketer
    Location:
    Hell
    This is a nice method you've shared! Though my PC can't run this tool (for a personal reason lol), so I just can't follow along.

    I tried to decipher the whole process as text in my head but got a bit of a headache lol. The conclusion of your first part:

    "Contact website owners and offer to replace a dead link with a link to your (relevant) page"

    This is the basic end point as I understand it, so could you write up the theory behind this tool? Like, a site has broken outbound links, so you simply use this tool to find those links?!
     
  19. skeye

    skeye Junior Member

    Joined:
    Dec 17, 2015
    Messages:
    197
    Likes Received:
    164
    Occupation:
    CM
    Pretty much. You use it to crawl a website and see if there are any 404 external links. Then you check where each broken link comes from and where it was supposed to lead.
    For example, say it's in an article about bikes and links to another article that says the Ducati 1199 has a top speed of 234 kph. You simply create a post about the Ducati 1199 listing its characteristics, then email them saying they've lost a source and here's one they can link to instead.
     
    • Thanks Thanks x 1
  20. Seven4

    Seven4 Elite Member

    Joined:
    Nov 28, 2013
    Messages:
    1,938
    Likes Received:
    336
    Nice thread. Excellent method!