
[Guide] How I got 140 do-follow Web 2.0 subdomain services!

Discussion in 'Black Hat SEO' started by LockerWizPro, May 1, 2016.

  1. LockerWizPro

    LockerWizPro Jr. VIP Jr. VIP

    Joined:
    Aug 10, 2010
    Messages:
    132
    Likes Received:
    204
    I made a post listing 140 do-follow Web 2.0 services that offer free subdomains. I didn't want it to get spammed, so I decided to only give it out via PM; later I restricted it to members with 100+ posts and 25+ thanks received. You can see the post at the link below, and shoot me a PM if you want the list. For those of you with under 100 posts/25 thanks received, or if you want to try to find more Web 2.0 services yourself, here's a guide on how I found those links. Note: this guide takes a long time and is a lot of work. All the steps are manual, but you might be able to automate some of them with a tool not mentioned in this guide.

    Post:
    Code:
    http://www.blackhatworld.com/blackhat-seo/black-hat-seo/840096-get-140-active-do-follow-web-2-0-sites-offer-subdomains.html

    Step 1:
    Head over to Google. This one is simple enough. If you need further help with this step I suggest you stop reading this and maybe never return to BHW.

    Step 2:
    We want to find sites that are hosted on Web 2.0 services, so we can find the services themselves. I wasn't targeting any particular language, so I used numbers in my search. The Google search query I used was "site:*95.*.com". This searches for sites with a subdomain ending in 95 on a .com domain. My username is docvet95, so if I were a novice wanting to make a free site I might make it at docvet95.wix.com. Go ahead and copy the root domains from the results and paste them into a notepad (for example, if I see docvet95.wix.com I will copy wix.com).
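The copy-the-root-domain step above can be sketched in a few lines of Python, if you paste the result URLs into a list instead of doing it by hand. This is a rough helper, not part of the original method; the URLs are made-up examples, and the last-two-labels trick is naive (it won't handle things like co.uk):

```python
# Hypothetical helper: given full subdomain URLs copied from the search
# results, keep only the root domain (e.g. docvet95.wix.com -> wix.com).
from urllib.parse import urlparse

def root_domain(url):
    """Strip the scheme/path and the leftmost subdomain labels."""
    host = urlparse(url).netloc or url      # also accept bare hostnames
    parts = host.split(".")
    # Keep the last two labels; naive, but fine for *.com / *.net roots.
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

urls = [
    "http://docvet95.wix.com/portfolio",
    "http://somethingelse95.wix.com",
    "http://other95.weebly.com/blog",
]
roots = sorted(set(root_domain(u) for u in urls))
print(roots)  # ['weebly.com', 'wix.com']
```

The `set()` also deduplicates for you, which matters once the same service starts showing up on every page.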

    Step 3:

    You'll eventually hit a domain that shows up constantly. For example, you might see docvet95.wix.com for 5 pages or so, or it might switch from docvet95.wix.com to somethingelse95.wix.com, but you don't need those results because you already have wix.com. To avoid this, change the search query to "site:*95.*.com -wix.com". This removes all the results on wix.com. Google only allows 32 words in a search query, so only exclude the sites that really come up a lot.

    Step 4:
    Once you have copied all the results you've found for "site:*95.*.com", change the query to "site:*94.*.com", then 93, and so on. You can use other queries as well, whatever you can think of.
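Steps 3 and 4 are just a query template rotated over number suffixes, minus whatever roots you've already collected, so they're easy to generate in bulk. A minimal sketch (the domain names in the example are placeholders):

```python
# Build one Google query per number suffix, excluding known services
# so they stop flooding the results (Steps 3-4 of the guide).
def build_query(suffix, tld="com", exclude=()):
    q = "site:*{}.*.{}".format(suffix, tld)
    for root in exclude:
        q += " -" + root        # each -domain.com drops that service
    return q

known = ["wix.com", "weebly.com"]
for n in range(95, 92, -1):
    print(build_query(n, exclude=known))
# site:*95.*.com -wix.com -weebly.com
# site:*94.*.com -wix.com -weebly.com
# site:*93.*.com -wix.com -weebly.com
```

Remember the 32-word limit mentioned above: keep the `exclude` list short and only add the worst offenders.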

    Step 5:

    Once you've got all your results for "site:*xx.*.com", switch to "site:*xx.*.net", .org, or any other TLD you want to find. Just repeat the whole process, and keep all your results in your notepad.

    Step 6:
    Now that you have tons of websites in a notepad, you won't be sure which ones are actual free Web 2.0 services and which are just websites that happen to have different subdomains. We'll want to sort through all of these, but a lot of them won't even be in English, so it can be hard to just look at them and see what service they offer.

    I found an online tool that can check languages as well as titles: Netpeak Checker (I am not affiliated in any way). I won't link to it here, but you can find it via Google. Put all the sites in there, export to Excel, then sort by language.

    Step 7:
    Now we have an Excel sheet with all the domains, languages, and titles. Look at the ones that are in English; sometimes the title will be something like "Free Website Builder | Create a Free Website | WIX.com". Copy all the ones that clearly state in the title that they offer a free Web 2.0 service and paste them into a notepad of confirmed sites. For the ones in a language you don't understand, make a column to the right of the domain column (which should be A). In the first row of the new column (B1), put the following -
    Code:
    ="https://translate.google.com/translate?hl=en&u="&A1
    Then drag that cell all the way down to the last row of the domains in the language you don't understand. This gives you a link for each of them that shows the site in English.
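If you'd rather build those translate links outside Excel, the same concatenation works in a couple of lines of Python. This is just an alternative to the spreadsheet formula, assuming your domains are in a plain list; the example domain is made up:

```python
# Build a Google Translate "view this site in English" link per domain,
# same as the spreadsheet formula in Step 7.
from urllib.parse import quote

def translate_url(domain):
    # safe=":/" keeps the embedded scheme readable in the query string
    return ("https://translate.google.com/translate?hl=en&u="
            + quote("http://" + domain, safe=":/"))

for d in ["example95.example.com", "example94.example.net"]:
    print(translate_url(d))
```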

    Step 8:
    Now that you have a list of URLs you can easily understand, you want to check each one to see if it offers a free Web 2.0 service. This would take a long time to do manually, so we'll want a tool that takes a screenshot of each website. I used Grab Them All, a Firefox add-on (I am not affiliated in any way). You can find it at
    Code:
    https://addons.mozilla.org/en-US/firefox/addon/grab-them-all/
    . Once you have screenshots of each website, go through them and start sorting to find the ones that match what we're looking for. I made three folders: one for confirmed sites, one for sites that aren't what we're looking for, and one for sites that need a manual review. I then put each picture in the folder it belongs to. Once you have a folder with all the screenshots of confirmed sites, you can put those websites in a spreadsheet. This is relatively easy because the image names are the websites. Then visit the sites that need a manual review and decide on those yourself.

    Step 9:
    Now you should have a list of confirmed free Web 2.0 services that offer a free subdomain. Next you want to see which ones are do-follow. This step takes the longest. Put all of them back into a spreadsheet (for this we'll say column A), and then in cell B1 put:
    Code:
    ="https://www.google.com/search?q=site%3A*."&A1&"+download"
    Then drag that cell down so it's done for all of them. This gives you a list of URLs, each searching that service's subdomains for the word "download". Usually if a page uses the word "download", it has an outgoing link.
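The same probe URLs can be generated without the spreadsheet. A minimal Python equivalent of the Step 9 formula, assuming your confirmed domains are in a list rather than column A:

```python
# Build the Google "site:*.<domain> download" probe URL for each service,
# matching the spreadsheet formula in Step 9.
from urllib.parse import quote_plus

def dofollow_probe_url(domain):
    # site:*.<domain> download -> subdomain pages mentioning "download"
    query = "site:*.{} download".format(domain)
    return "https://www.google.com/search?q=" + quote_plus(query, safe="*")

print(dofollow_probe_url("wix.com"))
# https://www.google.com/search?q=site%3A*.wix.com+download
```

`safe="*"` keeps the wildcard unencoded so the URL comes out in the same shape as the formula above.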

    Step 10:
    Paste each URL into your browser, click a result that looks like it has an outgoing link, then check whether the site's links are no-follow or do-follow. Note: you have to check links that point to a different domain than the one you're on, because Web 2.0 services will always have do-follow links pointing at their own domain. I right-clicked each link and used Inspect Element in Chrome, but if that's not an option I'm sure you can find another way to check.
    To make this process a little easier, I pasted all the URLs into a site called URL Opener (I am in no way affiliated), which gave me clickable links for each one so I could CTRL+click to open them in new tabs.
    You can find the site at
    Code:
    http://www.urlopener.com/homepage.html
    If you follow each step you should have a unique list of great do-follow Web 2.0 services that offer free subdomains! As you can see, this is a lot of work and will take a long time.

    Bonus:

    I checked a list of all the sites that participated in the SOPA blackout (80,000+ sites), since it included a lot of personal sites on subdomains, and used a JavaScript script to find the ones on subdomains and write out the websites. This was hit or miss, but it gave me a lot of extra sites.
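The original script for this was JavaScript, but the idea is just "keep hosts with a label in front of the registered domain". A rough Python sketch under that assumption (the hostnames are examples, and like before the two-label rule is naive about TLDs such as co.uk):

```python
# Keep only hostnames that look like subdomain sites, the way the
# SOPA-blackout list was filtered in the Bonus step.
def has_subdomain(host):
    parts = host.lower().split(".")
    if parts and parts[0] == "www":   # www. isn't a real subdomain site
        parts = parts[1:]
    # More than two labels left means something.domain.tld
    return len(parts) > 2

hosts = ["docvet95.wix.com", "www.example.com", "blog.user99.typepad.com"]
print([h for h in hosts if has_subdomain(h)])
# ['docvet95.wix.com', 'blog.user99.typepad.com']
```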

    Feel free to ask any questions or post about how you could automate some of the steps!

    Hope you all find this useful!
     
    • Thanks Thanks x 14
  2. chocobon

    chocobon Junior Member

    Joined:
    Aug 9, 2015
    Messages:
    104
    Likes Received:
    16
    thanks for the advice
     
    Last edited: May 5, 2016
  3. REEND

    REEND Newbie

    Joined:
    May 31, 2015
    Messages:
    30
    Likes Received:
    1
    Thanks for the guide man! Thread bookmarked!
     
  4. Crazy Monkey

    Crazy Monkey Jr. VIP Jr. VIP

    Joined:
    Aug 4, 2015
    Messages:
    1,951
    Likes Received:
    239
    Gender:
    Male
    Location:
    In Jungle
    Nice guide, thanks for sharing the information
     
  5. Chiefjop

    Chiefjop Jr. VIP Jr. VIP

    Joined:
    Mar 25, 2013
    Messages:
    570
    Likes Received:
    104
    Thanks! Great share :)
     
  6. chocobon

    chocobon Junior Member

    Joined:
    Aug 9, 2015
    Messages:
    104
    Likes Received:
    16
    Should we put keywords in the subdomain? And on the blog, a link to our site with keyword anchors?
     
  7. abhi007

    abhi007 Jr. VIP Jr. VIP

    Joined:
    Aug 31, 2010
    Messages:
    5,700
    Likes Received:
    3,883
    Location:
    Theatre of dreams :)
    Bookmarked for future use.
     
  8. dhihi

    dhihi Junior Member

    Joined:
    Dec 9, 2010
    Messages:
    121
    Likes Received:
    44
    Quality post OP!
     
  9. Jesse Custer

    Jesse Custer Jr. VIP Jr. VIP Premium Member

    Joined:
    Apr 4, 2015
    Messages:
    351
    Likes Received:
    180
    Great guide, and I learned some new things. You could export the URLs to Scrapebox (or get somebody to do it), and it has a free "********" checker.

    I think this process would be easy to automate in Scrapebox, though I'm far from an expert. You'd just need the right footprints to find the sites and then remove the duplicate top-level domains.

    Thanks also for sending me the list. You're really sharing some great stuff.
     
  10. SEO Panda

    SEO Panda Registered Member

    Joined:
    Apr 14, 2016
    Messages:
    57
    Likes Received:
    13
    Great share, Thanks buddy