
Free High PR4-9 Backlinks

Discussion in 'White Hat SEO' started by PriorityMarketers, Feb 4, 2010.

  1. PriorityMarketers

    PriorityMarketers Newbie

    Joined:
    Jan 7, 2010
    Messages:
    13
    Likes Received:
    46
    No, you didn't read the title wrong. I've figured out a few tricks for getting very high PR links. These are not nofollow links, they are 100% follow, but there's definitely a trick to it. I was really debating whether or not to share this technique, but hell, I've gotten so much out of BHW that I thought it only fair to try and make my own contributions.

    On to the show, here's how you get your High PR one way backlink (this one is PR7 with your anchor text). You will find that it's very much like StatBlaster's technique, but with a twist allowing you to get targeted backlinks instead of the standard WhoIs links that everyone can (and does) get.

    Just take this URL:

    Code:
    http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[TLD]&type=Brief&spider=1
    
    Replace [TLD] with your full domain (without the http://, since that's already in the URL), so in my case it will be www.prioritymarketers.com (my site that I've never finished developing).

    Code:
    http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2Fwww.prioritymarketers.com&type=Brief&spider=1
    
    This page will spider your entire site, list all of the links it finds, and use each page's title as its anchor text.
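    If you want to build that URL for any domain without hand-encoding it, here's a minimal Python sketch (the function name and domain are just my own examples, not part of the technique):

    Code:
    import urllib.parse

    def spider_link(domain):
        # Percent-encode "http://<domain>" so it survives inside the url= query parameter
        target = urllib.parse.quote("http://" + domain, safe="")
        return ("http://valet.webthing.com/link/link.cgi?url="
                + target + "&type=Brief&spider=1")

    print(spider_link("www.prioritymarketers.com"))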

    Now, the trick is to get that page indexed; since this page isn't linked to from anywhere, it wouldn't normally get indexed. So what you need to do is link to it from a page that you know Google will crawl: somewhere like Yahoo Answers, Squidoo, any of the social bookmarking sites... You get the idea, just take your pick.

    Once you've linked it, ping the page you linked it from (the Yahoo Answers page, Squidoo page, whatever).
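    I just do the ping through one of the ping sites, but if you want to script that part, a standard weblogUpdates XML-RPC ping looks roughly like this (the ping endpoint and the Squidoo URL here are just examples, swap in whatever you actually use):

    Code:
    import xmlrpc.client

    # rpc.pingomatic.com is one common ping service endpoint; use whichever service you like
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")

    # weblogUpdates.ping takes (name of the page, URL of the page that links to your spider URL)
    result = server.weblogUpdates.ping("My Squidoo lens",
                                       "http://www.squidoo.com/your-lens-here")
    print(result)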

    There are tons of other sites that allow the same thing; some of them will even give you links using your title as the keyword, or spider your page and put your keywords on the same page as your links. If there's enough interest, I'm thinking about writing a tool that incorporates this technique (and probably Angela's backlinks, StatBlaster's, and a few more principles rolled into one).

    If anyone wants to add to this list, please PM me your additions and I'll add them. In general, to find these types of sites you want to look for: online link checkers, online spider sites (i.e. "spider my site"), page rank sites (they often have a link to the page they're checking), meta tag generators (oftentimes they don't block HTML and show you a preview, which can have your backlink in it), HTML previewers... you get the idea. Think outside the box. Most of these pages don't show the full URL in the address bar when they're displaying the data, but as long as it's a post back (i.e. you have to submit a form and the page reloads) you can usually get this technique to work.

    So if you have links to pages you've found that have this type of data, send me a PM and I'll see if I can extract a usable link from the page that can be indexed. However, please read this entire post, as at the bottom I've listed some of the most common sites that don't work with the technique. Also bear in mind that you have to check the site's robots.txt, as some disallow robots from indexing their generated pages, making your efforts fruitless.
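    A quick way to do that robots.txt check without eyeballing the file yourself is something like the sketch below. It only covers robots.txt, not a noindex meta tag on the generated page, so it's a first pass, not a guarantee:

    Code:
    import urllib.parse
    import urllib.robotparser

    def crawlable(page_url, user_agent="Googlebot"):
        # Fetch the tool site's robots.txt and test whether the generated page is allowed
        parts = urllib.parse.urlsplit(page_url)
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url(parts.scheme + "://" + parts.netloc + "/robots.txt")
        rp.read()
        return rp.can_fetch(user_agent, page_url)

    print(crawlable("http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2Fwww.example.com&type=Brief&spider=1"))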

    I've compiled a list of a few more sites below. They've been filtered to make sure they are all follow links and not blocked by a robots.txt file. For your reference, [LURL] is your full URL without the http:// (www.domain.com/folder/page.html), [TLD] is just your domain (www.domain.com), and [KEYWORD] is the anchor text you want.
     
    • Thanks x 24
    Last edited: Feb 4, 2010
  2. twix70

    twix70 Regular Member

    Joined:
    Jan 1, 2009
    Messages:
    357
    Likes Received:
    54
    Nice share, not only giving some sites but what should be more important to most, the method to find them. Black hats off to you!
     
  3. peter2002

    peter2002 Senior Member

    Joined:
    Jul 8, 2009
    Messages:
    1,120
    Likes Received:
    24,314
    Occupation:
    Internet Marketer, Degrees in Business and Psychol
    Location:
    USA
    Sounds interesting. I am going to check this out with one of my domains.
     
  4. xenoxen

    xenoxen Jr. VIP

    Joined:
    Jul 22, 2009
    Messages:
    801
    Likes Received:
    187
    Occupation:
    online.
    Location:
    Europe
    Home Page:
    really nice.. I got an error because of the www and http:// in my site URL. Need to fix this :/
     
  5. PriorityMarketers

    PriorityMarketers Newbie

    Joined:
    Jan 7, 2010
    Messages:
    13
    Likes Received:
    46
    I poked around a bit more with all of them and have an update: now you can put your anchor text into a lot of them. Basically, most sites don't escape HTML entered into their forms. So instead of entering just a URL, you enter something like:

    Code:
    http://www.domain.com">keyword</a>
    
    Believe it or not, it actually works (or a variation of it does) on a lot of these sites. So below is an updated list.

    Make sure that if your LURL is just www.domain.com, you put a forward slash (/) on the end of it, otherwise some of the links won't work right.
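    The only fiddly part when building these by hand is URL-encoding the injected anchor. Here's a quick sketch of how I'd generate the encoded value that goes into the url parameter (the domain and keyword are just placeholders):

    Code:
    import urllib.parse

    def injected_url_param(lurl, keyword):
        # The raw value we want the tool to echo back: URL, closing quote, then our own anchor tag
        payload = 'http://' + lurl + '">' + keyword + '</a>'
        # Percent-encode everything so it survives as a single query-string value
        return urllib.parse.quote(payload, safe="")

    print(injected_url_param("www.domain.com/", "my keyword"))
    # -> http%3A%2F%2Fwww.domain.com%2F%22%3Emy%20keyword%3C%2Fa%3E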

    Keyword anchor text:
    Code:
    http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[TLD]&type=Brief&spider=1    //PR7 - shows link to all pages using title of page as the link!
    http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E //PR7 keyword is anchor text
    http://www.guistuff.com/tools/navbar/assemble3.cgi?navlinks=[KEYWORD]+%7C+http%3A%2F%2F[LURL]&enterer=Create%21 //PR6 shows your keywords in the link text!
    http://www.freewebsubmission.com/cgi-bin/meta.cgi?description=%3Ca+href%3D%22http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E&submit=Generate+Tags    //PR6 your keywords is anchor text
    http://www.freewebsubmission.com/cgi-bin/metatag-analyze.cgi?url=http%3A%2F%2F[LURL]&Submit=Submit    //PR6 your page title is your anchor text
    http://www.xml-sitemaps.com/details-[TLD].html //PR6 contains a link to the next url, reality is this one isn't that great, but it's required to generate the sitemap below
    http://www.xml-sitemaps.com/download/[TLD]/sitemap.html?view=1 //PR6 Sitemap with page titles and everything!
    http://www.webrankinfo.com/english/tools/seo-toolbox.php?url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E    //PR6 Keyword is Anchor text
    http://www.freewebsubmission.com/cgi-bin/popular.cgi?url1=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E    //PR6 Keyword is anchor text (it strips spaces from keyword though)
    http://www.webuildpages.com/seo-tools/spider-test/index.php?c=1&url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E    //pr5 
    http://www.webuildpages.com/seo-tools/keyword-density/index.php?&url=%22%2F%3E%3Ca+href%3Dhttp%3A%2F%2F[LURL]%3E[KEYWORD]%3C%2Fa%3E    //PR5 keyword is anchor text
    http://www.webmaster-toolkit.com/link-checker.shtml?url=http%3A%2F%2F%2522%3E%253C%2Fa%253E%3Ca+href%3D%2522http%3A%2F%2F[LURL]%2522%3E[KEYWORD]%253C%2Fa%253E&type=href    //pr5 Keyword is anchor text
    http://www.2bone.com/links/cgi-bin/2check.cgi?o=f&spider_url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E&o=s    //PR4 Shows links, no keywords
    http://www.indiabook.com/cgi-bin/check.pl?command=check&url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E&submit=Submit //pr4 Keyword is anchor text
    http://www.thedreamtime.com/cgi-bin/links/spider.cgi?URL=http%3A%2F%2F[LURL]    //PR4 - shows title in link! 
    http://www.sitesolutions.com/webtools.asp?F=check&url=[LURL]%22%3E[KEYWORD]%3C%2Fa%3E&submit1=Run+Link+Checker    //PR4 uses a link trick for url field to get keyword anchor
    http://www.gorank.com/analyze.php?url=http%3A%2F%2F[LURL]%3F%22%3E[KEYWORD]%3C%2Fa%3E&keyword=[KEYWORD]&Submit=Analyze+Page    //pr4 KEYWORD is anchor text
    http://www.seobench.com/search-engine-crawler-simulator/index.php?c=1&url=http%3A%2F%2F[LURL]%3F%22%3E[KEYWORD]%3C%2Fa%3E    //PR3 keyword is anchor text
    http://www.iwebtool.com/pagerank_checker?domain=http%3A%2F%2F[LURL]%3F%25%22%3E[KEYWORD]%3C%2Fa%3E    //PR3 keyword is anchor text
    
    Links with keywords somewhere on the page, but not in the anchor text (these all worked with the anchor text trick, so you don't really need to use them unless you want keyword-dense pages):
    Code:
    http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[LURL]    //PR7 will index ALL of the pages, page titles are shown on the page, but not on the link text
    http://www.webuildpages.com/seo-tools/spider-test/index.php?c=1&url=http%3A%2F%2F[LURL]    //pr5 link, keywords somewhere on page
    http://www.webuildpages.com/seo-tools/keyword-density/index.php?c=1&url=http%3A%2F%2F[LURL]&minlength=3&minoc=2&ikey=1&ides=1&stopwords=1 //pr5 link, keyword analysis
    http://www.gorank.com/analyze.php?url=http%3A%2F%2F[LURL]&keyword=[KEYWORD]&Submit=Analyze+Page    //pr4 shows main page, with keyword analysis and page titles on page
    
    Links that don't have your keyword text anywhere on the page or in the anchor:
    Code:
    http://validator.w3.org/checklink?uri=http%3A%2F%2F[LURL]&hide_type=all&depth=&check=Check#results1    //PR9 - W3C's link checker creates link to http://[LURL] only
    http://www.coffeecup.com/firefactor/tools/link_check?url=http%3A%2F%2F[LURL]&x=0&y=0 //PR9 - [LURL] is anchor text
    http://www.htmlhelp.com/cgi-bin/validate.cgi?url=http%3A%2F%2F[LURL]&spider=0        //PR7 will show link for http%3A%2F%2F[LURL] only
    http://htmlhelp.com/cgi-bin/validate.cgi?url=http://[TLD]&spider=1        //PR7 will spider entire site and show links for everything found
    http://www.onlinewebcheck.com/check.php?url=http%3A%2F%2F[LURL]&submittype=Check+Url    //PR5 - shows link, no keywords
    http://www.1-hit.com/all-in-one/php/tool-broken-link-finder.php?url=http%3A%2F%2F[LURL]  //PR5 will index links on page, no keywords
    http://www.axandra.com/free-online-seo-tool/broken-link-checker.php?url=http%3A%2F%2F[LURL]    //pr5 link no keywords
    http://centralops.net/co/DomainDossier.aspx?__VIEWSTATE=&addr=[URL]&x=9&y=7    //pr4 link no keywords
    http://www.ipaddresslocation.org/cgi-bin/wp.cgi?query=http%3A%2F%2F[LURL]&btnGo=Check+Ip+Address    //PR4 link to http://LURL
    http://www.v3whois.com/w/[URL]    //PR4
    http://www.v3whois.com/w/[TLD]/    //PR4
    http://www.v3whois.com/w/[TLD]/[KEYWORD]    //PR4  keyword adds your keyword to the url which should help ranking
    
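    If you want to churn all of these out at once for a site, here's a rough sketch that just does the placeholder substitution over the lists above (paste the URLs from the lists into the templates, minus my // comments; the two shown are only examples):

    Code:
    import urllib.parse

    templates = [
        "http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[TLD]&type=Brief&spider=1",
        "http://valet.webthing.com/link/link.cgi?url=http%3A%2F%2F[LURL]%22%3E[KEYWORD]%3C%2Fa%3E",
        # ...paste the rest of the list here
    ]

    def fill(template, tld, lurl, keyword):
        # TLD and LURL drop in as-is; the keyword gets percent-encoded since it
        # sits inside an already-encoded query value
        return (template.replace("[TLD]", tld)
                        .replace("[LURL]", lurl)
                        .replace("[KEYWORD]", urllib.parse.quote(keyword, safe="")))

    for t in templates:
        print(fill(t, "www.domain.com", "www.domain.com/", "my keyword"))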
     
    Last edited: Feb 5, 2010
  6. PriorityMarketers

    PriorityMarketers Newbie

    Joined:
    Jan 7, 2010
    Messages:
    13
    Likes Received:
    46
    Could you be a bit more specific? Please provide an example of a link that gave you an error.
     
  7. Gozpy

    Gozpy Regular Member

    Joined:
    Nov 18, 2008
    Messages:
    244
    Likes Received:
    64
    Sadly, most (I didn't have time to check them all) of those pages are disallowed in robots.txt or set to 'noindex'. So really you are just wasting people's time.
     
  8. Abstroose

    Abstroose Elite Member

    Joined:
    Nov 20, 2008
    Messages:
    2,097
    Likes Received:
    3,475
    Occupation:
    Thai Boxer
    Location:
    UK
    Home Page:
    Nice post, thanks for sharing the tip. I actually had an idea like this a while ago but didn't realise it was possible to get dynamic pages indexed.

    How can we check if the robots.txt allows or disallows dynamic pages to be crawled?
     
  9. PriorityMarketers

    PriorityMarketers Newbie

    Joined:
    Jan 7, 2010
    Messages:
    13
    Likes Received:
    46
    Take a look at that second list I posted and tell me which ones are noindex or blocked by robots.txt. I filtered out twice as many as made it onto the list for this very reason, so please do tell, because I'm interested to know if there's something I'm missing.

    There are only a couple that I saw that were questionable. The validator.w3.org one I included because the robots.txt doesn't spell out the full path; it uses a generic /check instead of /checklink, and from my experience some search engines won't count a "Disallow: /check" as disallowing a "checklink?" URL, because it doesn't include the * wildcard.

    The same thing applies to thedreamtime.com; in fact, if you search Google with inurl: you will find that they've indexed the supposedly disallowed pages.

    So seriously, if there's a bunch I'm missing as you indicate, then there's something I seriously don't understand about how robots.txt and the meta tags work, and I've done quite a bit of research in this area. So either post, or PM me, the ones I'm missing that are being blocked.
     
    Last edited: Feb 7, 2010
  10. PriorityMarketers

    PriorityMarketers Newbie

    Joined:
    Jan 7, 2010
    Messages:
    13
    Likes Received:
    46

    Check out: http://www.robotstxt.org for info on how to use robots.txt and meta robot tags.
     
  11. sachinkumar

    sachinkumar Newbie

    Joined:
    Oct 20, 2010
    Messages:
    5
    Likes Received:
    0
    A little bit tough.