Outranking An Entrenched Site

Discussion in 'Black Hat SEO' started by tygrus, Aug 31, 2010.

  1. tygrus

    tygrus Supreme Member

    Joined:
    Mar 28, 2009
    Messages:
    1,237
    Likes Received:
    827
    Occupation:
    Engineer
    Location:
    Canada
    I recently came across what I think is an opportunity to outrank a major site. Details below.

    searches per month = 246,000
    PR = 6 (main page)
    Backlinks = 18k
    domain backlinks = 18k
    keyword in domain = nope
    keyword in subdomain = nope
    keyword in title = nope
    keyword in h1 tag = nope.

    They have high PR and lots of links, but because they didn't do any on-page SEO, I am thinking I could outrank them using an established web 2.0 property like HubPages if I kept my on-page SEO tight. I think I would only need a fraction of their backlinks, since Google would give more weight to the on-page SEO efforts. Any thoughts?
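
    The on-page checks listed above can be sketched as a quick script. This is only an illustrative sketch in Python -- the regex-based parsing and the sample HTML are mine, not the actual competitor's page:

```python
import re

def onpage_check(keyword, domain, html):
    """Rough check of which basic on-page factors contain the keyword."""
    kw = keyword.lower().replace(" ", "")
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)

    def contains(text):
        return text is not None and keyword.lower() in text.lower()

    return {
        "keyword in domain": kw in domain.lower().replace("-", ""),
        "keyword in title": contains(title.group(1) if title else None),
        "keyword in h1": contains(h1.group(1) if h1 else None),
    }

# Hypothetical page mirroring the situation described: partial keyword
# in the domain, none of the full keyword in title or h1.
sample = ("<html><head><title>Madison - home</title></head>"
          "<body><h1>Welcome</h1></body></html>")
print(onpage_check("madison square gardens", "madison.com", sample))
```

    Running this against the competitor's real URL (after fetching its HTML) would reproduce the "nope" checklist above; a proper HTML parser would be more robust than regexes, but this is enough for a quick competitive audit.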
     
  2. accelerator_dd

    accelerator_dd Jr. VIP Jr. VIP Premium Member

    Joined:
    May 14, 2010
    Messages:
    2,441
    Likes Received:
    1,005
    Occupation:
    SEO
    Location:
    IM Wonderland
    Check which keywords it is optimized for. Having a brand as a domain, etc., is a good technique. Also check the PR of the backlinks it's getting, and the anchor text of those backlinks.

    Also, if a link wheel is involved, check what backlinks IT has as well. Get the bigger picture before proceeding.
     
  3. tygrus

    tygrus Supreme Member

    Joined:
    Mar 28, 2009
    Messages:
    1,237
    Likes Received:
    827
    Occupation:
    Engineer
    Location:
    Canada
    Maybe I can explain the competitor site a bit. If you were searching for this, you would enter 3 search terms for it. It basically has a 3-word title that is unmistakable; that's where the 246,000 searches come from. However, the creators of the site chose to shorten things and just register the first searched word as the domain, and did no other on-page SEO.

    Here is an example. If you were searching for, say, Madison Square Gardens, that's what you would enter, but the owner chose to register madison.com only. So it basically does not have the entire most-searched keyword in the domain, only part of it.
     
  4. deathclick

    deathclick Junior Member

    Joined:
    Aug 9, 2010
    Messages:
    131
    Likes Received:
    55
    Location:
    San Jose, California
    Compared to some entrenched authority-type sites, this one looks doable. It still isn't going to be easy, especially if the site's backlinks are high quality and old. Your idea of better on-page helps for sure, but it all depends on the competition, and it would surprise me if tight on-page could trump 18K backlinks. I always like to assess the feasibility of getting a large portion of the backlinks held by the targeted website. If that looks easy, or the links are social bookmarking crap, then outranking may be well within reach. If the site's backlinks are good ones, it is hard for on-page to overcome that by itself.
     
  5. tygrus

    tygrus Supreme Member

    Joined:
    Mar 28, 2009
    Messages:
    1,237
    Likes Received:
    827
    Occupation:
    Engineer
    Location:
    Canada
    What's a good tool for mass-analyzing the site's backlinks?
     
  6. deathclick

    deathclick Junior Member

    Joined:
    Aug 9, 2010
    Messages:
    131
    Likes Received:
    55
    Location:
    San Jose, California
    If you are looking for free tools, my favorite is the SeoQuake plug-in for Firefox. It will tell you a lot about the links held by page one of Google and give you a sense of what you would be up against to duplicate or overcome them. The high-PR links are the interesting ones. The thousands of PR0-PR1 blog links it probably has do not need to be duplicated exactly; they can be overcome with Scrapebox, SEnuke or Xrumer, in increasing order of power.

    Be careful with these tools--you could sandbox yourself for a long time. The only way to stay out of the sandbox is to get traffic large enough to justify all the inbound links. It is best to ramp up linking alongside some compelling content, so clicks on the site ramp up in step with the link creation. I have avoided the sandbox most of the time by giving Google the traffic to justify my links. Aged, suitable domain names with a record of traffic can also help with this.

    If you want to pay for a service, there is Majestic SEO's backlink analysis program. They crawl the web themselves, find everything, and report it all--in contrast to Google and even Yahoo, which shows a lot but doesn't crawl/report everything and is becoming more Bing-like every day. The trouble with Majestic is that it has difficulty distinguishing links to domains and subdomains hosted on shared IPs, over-reporting links that really point at other domains. So even this tool may have issues with your particular target, which can make it hard to justify paying for it. It is not always a problem, but the issue continues to dog them.
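
    If you do end up with a raw backlink export (Majestic and similar services can dump one), a short script handles the mass analysis. A minimal sketch, assuming a hypothetical CSV with url, anchor_text, and pr columns -- the column names and sample data are mine:

```python
import csv
from collections import Counter
from io import StringIO

def summarize_backlinks(csv_text):
    """Tally anchor texts and the share of low-PR links in a backlink CSV."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    anchors = Counter(r["anchor_text"].lower() for r in rows)
    low_pr = sum(1 for r in rows if int(r["pr"]) <= 1)
    return {
        "total": len(rows),
        "top_anchors": anchors.most_common(3),
        "low_pr_share": low_pr / max(len(rows), 1),
    }

# Illustrative export: two cheap blog links, one strong news link.
sample_csv = """url,anchor_text,pr
http://blog1.example/post,madison tickets,0
http://blog2.example/post,madison tickets,1
http://news.example/story,madison,5
"""
print(summarize_backlinks(sample_csv))
```

    A high low_pr_share with repetitive anchors suggests the profile is mostly social-bookmarking-grade links that are cheap to overcome; a spread of high-PR links with varied anchors is the hard case described above.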