
Components of Google's Ranking Algorithm in 2010 - Linking Still King

Discussion in 'White Hat SEO' started by madoctopus, Nov 15, 2010.

  1. madoctopus

    madoctopus Supreme Member

    Joined:
    Apr 4, 2010
    Messages:
    1,249
    Likes Received:
    3,498
    Occupation:
    Full time IM
    I was reading something you may find useful:
    Components of Google's Ranking Algorithm in 2010 - Linking Still King

    Would be cool if you posted your own thoughts here regarding the factors and how much weight you think each carries in Google's algorithm.

    Personally I would break it down like this (these are 100% my own opinions and are not necessarily backed up by others or by hard data/tests, other than my own subjective observations):

    Trust/authority of host domain: 35%
    Among the factors that determine domain authority I would mention: number and quality of links, topic coverage of the link sources, IP & class C distribution (number of different class C IPs divided by the total number of IPs), and the number of IPs divided by the number of domains.
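    To illustrate those two ratios, here's a quick Python sketch (my own made-up inputs and metric names, nothing confirmed by Google):

    Code:
    # Rough sketch: diversity metrics for a backlink profile.
    def class_c(ip):
        """Return the /24 (class C) prefix of an IPv4 address, e.g. '1.2.3.4' -> '1.2.3'."""
        return ".".join(ip.split(".")[:3])

    def diversity_metrics(linking_ips, linking_domains):
        unique_ips = set(linking_ips)
        unique_class_c = {class_c(ip) for ip in unique_ips}
        return {
            # number of different class C subnets divided by the total number of IPs
            "class_c_distribution": len(unique_class_c) / len(unique_ips),
            # number of IPs divided by the number of domains (1.0 = every domain on its own IP)
            "ips_per_domain": len(unique_ips) / len(set(linking_domains)),
        }

    print(diversity_metrics(
        ["10.0.1.5", "10.0.1.9", "10.0.2.7", "172.16.0.3"],
        ["site-a.com", "site-b.com", "site-c.com", "site-d.com"],
    ))
    # {'class_c_distribution': 0.75, 'ips_per_domain': 1.0}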

    Link popularity of the specific page: 20%
    Here I would say we have to consider both quantity and especially quality.

    Anchor text of external links to the page: 15%
    Varies, depending on the quality of the links and their convergence (the percentage of all links whose anchor text uses the same/similar/subset/superset words - higher convergence means Google can be more confident that the page is about that particular keyword).
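    To make "convergence" concrete, here's how I'd compute it as a simple ratio (a sketch with made-up matching rules, not Google's actual formula):

    Code:
    # Sketch: convergence = share of backlinks whose anchor text matches the keyword
    # (same / subset / superset of its words).
    def converges(anchor, keyword):
        a, k = set(anchor.lower().split()), set(keyword.lower().split())
        return a <= k or k <= a

    def convergence(anchors, keyword):
        return sum(1 for a in anchors if converges(a, keyword)) / len(anchors)

    anchors = ["weight loss", "fast weight loss tips", "click here", "weight loss"]
    print(convergence(anchors, "weight loss"))  # 0.75 - 3 of 4 anchors point at the keyword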

    Anchor text of internal links to the page: 5%
    Could increase if domain trust and topic coverage are high and/or if not enough external links have useful anchor text to determine what the page is about, or if the external links are low quality (not to be trusted much).

    Topic coverage: 20%
    By topic coverage I am referring to how much content/information about the keyword (the topic of interest) is available on a site. Basically, a site with 10 pages about "weight loss" and 10,000 pages about various other (mostly unrelated) topics will have a much lower topic coverage rank than a site with 1,000 pages about "weight loss" and 10 pages about various other topics. This indicates how much of a "specialist" the site is on a topic, much like comparing a butcher with a hobby interest in quantum mechanics to a physics researcher working on quantum mechanics at CERN.
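    As a toy version of that ratio (my own sketch, using the numbers from the example above):

    Code:
    # Sketch: topic coverage as the share of a site's pages that are about the topic.
    def topic_coverage(page_topics, topic):
        return sum(1 for t in page_topics if t == topic) / len(page_topics)

    specialist = ["weight loss"] * 1000 + ["other"] * 10     # the "CERN researcher" site
    hobbyist   = ["weight loss"] * 10 + ["other"] * 10000    # the "butcher with a hobby" site

    print(topic_coverage(specialist, "weight loss"))  # ~0.99
    print(topic_coverage(hobbyist, "weight loss"))    # ~0.001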

    On-page keyword usage: 5%, function of domain trust, topic coverage, anchor text similarity
    I think this is a variable-weight factor. If domain trust (and several other factors) is high, Google will "trust" your site more, and hence trust that you used the keyword on the page in a way that is useful for the visitor, not just to try to rank better (spammy). Because of this, in practice the influence of this factor becomes a tiebreaker for results of similar rank (based on the other factors, primarily links, topic coverage, and anchor text).
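    Something like this (a toy sketch, all numbers invented - the only point is that the weight is not fixed):

    Code:
    # Sketch: on-page keyword weight that scales with how much the site is trusted.
    def on_page_weight(domain_trust, topic_coverage, anchor_convergence):
        base = 0.05  # the 5% figure above
        trust_signal = (domain_trust + topic_coverage + anchor_convergence) / 3
        # a trusted, focused site gets close to the full 5%; an untrusted one gets much less
        return base * trust_signal

    print(on_page_weight(0.9, 0.8, 0.7))  # ~0.04  - near the full weight
    print(on_page_weight(0.2, 0.1, 0.1))  # ~0.007 - barely counts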

    Registrar & host data: critical
    This is critical because it functions in a pass/fail way. It is used as part of the anti-spam algorithm to determine whether a site is a spam result. If it passes the test, the other factors are used to rank the site. If it fails the test, it receives a ranking "penalty" (it ranks significantly lower, or not at all, for that keyword or for all keywords where there are results that are clearly not spam). Obviously, this factor is not the only one used in a pass/fail way. Link graph analysis, linking patterns, content patterns, etc. are also used as anti-spam factors.
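    A quick sketch of the pass/fail idea (the checks and the penalty are completely made up, it's just the gating structure I'm describing):

    Code:
    # Sketch: anti-spam checks act as a gate before the weighted factors even matter.
    def final_score(site, quality_score, spam_checks):
        if all(check(site) for check in spam_checks):
            return quality_score        # passed: rank on the normal factors
        return quality_score * 0.01     # failed: rank far lower (or not at all)

    checks = [
        lambda s: not s["on_known_spam_network"],
        lambda s: s["unique_content_ratio"] > 0.3,
    ]
    site = {"on_known_spam_network": False, "unique_content_ratio": 0.8}
    print(final_score(site, 0.72, checks))  # 0.72 - passes the gate, ranks normally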

    Traffic + CTR data: low, tiebreaker
    I see traffic stats as difficult to use reliably as a ranking factor. A domain with a lot of traffic (based on Google Analytics) doesn't necessarily have a better coverage for a given topic than a domain with lower traffic.

    SERP CTR data, on the other hand, I think can be used as an indicator of the quality of the SERP excerpt (how appealing it seems to the searcher), hence it could have a low impact in the ranking algorithm. However, I think this is used more as a tiebreaker - if 2 results are similar on the other factors, Google will use this data to pick one of them as the better answer for the searcher.
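    As a toy example of the tiebreaker idea (the tolerance and the numbers are invented):

    Code:
    # Sketch: SERP CTR decides only when two results score about the same on everything else.
    def order(a, b, tolerance=0.02):
        if abs(a["score"] - b["score"]) > tolerance:
            return sorted([a, b], key=lambda r: r["score"], reverse=True)
        return sorted([a, b], key=lambda r: r["serp_ctr"], reverse=True)

    first, _ = order({"url": "a", "score": 0.80, "serp_ctr": 0.05},
                     {"url": "b", "score": 0.81, "serp_ctr": 0.12})
    print(first["url"])  # 'b' - near-identical scores, so the higher CTR wins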

    Social graph metrics: I have no idea
    This is not my thing. Never did social optimization so I don't have an opinion about it, other than the fact it has much lower importance than the other factors above. However, it makes sense to a certain extent for Google to take into consideration, even if just a bit, mentions of your domain name, a URL from your site that is not a clickable hyperlink, mentions of your brand name, etc.

    ---
    An important note I want to make is that I truly believe the algorithm is not linear. That means the factors are not taken separately; each is a function of other factors. In layman's terms it goes something like: "if factor F1 meets criteria F1c1, apply function f1(F1); else if it meets criteria F1c2, apply f2(F1); else if f3(F1,F2,F3) = X, apply f4(F1)" - OK, maybe not that clear, but the sketch below should show what I mean.
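    Here it is as a Python sketch (functions and thresholds invented, purely to show the shape of it):

    Code:
    # Sketch of the non-linear idea: the contribution of one factor (F1) depends on
    # conditions over itself and other factors, instead of a fixed weight.
    def f1(x): return x * 1.2          # boost
    def f2(x): return x * 0.5          # dampen
    def f3(x, y, z): return x + y + z  # combined check over several factors
    def f4(x): return x * 0.1          # heavy discount

    def score_f1(F1, F2, F3):
        if F1 > 0.8:                   # "F1 meets criteria F1c1"
            return f1(F1)
        elif F1 > 0.4:                 # "F1 meets criteria F1c2"
            return f2(F1)
        elif f3(F1, F2, F3) < 0.5:
            return f4(F1)
        return F1

    print(score_f1(0.9, 0.3, 0.2))  # 1.08 - a strong factor gets boosted
    print(score_f1(0.1, 0.1, 0.1))  # 0.01 - a weak factor in a weak context gets discounted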

    Some of the factors that are usually taken together to compute an intermediary result are those of the same type (e.g. all factors that have to do with links - number, anchor text, IP distribution), but it also makes sense for "unrelated" factors to be coupled (e.g. giving more weight to the final result in the SERP if there is a very clear concordance between multiple factors: link anchor texts, on-page keywords, title tag keywords).

    Feel free to give reputation or say thanks if this helps, but most importantly let us know your thoughts. I think understanding the factors as much as possible, and how they interact with each other, is fundamental to being a good SEO rather than just an average one.
     
    • Thanks x 8
  2. peter73

    peter73 Regular Member

    Joined:
    Jun 27, 2010
    Messages:
    341
    Likes Received:
    90
    Location:
    X marks the spot----------------------------X
    Got a nice read from the link. Cheers
     
  3. deviatus

    deviatus Power Member

    Joined:
    May 25, 2007
    Messages:
    517
    Likes Received:
    387
    Registrar & host data: critical
    This is critical because it functions in a pass/fail way. It is used as part of the anti-spam algorithm to determine whether a site is a spam result. If it passes the test, the other factors are used to rank the site. If it fails the test, it receives a ranking "penalty" (it ranks significantly lower, or not at all, for that keyword or for all keywords where there are results that are clearly not spam). Obviously, this factor is not the only one used in a pass/fail way. Link graph analysis, linking patterns, content patterns, etc. are also used as anti-spam factors.


    What is this supposed to mean? Are there blacklisted hosts and registrars out there? Do certain IP ranges get you a penalty?

    G usually claims where it's hosted and registered doesn't matter.
     
  4. madoctopus

    madoctopus Supreme Member

    Joined:
    Apr 4, 2010
    Messages:
    1,249
    Likes Received:
    3,498
    Occupation:
    Full time IM
    No, that is not what I meant. What I meant was that host and registrar data is used to find link networks. Basically, if you have, let's say, 100 sites on the same IP, all interlinked and all autoblogs or low quality, Google will see them as spammy sites and "penalize" the whole network.
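    Roughly, I imagine something like this (a sketch - the threshold and inputs are made up):

    Code:
    # Sketch: flag a possible link network when many interlinked sites share one IP.
    from collections import defaultdict

    def find_suspect_networks(site_ips, links, min_sites=10):
        """site_ips: {domain: ip}; links: set of (from_domain, to_domain) pairs."""
        by_ip = defaultdict(list)
        for domain, ip in site_ips.items():
            by_ip[ip].append(domain)

        suspects = []
        for ip, domains in by_ip.items():
            if len(domains) < min_sites:
                continue
            pool = set(domains)
            interlinks = sum(1 for a, b in links if a in pool and b in pool)
            # lots of co-hosted sites that mostly link to each other looks like a network
            if interlinks >= len(domains):
                suspects.append((ip, domains))
        return suspects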
     
  5. deviatus

    deviatus Power Member

    Joined:
    May 25, 2007
    Messages:
    517
    Likes Received:
    387
    Alright.

    What about having the same or similar backlink, plugin, and theme footprints? What's your opinion on that? And reg info? I always use private registration, but I keep hearing that this info can be compromised. I don't think it's an issue now, but some G patents suggest they are looking into things like this.
     
    Last edited: Nov 15, 2010
  6. Pb.com

    Pb.com Registered Member

    Joined:
    Sep 10, 2010
    Messages:
    92
    Likes Received:
    11
    Occupation:
    SEO Consultant and Software Engineer
    Location:
    U.S.
    You're missing a critical part: correct coding, speed, and server location. Valid (meaning W3C-valid) coding and speed often make or break websites. I've seen sites jump from the third page to the first on a competitive keyword ("buy [car brand]") in less than a week after those changes (no other changes made).

    Server location: an obvious factor. If Google thinks your site is based out of Australia, there is a good chance they'll think it's geared towards Australians.
     
  7. deviatus

    deviatus Power Member

    Joined:
    May 25, 2007
    Messages:
    517
    Likes Received:
    387
    G's Matt Cutts says this isn't a factor.

    I know of sites hosted on the shadiest Russian servers that are top ranking for terms in the US.
     
  8. efwebs

    efwebs Regular Member

    Joined:
    Aug 9, 2010
    Messages:
    424
    Likes Received:
    137
    Home Page:
    Good information - but keep in mind that it is based on a survey of people not affiliated with Google, so it's still speculation.
     
  9. groggi42

    groggi42 Newbie

    Joined:
    Nov 13, 2010
    Messages:
    22
    Likes Received:
    4
    That's interesting. So why is there all that SEO software checking if you're in a bad neighborhood?
     
  10. richcamp

    richcamp Regular Member

    Joined:
    Oct 5, 2009
    Messages:
    315
    Likes Received:
    119
    Another one I would add is searcher history. Google has millions and millions of users (us) continuously using (and effectively testing) its search engine. Coupled with bounce rate from Analytics, it is not far-fetched for Google to downgrade a site that has good standing in the SERPs but is not attracting genuine interest from users.
     
  11. deviatus

    deviatus Power Member

    Joined:
    May 25, 2007
    Messages:
    517
    Likes Received:
    387
    Paranoia. For now. Unless this is about checking your backlinks' neighborhoods (pages with tons of spammy links on them), which might devalue them.

    The way G sees it, if they take action against a server or IP address, the savvy people with spammy websites will just migrate, and if it's shared hosting, the other sites on the server, which might be good, get punished.

    Cutts talks about it in a YouTube video.

    Code:
    watch?v=AsSwqo16C8s
     
    Last edited: Nov 15, 2010
  12. ThunderC

    ThunderC Power Member

    Joined:
    Jul 6, 2010
    Messages:
    697
    Likes Received:
    208
    Occupation:
    Adult Webmaster
    Some interesting info. The question is, is it true, or just something to stop us from assuming other things...
     
  13. tihiq

    tihiq Newbie

    Joined:
    Nov 2, 2008
    Messages:
    35
    Likes Received:
    16
    Basically we need quality content published on an old domain and linked to a lot. It's not rocket science :)
    I have a question about the "Registrar & host data: critical". What does pass/fail depend on?
     
  14. ipopbb

    ipopbb Power Member

    Joined:
    Feb 24, 2008
    Messages:
    626
    Likes Received:
    844
    Occupation:
    SEO & Innovative Programming
    Location:
    Seattle
    Home Page:
    These are the results of a survey of what SEO people believe influences SERPs. It's not an empirical analysis of what actually influences the Google SERPs. It's a measure of what most white hat SEO consultants are going to say.

    I like to see trends... some of what they mention shows good trend lines individually. Some is too subjective or poorly defined to even measure. The weights reflect acceptance of a belief across a very small sample.
     
  15. Jesperj

    Jesperj Power Member

    Joined:
    Sep 10, 2010
    Messages:
    502
    Likes Received:
    347
    Occupation:
    Web Designer
    Location:
    Far, Far away
    Home Page:
    Great, now I just gotta stop spamming useless PR0 blogs. But I'm still confused about domain PR vs. URL PR. If I post to a blog with a domain PR of 5, will the link count as that, or will it rely solely on the PR of the page I'm posting on?

    Either way I'll start filtering out all the junk blogs :) and see if it makes a difference.
     
  16. efwebs

    efwebs Regular Member

    Joined:
    Aug 9, 2010
    Messages:
    424
    Likes Received:
    137
    Home Page:
    You get credit for the page's PR, not the domain or homepage PR. But an individual page's PR will tend to be higher on a site with high PR than on a site with low PR.

    It's also weighted differently for comment vs. content vs. sidebar links.
     
  17. thermapower

    thermapower Junior Member

    Joined:
    Dec 26, 2009
    Messages:
    183
    Likes Received:
    32
    Home Page:
    Good information :)