I put this here apropos of nothing. None of the information below means that certain sites, methods and activities can't "buck the trend"; I know for a fact that many do. If being told that content is becoming more important offends you, then don't read it. I already know that some have systems that work with what might be termed "poor" content. Great - seriously - great, well done. But the measurable fact that for many it does not is writ large on this and other SEO forums every day. If you have a system that works for you, be it a content, website, social, linking, WHATEVER method, then stick to it - please don't troll this thread telling me what does or doesn't work in empirical terms.

These are GENERAL and NON-SPECIFIC metrics, printed here to help people who may be undecided. They are not for everyone, and they will not fit into everyone's IM or online money-making plans - I ALREADY KNOW THAT. But if you are starting from scratch and just want an idea of which way the (imperfect) wind is blowing, then this might be of some help. If not, feel free to ignore it. I will ignore trolls. The conclusions at the bottom are general, and very few of them are made. Draw your own conclusions, or ignore them if you like; it's no skin off my nose.

First published on demondemon dot com (I've removed all links - if you spot any, let me know and I'll take them out).

-----------------------------------------------------------------------------------

There's an old saying: "Don't shoot the messenger". It came straight to mind when I looked at the latest data update from SEO metrics. Since 2010 a loose team (myself included, on the periphery) has been collecting URLs and categorising them according to various criteria. Just about everything that could be categorised in both online and off-line SEO has been looked at.

Google Ranking Factors:
SEO Correlation To Ranking Success 2012/13

Each of these URLs (and there are currently well over 15 million of them, across several hundred thousand sites) is then checked against several of the keywords that it could potentially rank for, and the success (or failure) to rank is mapped against the specific features that the URL possesses. For this to be of any real value the sample size has to be huge, and it is. The downside is that, being a big and time-consuming process, the updates only come roughly once per quarter. It needs to be done regularly to keep up with seasonal trends.

What this allows is a good guess as to which criteria, methods and systems are currently catching the search engines' eyes and having a positive or negative effect on your site's ranking.

In SEO, Everything Is Relative

What must be understood first is that these are relative factors. There is nothing absolute about this at all. We don't know where the start point is, or indeed where the end point is. The graph below shows some of the key performance indicators that can be extrapolated. This is a small sample of the hundreds that are actually calculated. Most of those shown are meta groups, calculated by extrapolating the data from many other smaller criteria. I'll point out in the descriptions below where this is the case.

Anything to the left of the 0% line means that the correlation between that element and an increased or sustained search engine ranking is in decline. Anything to the right means that URLs with those elements have shown a propensity to increase in search engine ranking. The size of the data set suggests that this is a pretty good analysis. But as I mentioned in the opening paragraph, I know some will vehemently disagree, particularly if any firm conclusions are drawn. So at the end I'll just add some very basic ideas that stand out and let you draw your own conclusions from the data.
Benefit Of EMD For A Search Term

First of all, to clarify, this particular element is quite specific to commercial terms: product names, buying words or generic service descriptions. It won't come as a surprise to see that their influence on positive ranking has declined slightly. What may be a surprise is that the decline isn't as great as some people seem to have suggested over the last few weeks. Bear in mind that this data comes well after the so-called EMD slap. It is worth mentioning that the suggestion is that the influence of an exact match domain is still in decline, in which case the next quarter's update will also show a decrease in the correlation.

Title Character Length > 48 Characters

A strange one. This is specifically the last canonical title at whatever depth the URL is filed behind the root domain. It does not take the entire domain name into account; it's just the actual specific page title. As such, I'm surprised to find many that have a character length over 48, and not particularly surprised to find that this is not favourable. The fact that in the last three months it has taken a little bit of a hit might raise the odd eyebrow, but overall it's not a statistic with any particular significance, nor one that is likely to affect many people.

Adsense Blocks In The First 720 Pixels

Here's one that came right out of left field. I had no idea this was even being measured. An adsense block that begins in the first 720 vertical pixels of any given URL seems to have a dramatic effect on that page's ability to rank for its chosen keywords. It has an unfavourable impact far greater than the exact match domain rebalancing, by a factor of two or three.
I can only speculate that, given that many exact match domains are made-for-adsense sites and therefore have the commercial block prominent in the top half of the page, the reduction in ranking that some sites have reported and attributed to the EMD recalc may actually have been misdiagnosed, and in fact it is the placement of the advertising block that has caused the issue. As somebody who doesn't use this form of advertising any more it's not my area of expertise, but given that it's the biggest relative change in any performance metric over the last three months, it's very surprising that it hasn't been picked up. Given that adsense is Google's own network, it seems they are looking to keep their value (no surprise) and stay as white hat in terms of SEO as possible, at least in this regard.

Is This An Over-Commercialization Penalty? Or Maybe A Specific MFA Site Slap?

I'm going to say no more about it and perhaps leave this for others who are far better equipped and understand Google advertising to look at.

Keyword In Title And H1 Tag

Two perennial favourites. Some of the deeper data shows that the hidden meta data that used to be so important in establishing a site's credentials in a particular niche is now virtually worthless. Interestingly, on-page elements such as titles and header tags have picked up some of the slack here in terms of establishing content and context. Not a big shift, but one which sensibly puts the ability to establish these important metrics into the hands of the content creator rather than the technical website designer or HTML coder. This points to content, context and the viability of any particular URL to compete within any particular niche now being a derived statistic rather than one that Google simply fetches from a site's hidden meta data.

% External/Internal Backlinks With Stop Words

I bunch these two together here as well.
The use of stop words as part of your anchor text on both incoming and inter-site links has had a positive effect recently. From where I'm sat, it is almost certainly a consequence of people diluting their keyword-orientated anchor text to something which satisfies Penguin's need to see a more diverse set of anchor texts.

Word Count > 130, < 3000 (Variable Sweet Spot)

Keeping your word count above 130 and below 3000 bestows a small relative benefit. It might encourage those with very little content to pad out to get to the minimum, although padding to reach 130 shouldn't be that much of an effort. It also tells those who have produced huge amounts of content to perhaps split it up to avoid going over 3000 words. There is a variable sweet spot that seems to relate these numbers to the commerciality of the niche the website is in. Obvious product review sites can indeed get away with 130 words, although you often see many with far fewer words than that per page/product, whereas sites with reports of a technical nature or discussion pieces and a slightly less commercial tone should aim higher in the spectrum. Boy, am I glad that's the case.

Word Count (Backlink Anchors < 0.4% Of Total Word Count)

Another way of saying this is to have no more than one outbound link (even if it's to your own site) per 250 words. I would hazard a guess that this is to show that any particular article or page is designed to offer value to the visitor in its own right rather than just being a conduit to other URLs. So although this is a small shift, it's probably best not to "over-link" on any particular page.

Image Count > 0

Pretty simple really: have a picture. Again, it's not a huge benefit, but it is a measurable one, and they all add up. An omission here is the inclusion of videos. There are some KPIs being looked at for video, but they are separated into self-hosted, professionally hosted, and those held by commercial video sites such as YouTube.
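To make the two word-count thresholds above concrete, here's a small sketch of how you might check a page against them: the 130-3000 word window, and anchors kept below 0.4% of total words (which works out as roughly one link per 250 words). The function name and the crude tokeniser are my own illustration, not anything taken from the study's methodology.

```python
import re

def on_page_check(text, link_count):
    """Check a page against the rough on-page thresholds discussed above:
    word count between 130 and 3000, and links below 0.4% of total words
    (roughly one link per 250 words)."""
    words = re.findall(r"[A-Za-z0-9']+", text)
    word_count = len(words)
    issues = []
    if word_count < 130:
        issues.append("thin content: fewer than 130 words")
    elif word_count > 3000:
        issues.append("very long: consider splitting above 3000 words")
    # 0.4% of the word count is the same budget as one link per 250 words
    max_links = word_count * 0.004
    if link_count > max_links:
        issues.append("over-linked: more than 1 link per 250 words")
    return word_count, issues

# Example: a 500-word page has a budget of 2 links, so 3 gets flagged.
wc, problems = on_page_check("word " * 500, link_count=3)
```

The arithmetic is the whole point: 0.4% of 250 words is exactly one link, so the two ways of stating the rule are equivalent.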
The overall data regarding any benefit of having a video on a page is a little bit confused by these various ways of hosting them. Hopefully in a later post I'll be able to sort out what it all means and perhaps give some insight. But for now the simple issue is that images seem good and video seems good. But we all knew that, didn't we?

% Backlinks With Keyword (> 1, < 33%)

Have at least one incoming backlink with the keyword you want to rank for. The thought that springs to my mind here is that one is probably nowhere near enough, but it does depend on your niche. Importantly though, don't have more than 33% of your backlinks with the same keyword anchor anywhere in them. Be aware of composite anchor text that includes many words: repeating the same word across several different anchor text variations means each one will count. It's a mistake I see made time and time again when I'm asked to appraise people's backlink profiles. They have variations of the same word occurring in every anchor. Then it's time to say hello to Mr Penguin.

% Backlinks rel=Nofollow (> 1, <= 50%)

Having nofollow links confers a measurable benefit, and the benefit they confer has increased over the last three months. This is where I feel like getting my tin hat on and hiding in a bunker. There are those on the Internet, many of whom I respect, who consider nofollow links to be the very spawn of Satan. They are not; in fact, having a decent percentage of them actually confers a benefit to your website. That benefit has increased recently and will likely stay at a helpful level. That's not to say you can't have 80%+ dofollow links - that would be fine, whereas 80% nofollow would not be. A 50-50 split, or a split slightly in favour of dofollow, would seem to be absolutely best SEO practice. I didn't design these tests; it's just the way it is, and it follows on from similar studies done recently.
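As a rough illustration of the two backlink-profile thresholds just described - keyword anchors capped at 33% of the profile, and a nofollow share somewhere between a token amount and half - here is a minimal sketch. The data shape (a list of anchor-text/nofollow pairs) and the function name are hypothetical conveniences; a real audit would start from actual crawl data.

```python
def profile_flags(anchors, keyword):
    """anchors: list of (anchor_text, is_nofollow) tuples.
    Flags the two thresholds discussed above: more than 33% of anchors
    containing the target keyword, and a nofollow share of zero or above 50%."""
    total = len(anchors)
    kw = sum(1 for text, _ in anchors if keyword.lower() in text.lower())
    nf = sum(1 for _, nofollow in anchors if nofollow)
    flags = []
    if kw / total > 0.33:
        flags.append("keyword anchor share above 33%")
    if nf == 0:
        flags.append("no nofollow links at all")
    elif nf / total > 0.5:
        flags.append("nofollow share above 50%")
    return flags

# Example: 6 of 10 anchors repeat the keyword and none are nofollow,
# so both thresholds get flagged.
links = [("best blue widgets", False)] * 6 + [("click here", False)] * 4
flags = profile_flags(links, "blue widgets")
```

Note the substring match deliberately catches composite anchors too - "cheap blue widgets online" still counts against the 33% budget, which matches the point above about variations of the same word.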
Nofollow links help - not in passing PageRank, but they certainly do as regards search engine ranking, and at the end of the day that's all that really matters. I'll say no more on the matter.

Word Context (> 95% Language Appropriate, > 12% Keyword Context)

I would have preferred it if this indicator had been split in two, to see how the parts measure up against each other. The 95% language appropriateness means that if you're writing in a given language, then Google or Bing will expect 95% of the words in any page or article to be recognised in its dictionary of that language. This means that using foreign words, new product or service names that Google has not indexed yet, and of course spelling mistakes may count against you if there are too many of them. This all helps with establishing good latent semantic indexing for your URLs as well.

Percentage keyword context is an interesting one, and something I would like to see extrapolated into its own metric. It seems search engines understand the blocks of words and types of phrases that should occur in any particular niche. They would expect to see a broad range of these within any article to establish that it is indeed an article specific to that niche. In other words, your other on-page content needs to establish context. In practice this means that about one word per sentence, on average, should have something to do with the topic. So, for example, if you're talking about mobile phones, Google may expect to see things like "recharging", "3G", "cell phone", "Orange", "Ice Cream Sandwich", "iOS", "Apple" etc. included within the written work. If these references (and this does not include stop words) fall below 12%, then you are failing to establish context, and the SE might come to the conclusion that the written work is not specific enough for the targeted keyword to rank highly. Again, this is nothing particularly new.
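A crude way to picture the 12% keyword-context threshold is to count what share of a page's non-stop-words come from a pool of niche terms. The stop-word list, the topic-term set and the function below are my own stand-ins for whatever the search engines actually model; this is only an illustration of the ratio being described.

```python
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
              "it", "for", "on", "has", "was", "with", "at", "by"}

def context_ratio(text, topic_terms):
    """Share of non-stop-words that belong to the niche vocabulary, as a
    crude stand-in for the >12% keyword-context threshold discussed above."""
    words = [w.strip(".,!?\"'()").lower() for w in text.split()]
    content = [w for w in words if w and w not in STOP_WORDS]
    topical = sum(1 for w in content if w in topic_terms)
    return topical / len(content) if content else 0.0

# Hypothetical niche vocabulary for a mobile-phone page
phone_terms = {"recharging", "3g", "ios", "apple", "handset",
               "battery", "screen", "cell"}
sample = "The handset had a great screen and the battery lasts for days on 3G."
ratio = context_ratio(sample, phone_terms)
```

With stop words excluded, four of the eight remaining words in the sample are topical, which comfortably clears the 12% bar - exactly the "about one word per sentence" rule of thumb from the paragraph above, applied at a smaller scale.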
Twitter/Facebook/Bookmark (URL In Tweet, Retweet Or "Like")

This is a slightly wider category and includes all the major bookmarking, micro-blogging and social media sites in one. It is the third biggest increase over the last three months, and it was already pretty large before that. Social noise is important and has become increasingly so as time goes on.

Authenticity (% Unique 3-5 Word Phrases > 60%)

I wish I understood this better. Maybe somebody who understands language and the way that Google works might be able to tell me exactly what this means. It is something to do with the uniqueness of the work and the way that Google parses text in 3 to 5 word blocks or phrases. Once again there are caveats regarding the inclusion or exclusion of stop words and the way in which these blocks are put together. My shorthand version would be: "unique content is better". But I know there are some real experts out there who may well be able to give me and my readers a much better overview of what this actually means. Full disclosure here: I'm not really sure myself. I'll add that this does not preclude spinning and re-purposing of content. Spinning well and keeping your work readable and contextual seems fine. However, it must appear unique when published, and meet the language and context rules here as well, to add real value to your SEO campaign.

Number Of Inbound Backlinks (Criteria Apply*)

Another surprise: the value of links has increased. Given that many links are now being ignored by the search engines, the relative value of those that remain has gone up. The idea of making hundreds of thousands of them is being devalued, whereas the idea of making a few hundred good ones has never been better. The criteria alluded to in parentheses involve content quality, keyword density, context and anchor text primarily - in other words, many of the other metrics measured above.
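The 3-5 word phrase idea sounds a lot like shingling, a standard near-duplicate-detection technique, so here is a guess at what such a metric might look like: the fraction of a page's 3-word shingles that don't already appear in a reference corpus. The corpus, the fixed n=3 and the stop-word-free tokenisation are all my own assumptions, not the study's actual method.

```python
def shingles(text, n=3):
    """All overlapping n-word phrases in the text, lowercased."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def uniqueness(page_text, seen_shingles, n=3):
    """Fraction of the page's n-word shingles not already in seen_shingles.
    Below roughly 60% unique, a page starts to look like duplicated content
    under the metric described above."""
    page = shingles(page_text, n)
    if not page:
        return 1.0
    fresh = sum(1 for s in page if s not in seen_shingles)
    return fresh / len(page)

# Reuse two 3-word phrases from a "known" sentence; the rest is new text.
corpus = set(shingles("the quick brown fox jumps over the lazy dog"))
score = uniqueness("the quick brown fox went home early today", corpus)
```

This also shows why light rewording defeats naive duplicate checks: changing one word breaks every shingle that spans it, which fits the observation above that well-spun content can still read as unique.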
So What Does This Mean For SEO Going Into 2013?

I deliberately don't want to draw any specific conclusions. But in general a few obvious points seem to jump out (and I'll stand corrected if anyone has a better interpretation):

Content continues to push hard and is increasingly important (though still secondary to links).

Spammy links and over-linking have a decreased value, a trend that has now been consistently downwards for a couple of years.

Quality links - especially links surrounded by good quality, contextual content - are more important than ever. Add social noise to the mix and you're probably onto a winner.

The main danger signal here seems to be that over-commercialisation of sites has taken a real hit, and this has snuck in under the radar. The adsense block reduction came as a complete surprise to me, and coupled with the decreased benefit of exact match domains, it seems to point quite definitely at sites that offer little value but high commercialisation. Whether you think I'm attacking "made for adsense" sites or not is neither here nor there. I'm not - MFA is not something I do any more - and hopefully we might get some comment from those still working in that area.

As for the rest of it? I'll let you draw your own conclusions.