Background: A few weeks ago I had 11 different requests (9 from PMs on this forum) to appraise the SEO efforts of various websites. I won't name names or URLs, but after spending several hours going through them all, this is what I found. Although 11 out of 11 "epic fails" is not normal (it's normally about 4 out of 5 sites that are fucking terrible), it was the point where I decided to put finger to keyboard and write about it.

This is specifically aimed at people who think their website is FANTASTIC and any problem they have is with their backlinks. (Yes, sometimes a problem can be with your backlinks, but are you really sure your website is not at least a major contributing factor to your poor performance?) There is a chance I'll get flamed for some of this. Apologies that it's all a bit generic; I really cannot name and shame, that would be very unfair.

-----------------------------------------------------------------------------

Originally written by me and published a week or so ago on demondemon dot com – all links removed, I hope.

The weekend just gone has been one of the more frustrating ones as far as "getting work done" is concerned. Last week was an unusually busy week for people asking for advice regarding SEO or looking for ways forward to recover their website after a perceived slap. The same phrases came up again and again: EMD or "exact match domain" levelling, Penguin, "over optimisation", negative SEO.

Between teatime on Friday and late on Sunday evening I looked at 11 websites in detail. I did my usual job of providing some pretty in-depth reports on all of them as well as their linking backgrounds. Six of these sites were for clients, and I was paid to do the job – and happy to do it. Five of them were for the forum or blog regulars here or at Demon SEO. Again, something I'm quite happy to do, or at least was.
I'm now going to have to add some kind of filtering system, because out of the 11 websites I looked at over the last 72 hours, 100% of them had the same basic SEO mistakes on them that any webmaster worth their salt should have spotted without needing to contact me or anybody else. All 11 websites were very thin on content.

I'm skipping through the e-mails and forum private messages that were sent to me as I write this. Several of them are absolutely explicit in telling me that the website they want me to look at is an "A1", "five-star", "professionally SEO optimised" or a "visitor magnet" no less. I'm reassured in all of these e-mails and messages about the quality of the content, the variation, the previous linking history, the navigation, content tagging… you name it, I'm told it's covered.

Linking Easy – Site Content Hard – The Lazy SEO Blame Game

"Take it for granted, onsite is all good – there is a problem, and the problem is with my links, please take a look at them."

You know what? It was all rubbish. In all 11 cases it was the SITE that was at fault first and foremost, and not the linking profiles. This follows a trend I've been seeing going back months where people assume links to be an issue when in fact it is their site at fault. But 11 in a row over one weekend was the straw that broke the camel's back as far as I'm concerned, and prompted this post.

Yes, if you're one of those 11 people, you received my report this morning and I have called your on-site content SEO "under-developed", "in need of close examination" or maybe even "lacking substance". The truth is, in all eleven cases the site content was nothing short of RUBBISH. Poor content, non-existent tagging, over-commercialised, keyword stuffed, link spammed RUBBISH. Seven of the 11 were keyword stuffed. This is basic, folks! In a couple of cases there were mitigating circumstances.
Compound keywords had not been counted correctly, or perhaps the author had not realised the effect of compound keywords, and that tipped the balance over the 2.5% maximum per URL that good SEO should aim for.

As an example: you're optimising for the word "widget" in a 1000-word article. 2.5% keyword density means it should be mentioned no more than 25 times. Even this is on the high side. I found better results with the keyword density between 1.5% and 1.75%, but for the sake of leniency here I've been marking on the upper tolerance of 2.5%. So scanning through this 1000-word article I find the word widget between 25 and 30 times. But I also find the phrases or words "widget-free", "uberwidget", "neowidget" or whatever a dozen times as well. Look folks, each of those contains the word widget. Your 2.5% just became 4%. This content is keyword stuffed, and the chances are that no amount of links is going to help it. In fact, having aroused the search engines' suspicion that you are self-promoting by keyword stuffing on page, they are likely to be even more vigilant when it comes to checking your backlink profile.

If you need to repeat your keyword or phrase multiple times, then write more content. If you can't be bothered to write more content, then get it written for you. If you can't be bothered to write more content or pay to get it written for you… then quit SEO. If you can't even be bothered to get that in order, then don't go crying about links not working.

Would you buy a stunning six-bedroom house by the coast, then cover it in chipboard, flaking paint and dodgy wiring, and then expect "Stunning Homes" magazine to be interested in featuring it? The metaphor here is building your dwelling on shifting sands. Piling links onto a poorly developed, content-starved, keyword-stuffed site is going to get you nowhere, and yet 11 times out of 11 this weekend this is exactly what I've seen.

Copied Content.

I'm always amazed that people claim to have 100% unique content.
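Going back to the density point for a moment: the compound-keyword counting described above can be sketched in a few lines of Python. This is a rough illustration only; the `keyword_density` helper is a made-up name, and the 2.5%/4% figures are this post's rules of thumb, not anything a search engine publishes.

```python
import re

def keyword_density(text, keyword):
    """Percentage of words that contain the keyword, counting compound
    forms like 'widget-free' or 'uberwidget' as hits too."""
    words = re.findall(r"[a-z0-9-]+", text.lower())
    hits = sum(1 for w in words if keyword.lower() in w)
    return 100.0 * hits / len(words) if words else 0.0

# A 1000-word article: 25 plain mentions plus 15 compound mentions,
# mirroring the widget example above.
article = " ".join(
    ["widget"] * 25 + ["widget-free"] * 10 + ["uberwidget"] * 5 + ["filler"] * 960
)
print(round(keyword_density(article, "widget"), 1))  # 4.0 -- not 2.5
```

Counting only the bare word "widget" would report a safe-looking 2.5%; counting substring hits shows the real 4%.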
They tell me it's handwritten. I run it through Copyscape, and 9 of the 11 came up with a high degree of plagiarism. I take this one step further and check the dates that individual pieces of work were indexed. If yours was indexed first in Google then you're probably all right. More often than not this has not been the case.

With one or two people I spoke to in depth yesterday, I found out that they did not in fact write the content themselves – they bought it. It read well, looked to be "proper" English, and was on topic, so they used it. They did not perform any plagiarism check. On 5 of the 11 websites, entire blocks of hundreds of words had been lifted directly from other websites, some of which had had the article indexed many years before. It was not spun, and it was perfectly readable, but Google may not count it or give your site any value for links pointing to it. It had "0% uniqueness".

I felt a little sorry for one of the website owners. She had paid nearly $500 for 10,000 words – that's five dollars per hundred words, and a fair price to pay for at least half-decent site copy. Every single word had just been lifted from other pre-indexed sources. Something she could have done herself in ten minutes if that was her intention. This isn't just something that happens now and again. Of the five with blatantly copied work, some had paid considerable sums of money to authoring services, and in every case what they had bought had been scraped or lifted from other sites. Be wise, folks, check it first. Use plagiarisma.net or get yourself a Copyscape key.

On a side note, let's get one thing clear: Google does not use or care about Copyscape. An "80% Copyscape" pass does not mean a "Google pass" or any such thing. There is no relationship between a search engine's own content algorithm and a "Copyscape" percentage at all. The best use of plagiarism checks is just to see whether the work you've paid for is unique or not.
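For those who want a quick local sanity check before reaching for Copyscape or plagiarisma.net, the usual trick is comparing word "shingles" (overlapping n-word windows) between your copy and a suspected source. A minimal sketch, assuming you already have both texts to hand; the `shingles` and `overlap_pct` names are hypothetical, and a real plagiarism service does far more than this:

```python
def shingles(text, n=5):
    """Set of n-word shingles -- the standard unit for duplicate detection."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_pct(candidate, source, n=5):
    """Percentage of the candidate's shingles that also appear in source.
    High overlap suggests lifted blocks, not just shared vocabulary."""
    cand = shingles(candidate, n)
    if not cand:
        return 0.0
    return 100.0 * len(cand & shingles(source, n)) / len(cand)

bought = "the quick brown fox jumps over the lazy dog near the river bank"
indexed = "we saw that the quick brown fox jumps over the lazy dog yesterday"
print(round(overlap_pct(bought, indexed), 1))  # 55.6 -- a big lifted run
```

It will not catch spun or paraphrased text, but it catches exactly the case described above: whole blocks lifted verbatim.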
You still need to read it yourself to see if it is also legible. If it passes both of these tests, then it is probably safe to use on your site. If I buy content I very rarely get away with publishing it unedited. It nearly always needs some work.

Weak Content.

Three of the sites had so little content compared to the number of inbound links it was staggering. One was a seven-year-old domain selling music/beat software – high-ticket items that made over $100 commission per sale for the website owner. The site had 231 words on the main page, a "contact" page and an "about us" page. That was it – the entire site. The "contact" and "about us" pages were standard templates with only the site name changed from the content used on thousands of other domains. The webmaster had even put AdSense in the right-hand column to make a little bit of "extra cash". No video, one image (and that was tagged with the product name). Stacked up against this on the other side, SEO SpyGlass reported over 25,000 inbound links. Over 100 inbound links per word of content. If I need to tell you that this is not a realistic linking profile, then you should really go back and do your on-site and off-site SEO "101" again.

Although this was the most extreme case, I found 9 of the 11 sites had less than 2,500 words of unique content on them in total – and I'm counting the content the webmaster "thought" was unique, but in fact wasn't, in these figures. The average number of inbound "links per word" across all 11 sites was greater than one. I should stress we are not talking here about Google-verified links as found in Webmaster Tools, but those found by SEO SpyGlass or similar. Remember, Google sees your links whether it bothers to count them in Webmaster Tools or not. Don't be naïve and think that Google's algorithm does not find all of your links better than a $19-a-month checking tool – it does. It just chooses not to show you the ones it thinks are valueless.
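The links-per-word ratio above is trivial to compute once your backlink tool has given you a raw link count. A throwaway sketch using the beat-software site's numbers from this post (`links_per_word` is a made-up helper, not any tool's real metric):

```python
def links_per_word(inbound_links, unique_words):
    """Crude sanity ratio: reported backlinks per word of unique content.
    Anything near or above 1.0 should set alarm bells ringing."""
    return inbound_links / unique_words if unique_words else float("inf")

# 25,000+ reported links against 231 words of content
ratio = links_per_word(25_000, 231)
print(round(ratio))  # 108 -- over 100 links per word
```

There is no official threshold here; the point is simply that the ratio makes an unrealistic profile obvious at a glance.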
And just because it chooses not to show you those links, don't think that it doesn't consider them to be spam and potentially penalise your site for having them. In these days when negative SEO seems to work (at least in the short term), it almost certainly does.

Anchor Text.

The last issue, and the only one that is explicitly to do with off-site SEO, is the variation in anchor text. I leave this to last although it could be the most important of them all. You see, up until a few months ago nobody really varied their anchor text. OK, perhaps those very forward-thinking people did who could understand that a slap might come one day, but up until July 2012 it had not really been an issue. In fact, best practice and just about all the ebooks you could buy on the subject told you to home in on selected keywords.

Again, in every one of the 11 cases I looked at, anchor text variation was virtually zero. If there was variation, it broke the compound word rules – in other words, using "widget", "blue widget", "best blue widget" etc. The problem with that is every anchor text contains the word widget. It will be counted every time, whether it's alongside other words or not. At most 33% of your anchors should contain the keyword you're after; the other 67% should be a mix of generic "click" or "visit" terms and of course your naked URL.

I don't think I've ever written a post before with this frame of mind. Some of the people I did reports for are guys and girls who've been marketing on the internet for five years or more. Every webmaster who contacted me claimed to have a good site; they told me how they had been scratching their heads, unaware of what possibly could have gone wrong.

Stop blaming backlinks and "slaps". Sometimes the problem is your site.

A poor site. Poor, over-optimised, sparsely populated, badly constructed sites will not support gazillions of inbound links, whether you use anchor variation or not. A sensible scaling of your link building efforts should be the first port of call.
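The 33%/67% split described above can be checked mechanically once you export a flat list of anchor texts from whatever backlink tool you use. A minimal sketch (`anchor_profile` is a hypothetical helper, and the 33% ceiling is this post's rule of thumb):

```python
def anchor_profile(anchors, keyword):
    """Percentage of anchors containing the keyword. Compound phrases count
    against the keyword bucket too: 'best blue widget' still contains 'widget'."""
    total = len(anchors)
    if not total:
        return 0.0
    kw = sum(1 for a in anchors if keyword.lower() in a.lower())
    return 100.0 * kw / total

# A typical "varied" profile that still breaks the compound word rule
anchors = (
    ["widget"] * 40 + ["blue widget"] * 30 + ["best blue widget"] * 20
    + ["click here"] * 5 + ["example.com"] * 5
)
print(round(anchor_profile(anchors, "widget")))  # 90 -- far above the ~33% ceiling
```

The webmaster above thinks three different anchors equals variation; by substring counting it is 90% exact-match.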
Adding content to your site is one of the quickest and most effective ways to get yourself out of either a Penguin or a Panda slap. Fresh content, and as much of it as you can create for your site. Write it yourself to avoid being scammed, or go to a well-trusted and respected content marketing company and pay a reasonable price for it.

You can diversify your link anchor text by adding generic ones; try keeping context and linking consistent over a decent period of time. Ensure you don't over-optimise keywords, and be aware of compound keywords both on-site and in your anchor text. Remember, just because Google doesn't show 90% of your links in Webmaster Tools, it knows they are there. Use another tool which reports all links to see what your real backlink profile is like.

From now on I'm going to do a simple "litmus test" on any site I'm asked to appraise. A five-minute check to see whether it breaks these rules. Like many, I'm happy to help, and would love to see people succeed in their business using time-effective best practice methods. But you have to help yourself; don't be in denial. Did you really check your content? Are your links that good? Have you appraised your onsite optimisation?

So you honestly expect a 231-word site to make you $4000 a month forever by using a $57-a-month linking subscription on autopilot? Really? Are you THAT naïve?
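That five-minute litmus test could be as simple as combining the three numbers already discussed. A sketch, where the thresholds are this post's rules of thumb rather than anything official, and `litmus_test` is just an illustrative name:

```python
def litmus_test(density_pct, links_per_word, exact_anchor_pct):
    """Quick pass/fail against the rough limits used in this post:
    2.5% keyword density (compounds included), ~1 link per word of
    unique content, and 33% exact-match anchor text."""
    failures = []
    if density_pct > 2.5:
        failures.append("keyword stuffed (>2.5% density, compounds included)")
    if links_per_word >= 1:
        failures.append("unrealistic link profile (>=1 link per word)")
    if exact_anchor_pct > 33:
        failures.append("anchor text not varied (>33% exact match)")
    return failures or ["passes the quick litmus test"]

# The beat-software site's numbers: fails on all three counts
print(litmus_test(4.0, 108, 90))
```

It will not replace a proper audit, but it would have flagged every one of the 11 sites above before anyone wasted a weekend on them.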