LinkBuildingServices
Elite Member
SEO is an important promotional tool for any online business. If your site doesn't show up in Google, you will have trouble finding new customers interested in your services. About half of the marketers surveyed admitted that more than 25% of their customers find their site through search engines. So we've put together a checklist of basic SEO principles everyone should follow. To an SEO guru they may seem obvious, but newbies will find them useful, that's for sure
1. Duplication / content theft. Content is duplicated when two or more pages, intentionally or accidentally, contain the same information. To search "spiders", each unique URL they can find is a separate page, even if several addresses lead to the same document. Search engine robots usually discover new addresses through links on pages they already know; links can be internal (within the site) or external, i.e. from another resource. Webmasters often create different URLs that lead to the same document. Usually this is not intentional, but the content duplication problem appears all the same. Duplicated content is especially widespread on large, dynamic websites, but small sites often face it as well. Content theft has been a major issue for many webmasters and site owners: you can spend weeks creating a unique article, but if someone steals it and gets it indexed by Google before you do, you lose both traffic and content uniqueness.
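If you want a quick way to spot accidental duplicates among your own pages, here is a minimal Python sketch. It assumes the requests library and uses made-up example URLs, and it only catches byte-identical pages; near-duplicates need a smarter comparison.

```python
# Minimal duplicate-content check: flag URLs that return byte-identical HTML.
# Assumes the `requests` library; the URLs below are hypothetical examples.
import hashlib
import requests

urls = [
    "https://example.com/page",
    "https://example.com/page?utm_source=newsletter",
    "https://example.com/page/",
]

seen = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
    if digest in seen:
        print(f"Duplicate content: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```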
2. Worse page indexing. To load site information into its database, a crawler spends resources: computing power and electricity. Search engine (SE) owners try to save those resources rather than waste them. So if the SE determines that the same information sits on many pages, it can scale back scanning and indexing of the site. At best, SE spiders will stop re-scanning the site's pages as often as the webmaster needs; at worst, no new pages will be indexed, even if the information on them is completely unique.
3. Increased likelihood of penalties from the SE. Sanctions (filters) from SEs can drop a site's positions by 30-100 places and cut off the traffic coming from Google, and duplicate content makes sanctions more likely. Many webmasters mistakenly believe that the more pages a site has, the better, so they try to get thousands of duplicated pages indexed, or other pages that are useless for users - for example, internal site search results for thousands of different queries. This practice is especially dangerous, as the sanctions are harsh.
4. Low-quality content. Relevant content should work not only in terms of classical SEO promotion, but also in terms of behavioral factors. If a webmaster publishes a unique text with a sensible keyword distribution, it will certainly pull traffic from SEs. However, if those texts don't meet readers' needs, the site will soon accumulate poor behavioral signals that drag its positions down.
5. Excessive optimization. This problem usually comes from measuring website performance by the wrong metrics. An over-optimized site is not really relevant; it only simulates relevance. Any excess also hurts conversion, because headings stuffed with keywords become unreadable. Over-optimization is often treated as spam (oversaturating content with keywords). Search engines track these signals and push such sites down. Google primarily filters over-optimized texts, copy-pasted material and low-quality content.
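A rough way to sanity-check your own texts is keyword density. The sketch below is only an illustration (the threshold and sample text are made up); it counts how much of a text one phrase occupies.

```python
# Rough keyword-density check - an illustration, not an official threshold.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of the text's words taken up by the given phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return 100.0 * hits * len(keyword.split()) / max(len(words), 1)

sample = "Buy cheap widgets. Our cheap widgets are the cheapest widgets online."
density = keyword_density(sample, "cheap widgets")
print(f"{density:.1f}% of the text is the phrase 'cheap widgets'")
if density > 5.0:  # arbitrary sanity threshold for this example
    print("That reads like keyword stuffing - rewrite it for humans")
```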
6. Lack of metadata. Many people understand that metadata matters but don't know how to work with it. Dumping a pile of keywords into your tags is not what you need. Your content strategy (and your titles, descriptions and other elements) should focus on the target audience and on how people actually search.
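Here is a small sketch for auditing a page's title and meta description. It assumes the requests and beautifulsoup4 libraries, the URL is a placeholder, and the length limits are common rules of thumb rather than official Google numbers.

```python
# Sketch: audit a page's <title> and meta description.
# Assumes requests + beautifulsoup4; length limits are rules of thumb.
import requests
from bs4 import BeautifulSoup

def check_metadata(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""
    if not title or len(title) > 60:
        print(f"{url}: missing or overly long <title> ({len(title)} chars)")
    if not desc or len(desc) > 160:
        print(f"{url}: missing or overly long meta description ({len(desc)} chars)")

check_metadata("https://example.com/")  # hypothetical URL
```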
7. Lack of adaptation to mobile traffic. Mobile SEO means optimizing the site for mobile search. More than half of internet users now browse the web from smartphones, tablets and other devices, so bring your website in line with what Google expects.
8. Slow page loading. Google confirmed long ago that loading speed affects rankings. Optimization basically means speeding up image loading, plus caching and compressing graphics and pages on the server.
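For a quick first look you can measure the raw server response and check whether compression is on. A minimal sketch, assuming the requests library and a placeholder URL; it only measures the HTTP response, not full page rendering.

```python
# Sketch: rough server response check - HTTP latency and Content-Encoding only.
# Assumes the `requests` library; the URL is a placeholder.
import requests

resp = requests.get("https://example.com/", timeout=10)
print(f"Status {resp.status_code}, "
      f"{resp.elapsed.total_seconds():.2f}s to respond, "
      f"Content-Encoding: {resp.headers.get('Content-Encoding', 'none')}")
```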
9. Lack of a user-friendly error page. This is just another ranking factor claimed by Google; a helpful 404 page also keeps lost visitors on the site instead of sending them back to the search results.
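Related to error handling: a nonexistent address should return a real HTTP 404, not a 200 "soft 404". A tiny sketch, assuming the requests library and a made-up path:

```python
# Sketch: soft-404 check - a made-up path should return HTTP 404, not 200.
# Assumes the `requests` library; URL and path are placeholders.
import requests

resp = requests.get("https://example.com/definitely-not-a-real-page-12345", timeout=10)
if resp.status_code == 404:
    print("Good: missing pages return a proper 404")
else:
    print(f"Suspicious: got {resp.status_code} instead of 404")
```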
10. Lack of updates on the site. Quality content, design revisions and SEO are only some of the essential steps in building a professional online project. Search engines like sites with active, constantly updated content. You can keep things fresh with a topical blog, social network integration, or simply by updating the main page.
11. Automatic or manual site registration in directories. Search engines treat the sudden appearance of dozens of directory backlinks as a sign of manipulation.
12. Manual or automatic posting in forums. This link building method is hopelessly outdated.
13. Blog commenting. If you leave a few comments on different platforms, nothing will happen; it won't affect your website's credibility either way. But if you leave hundreds of spam comments, your website can be hit with sanctions.
14. Links in the site's footer. If you are a web developer who provides internet marketing services, your partners will often link to you, and search engines regard such links as normal. But if you sell car parts or write about finance, a large number of footer links looks suspicious. If footer links are not marked with the "nofollow" attribute and use keyword anchors, you can get sanctions.
15. Broken links. Periodically check the site for broken links using Free Link Checker or a similar tool.
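If you'd rather script it yourself, here is a minimal single-page sketch (it assumes the requests and beautifulsoup4 libraries and a placeholder start URL; a real checker would crawl internal links recursively):

```python
# Minimal broken-link check for a single page.
# Assumes requests + beautifulsoup4; the start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

start = "https://example.com/"
soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(start, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript:, anchors, etc.
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken: {link} ({status})")
```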
16. Improper configuration of "robots.txt". If you don't want to close any pages from indexing, simply don't fill this file in.
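It's easy to block more than you meant to, so it's worth verifying that your key pages are still crawlable. A small sketch using only Python's standard library; the URL and paths are placeholders.

```python
# Sketch: verify robots.txt still allows your important pages (standard library only).
# The URL and paths are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

for path in ["/", "/products/", "/blog/"]:  # pages you expect to be crawlable
    if not rp.can_fetch("Googlebot", "https://example.com" + path):
        print(f"Warning: robots.txt blocks {path} for Googlebot")
```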
17. Improper use of the "noindex" tag. A stray noindex can quietly remove important pages from the search results.
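A quick way to catch that is to check the robots meta tag on pages that must stay indexed. A sketch assuming requests and beautifulsoup4, with a placeholder URL:

```python
# Sketch: warn if a page carries a robots "noindex" directive.
# Assumes requests + beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/important-page"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
robots = soup.find("meta", attrs={"name": "robots"})
if robots and "noindex" in (robots.get("content") or "").lower():
    print(f"{url} is set to noindex - make sure that's intentional")
```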
18. Absence or incorrect use of redirects.
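For example, a moved page should reach its new address in a single permanent redirect rather than a long chain. A sketch for inspecting a redirect chain, assuming the requests library and a placeholder URL:

```python
# Sketch: inspect a redirect chain.
# Assumes the `requests` library; the URL is a placeholder.
import requests

resp = requests.get("https://example.com/old-page", allow_redirects=True, timeout=10)
for hop in resp.history:
    print(f"{hop.status_code} {hop.url} ->")
print(f"{resp.status_code} {resp.url} (final, after {len(resp.history)} redirect(s))")
```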
19. Multiple H1 headings on one page. There are basic rules for how many H1, H2 and other headings a page should have; take the time to learn them.
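Counting headings is easy to automate. A sketch assuming requests and beautifulsoup4 with a placeholder URL; the "exactly one H1" rule is the common convention, not a hard requirement from Google.

```python
# Sketch: count heading tags on a page.
# Assumes requests + beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://example.com/", timeout=10).text, "html.parser")

h1_count = len(soup.find_all("h1"))
if h1_count != 1:
    print(f"Found {h1_count} H1 tags - the usual advice is to keep exactly one")

for level in range(1, 7):
    print(f"h{level}: {len(soup.find_all(f'h{level}'))}")
```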
20. Absence of a sitemap. Without one, SE spiders may index your site incorrectly or miss pages altogether.
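If your CMS doesn't generate one for you, even a hand-rolled sitemap.xml helps. A minimal sketch using only the standard library; the URLs are hypothetical.

```python
# Sketch: write a minimal sitemap.xml for a handful of pages (standard library only).
# The URLs are hypothetical; real sitemaps usually come from your CMS or database.
from xml.sax.saxutils import escape

pages = [
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/blog/",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in pages)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
print(sitemap)
```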
Have you got anything to add? Your comments are much appreciated.