Never do it this way if you want to rank high! Crucial SEO mistakes

LinkBuildingServices · Elite Member · Joined Feb 23, 2011
SEO is an important promotional tool for any online business. If your site is not visible in Google, you will have trouble finding new customers interested in your services. About half of the marketers surveyed admitted that more than 25% of their customers find their site through search engines. So we've put together a checklist of basic SEO principles everyone should follow. To an SEO guru they may seem obvious, but newbies will find it useful, that's for sure :)

1. Duplication / content theft. Content is duplicated when two or more pages intentionally or accidentally contain the same information. To search engine "spiders", every unique URL they find is a separate page, even if different addresses refer to the same document. Search engine robots usually discover new addresses through links on pages they already know. Links can be internal (within the site) or external, i.e. from another resource. Webmasters often create different URLs that lead to the same document. Usually this is unintentional, but the content duplication problem occurs either way. Duplicated content is especially widespread on large, dynamic websites, but small sites often face it as well. Content theft has also been a major issue for many webmasters and site owners. You can spend weeks creating a unique article, but if someone steals it and gets it indexed by Google faster than you, you'll lose both traffic and content uniqueness.
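When the same document lives at several URLs, one widely used remedy is a canonical tag that tells search engines which version to index. A minimal sketch (example.com and the paths are placeholders):

```html
<!-- Placed in the <head> of every duplicate URL, e.g. /shoes?sort=price
     or /shoes?ref=newsletter, pointing at the one version you want indexed.
     "example.com" is a placeholder domain. -->
<link rel="canonical" href="https://example.com/shoes" />
```

The duplicates stay accessible to visitors; search engines simply consolidate their signals onto the canonical address.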



2. Worse page indexing. To load site information into its database, a crawler spends resources – computing power and electricity. Search engine (SE) owners try to save resources and not spend them uselessly. So if the SE determines that the same information is located on many pages, it can stop scanning and indexing the site. At best, SE spiders will stop re-scanning the site's pages as often as the webmaster needs; at worst, no new pages will be indexed, even if the information on them is completely unique.

3. Increased likelihood of penalties from the SE. Sanctions (filters) from SEs lower a site's position by 30-100 places and cut off traffic coming from Google. Duplicate content increases the likelihood of sanctions. Many webmasters mistakenly believe that the more pages a site has, the better. They try to index thousands of duplicated pages, or other pages that are useless to users – for example, pages with the results of internal site search for thousands of different queries. This practice is especially dangerous, as the sanctions are harsh.

4. Low-quality content. Relevant content should be effective not only in terms of classic SEO promotion, but also in terms of behavioral factors. If a webmaster publishes unique text with an optimal keyword distribution, he will definitely get traffic from SEs. However, if such texts do not meet readers' needs, the resource will soon generate adverse behavioral signals that will drop its position.



5. Excessive optimization. This often stems from choosing the wrong website performance metrics. An over-optimized site is not really relevant; it just simulates relevance. Any excess reduces conversion, as headers become unreadable. Over-optimization is often treated as spamming (oversaturating content with keywords). Search engines track these signals and try to lower the positions of such sites. Google primarily filters over-optimized texts, copy-paste and low-quality content.
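A quick way to sanity-check for keyword stuffing is to measure what share of a page's words a keyword takes up. A minimal sketch in Python (the sample text and the function name are my own illustration, not anything from the post):

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that match `keyword`, case-insensitive."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "Buy cheap shoes online. Our cheap shoes are the best cheap shoes."
print(f"{keyword_density(sample, 'cheap') * 100:.1f}%")  # → 25.0%
```

There is no single "safe" number, but if one keyword makes up a quarter of the text, as above, the copy almost certainly reads as spam to both users and search engines.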

6. Lack of metadata. Many people understand the importance of metadata but do not know how to work with it. Piling up keywords and ticking the boxes needed to make the resource searchable is not exactly what you need. Your content strategy (as well as your tags and other elements) should focus on the target audience and searchers' behavior.

7. Lack of adaptation to mobile traffic. Mobile SEO means optimizing a site for mobile search. More than 50% of all internet users now use smartphones, tablets and other devices for web surfing, so improve your website the way Google wants.
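The usual starting point for mobile friendliness is the viewport meta tag, which tells mobile browsers to render at the device width instead of pretending to be a desktop screen:

```html
<!-- Goes in the <head> of every page; without it, phones render the page
     at desktop width and then shrink it to an unreadable size. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

The tag alone does not make a layout responsive, but Google's mobile-friendly test will flag its absence first.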

[Image: Google mobile-friendly test]


8. Slow loading speed. Google confirmed long ago that website loading speed affects rankings. Basically, optimization involves compressing images and graphics, enabling caching, and reducing the size of the pages served.
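On an Apache server, compression and browser caching can be enabled with a few lines of configuration. A hypothetical `.htaccess` sketch, assuming `mod_deflate` and `mod_expires` are available (other servers like nginx have their own equivalents):

```apache
# Compress text responses before sending them.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache images for a month instead of re-downloading them.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```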

9. Lack of a user-friendly error page. This is just another ranking factor confirmed by Google.
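On Apache, pointing visitors to a custom error page is a one-line change. A hypothetical sketch (the path `/404.html` is a placeholder; your server and page will differ):

```apache
# Serve a friendly page instead of the server's bare default error.
ErrorDocument 404 /404.html
```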

10. Lack of updates on the site. Creating quality content, revising design and doing SEO are just some of the essential steps in building a professional online project. Search engines like sites with active, constantly updated content. This can be achieved with a thematic blog, social network integration, or simply updating the main page.

11. Automatic or manual site registration in directories. Search engines consider the sudden emergence of dozens of backlinks a sign of manipulation.

12. Manual/automatic posting on forums. This link-building method is hopelessly outdated.



13. Blog commenting. If you leave a few comments on different platforms, nothing will happen – it will in no way affect your website's credibility. But if you leave hundreds of spam comments, your website can get sanctioned.

14. Links in the site's footer. If you are a web developer who provides internet marketing services, your partners will often link to you, and search engines regard such links as normal. But if you sell spare parts for cars, or write about finance, a large number of footer links looks suspicious. If footer links are not marked with the "nofollow" attribute and have keyword anchors, you can get sanctioned.
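Marking a footer link as nofollow is a one-attribute change. A sketch (the URL and anchor text are placeholders):

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit
     through this link, which defuses the "paid footer link" signal. -->
<a href="https://example.com" rel="nofollow">Example partner</a>
```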

15. Broken links. Periodically check the site for broken links using Free Link Checker or a similar tool.
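The first half of any link check – collecting the links from a page – fits in a few lines of standard-library Python. A minimal sketch (the sample HTML is my own illustration); actually verifying each URL would mean issuing a request per link, e.g. with `urllib.request`, which is left out here to keep the example offline:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<p><a href="/about">About</a> and <a href="https://example.com">home</a></p>'
print(extract_links(page))  # → ['/about', 'https://example.com']
```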

16. Improper filling of "robots.txt". If you do not want to block any pages from indexing, simply do not fill this file – one wrong rule can accidentally hide important pages.
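A minimal robots.txt sketch for when you do need to block something (the paths and domain are placeholders):

```text
# Block only what you must; everything else stays crawlable.
User-agent: *
Disallow: /admin/
Disallow: /search

# Optionally point crawlers at your sitemap.
Sitemap: https://example.com/sitemap.xml
```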

17. Improper use of the "noindex" tag – one stray tag can hide important pages from search engines.
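For reference, this is what the tag looks like; audit which pages actually carry it:

```html
<!-- Keeps this one page out of the index while still letting
     crawlers follow the links on it. -->
<meta name="robots" content="noindex, follow">
```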

18. Absence or incorrect use of redirects – for example, using a temporary 302 where a permanent 301 is needed.
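A common permanent redirect is sending all HTTP traffic to HTTPS. A hypothetical Apache `.htaccess` sketch, assuming `mod_rewrite` is enabled:

```apache
# 301 (permanent) redirect every http:// request to its https:// twin.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```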

19. Using multiple H1 headers on a page. There are basic rules for how many H1, H2 and other headings a page should have; take the time to learn them.
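The conventional structure is one H1 per page, with H2/H3 nested beneath it in order. A sketch (the indentation is only to show nesting; the topic is borrowed from a question later in this thread):

```html
<h1>Apple Pie Recipes</h1>
  <h2>Classic Apple Pie</h2>
    <h3>Ingredients</h3>
    <h3>Method</h3>
  <h2>Dutch Apple Pie</h2>
```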

[Image: HTML guidelines for usability and SEO]


20. Absence of a sitemap. Without one, SE spiders can index your site incorrectly.
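A sitemap is just an XML file listing the URLs you want crawled. A minimal sketch following the sitemaps.org format (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>
```

Upload it to the site root and submit it via Google Search Console, or reference it from robots.txt.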

Have you got anything to add? Your comments are much appreciated.
 
When it comes to AdSense, do not add too many ads to a page, and avoid using more than one ad at the top of the page. Your site might be in huge trouble if you do.
 

ThatSEO, thanks for taking the time to check that out. We've got a lot more helpful information on our blog, but we in no way intend to copy-paste everything from our blog to this forum. As you might have guessed, not all forum users are our blog readers, so we thought it appropriate to place one of our best picks here. We really think people who are new to online marketing and SEO might find something helpful in this read. We're not providing any blog links, nor do we intend to advertise anything with this post. If the moderators find that this violates any of the rules, they're free to delete it anytime. No offense.
 
I didn't say you broke rules and it's not my job to decide, or that I care :)

Just you kinda broke one of your own rules ;)
 
Good rundown for newer people, thanks.
 
Just you kinda broke one of your own rules ;)

he didn't, actually
it's not duplicate content on his site
it's duplicate content on BHW
 
I didn't say you broke rules and it's not my job to decide, or that I care :)

Just you kinda broke one of your own rules ;)

i felt like someone just like you is gonna come and say " oh but he isn't trying to rank this thread " :D
 
Also popular mistakes:

You don't have a proper plan or roadmap for doing SEO on your website
You don't have a proper checklist of SEO elements to fix – or if you do, you have it all wrong
You have made mistakes in selecting keywords for your brand
Wrong keyword selection leads to wrong keyword research and analysis

Since you know the quote "Content is King", you are producing a hell of a lot of content to impress search engines, but not humans

You are not paying attention to the search query report in your Google Analytics
Keywords are important, but you are exceeding a sensible keyword density on your website
You are doing spammy link building and getting hammered by Google for black hat SEO
You are not doing competitive analysis and benchmarking
 
Content is duplicated when two or more pages intentionally or accidentally contain the same information. To search engine "spiders", every unique URL they find is a separate page, even if different addresses refer to the same document.

Does duplicate content mean 100% the same content across the whole page?

What if let's say just a portion of the page was the same as another?

Let's say you're creating a recipe site for variants of apple pie.

So apple pie recipe pages 1, 2, 3, 4 and 5 have a similar structure in how the recipe is written, but maybe they use different types of flour, sugar, etc.

For example:

*Pour the {self raising | plain} flour in a bowl
*Add some {white | brown | pink} sugar and stir it

And of course different images.

Basically just wondering if it works the way a plagiarism checker does
 
Does duplicate content mean 100% the same content across the whole page?

What if let's say just a portion of the page was the same as another?

Let's say you're creating a recipe site for variants of apple pie.

So apple pie recipe pages 1, 2, 3, 4 and 5 have a similar structure in how the recipe is written, but maybe they use different types of flour, sugar, etc.

For example:

*Pour the {self raising | plain} flour in a bowl
*Add some {white | brown | pink} sugar and stir it

And of course different images.

Basically just wondering if it works the way a plagiarism checker does

Something like your example would be fine, as there are only a few ways to write "pour the flour in a bowl". However, the rest of the article would have to be fresh content.
 
I read a post on some site that said HTTPS and shorter URLs are a ranking factor. Do I need to worry about that?

Yes, HTTPS is a ranking factor and it is very important for Google. It is known that, in the future, when a user visits a website that does not use HTTPS, they will see a message that the website is not safe. So this really matters.
As for URLs, just mind that they should not be too long – this is about usability. A URL should contain the keyword and be relevant to your topic.


Does duplicate content mean 100% the same content across the whole page?

What if let's say just a portion of the page was the same as another?

Let's say you're creating a recipe site for variants of apple pie.

So apple pie recipe pages 1, 2, 3, 4 and 5 have a similar structure in how the recipe is written, but maybe they use different types of flour, sugar, etc.

For example:

*Pour the {self raising | plain} flour in a bowl
*Add some {white | brown | pink} sugar and stir it

And of course different images.

Basically just wondering if it works the way a plagiarism checker does

Google checks the whole page, including the sidebar menu, footer and banner texts, and comments. If the match is not 100%, there is nothing to worry about. If such a situation does occur (pages with similar patches of text, like in your apple pie example), you should try encouraging your visitors to leave comments.
There is also a way to show Google which pages (apple pie 2, apple pie 3) are closely connected with apple pie 1, so it does not consider all of them non-unique. If you are interested, I can tell you more.
 
I read a post on some site that said HTTPS and shorter URLs are a ranking factor. Do I need to worry about that?

Yes, HTTPS and shorter URLs are ranking factors. A site that uses HTTPS encrypts the user's data, which gives a better experience for the user and can also help your site gain rankings.
Have a URL that clearly tells the user what the article is about by using the main keywords.
 