goingreen (Regular Member, joined May 7, 2010):
The whole idea of getting penalized for duplicate content is more about having duplicate pages on the same site.
Back in the day, before Google came into the picture, getting indexed in search engines was based completely on on-page factors. One of the early keyword tactics marketers found was that putting a keyword in a page's filename would help get that page indexed for that keyword. It didn't take long for webmasters to figure out they could just create a thousand identical landing pages with only the filename changed. So you ended up with websites having many duplicate landing pages like mysite.com/keyword1.htm, mysite.com/keyword2.htm, mysite.com/keyword3.htm, etc. This would get each page into the Yahoo index for its chosen keyword. (This was also the heyday of keyword stuffing in meta tags and hidden content.)
It didn't take long for the search engines to figure this out, so duplicate content on the same website soon became a negative factor. By the time Google came along in the late '90s, duplicate content and keyword stuffing were already pretty standard negative factors among the search engines.
Having duplicate content on separate websites is a completely different thing. It's never really been penalized, but the search engines do try to determine which site is the original source. Their goal is to have that source show up as the most relevant result for the content, but they've never actually been that good at achieving it. Even if you are the original source of an article, you still need a lot of SEO to make sure your site will be found.
Duplicate content on separate websites mostly became a big issue because of copyright and article directories. The article directories want to be sure that anything submitted actually belongs to the submitter, and they also want unique content so they stand out as "better than the rest". So when people started submitting articles to the directories for SEO purposes, not having duplicate content became a pretty big issue. Even more so for the people trying to make money with article submissions as their chosen method.
edit - So basically, duplicate content as far as SEO is concerned won't automatically cause problems for your website. The idea is based on some real factors, but it has grown into a misleading myth. In some cases duplicate content might be a negative factor, and in other cases it could actually be good for your site. Just like many other factors, it really depends on how you're using it.
It would be very interesting to see what happens if you change the content to a non-duplicate version.
Going by your experience, this should then result in a much better ranking, I suppose.
Willing to try this?
Why do you even speculate about this when there is a clear statement from Google?
It's interesting from 0:46 to 3:30, and especially at 2:53.
To sum it up:
There is no "duplicate content penalty", but there is a penalty for spam (lower rankings or deindexing), and duplicate content is one indicator of spam.
So, I set up a simple wordpress.com blog about the subject and posted a couple of good quality PLR articles.