Discussion in 'Black Hat SEO' started by sagarpatil, Apr 8, 2012.
Any detailed insights into how Google differentiates original content from duplicate content?
I don't know how they evaluate the uniqueness of content, though I suspect their system is far more sophisticated than Copyscape.
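For what it's worth, one classic academic technique for near-duplicate detection is w-shingling with Jaccard similarity (Broder's work from the late 90s). Whether Google uses anything like this is pure speculation on my part, but here's a minimal sketch of the idea, just to show how "mostly copied" text can be scored without an exact string match:

```python
def shingles(text: str, w: int = 3) -> set:
    """Return the set of w-word shingles (overlapping word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: str, b: str, w: int = 3) -> float:
    """Jaccard similarity of the two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river"
copy     = "the quick brown fox jumps over the lazy dog near the river"
spun     = "a fast brown fox leaps over a sleepy dog beside the river"

print(jaccard(original, copy))  # identical text -> 1.0
print(jaccard(original, spun))  # "spun" rewrite scores much lower
```

Exact copies score 1.0, heavily spun text scores near 0, and lightly edited copies land somewhere in between. At web scale this would be done with sketches/hashing (e.g. MinHash or SimHash) rather than full set comparisons, but the scoring idea is the same.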
Unique does not mean "written once - posted once"
Syndication of good-quality content, if not overdone, is absolutely fine with them.
Keep the numbers down
Always link back to the original source
At the risk of being flamed here, I'd say they don't have a very good system for checking at all. Sometimes they seem to "pick a niche" and "go deep" on analysis; this seems more prevalent in spammy niches.
Link velocity and source seem to be more of a factor. They look for an "appropriate linking pattern" built up over time, with social signals and across a range of platforms. If you do that, you tend to win in the end whether or not you syndicate now and again.
I can't post URLs yet, so do this: in Google, search "duplicate content" and read the first result that's a Google support URL.
It's discussed at great length and in good detail there.
Google must be judging content with a system that gives equal weight to uniqueness and quality. Low-quality content with keyword stuffing should be penalized too.