How important is unique content, really?

xxdanxx (Newbie, joined Aug 6, 2012)
I just have a quick question. We all know the search engines want unique content. How is it, then, that news aggregation sites are able to rank?

I'm not talking about original sources of news. I'm talking about sites that are basically nothing more than copy and paste from the original source.

Thanks!
 
Hmm, I didn't understand your question! Only sites with high authority and fresh news, updated daily, are able to rank. But I use TV for news.
 
Since the beginning of the internet, it has been the single most important thing about a website... and it will continue to be the most important thing about a website until the earth crashes into the sun, assuming wifi isn't available in space.
 
OK, being more specific: sites like DrudgeReport are nothing but aggregators of news posted on other sites. These "news" stories are nothing more than links to the original source, yet they rank pretty high. There's very little, if any, original content. How can this be?
 
Well, I'm currently on a journey thread testing out unique content alone without SEO to see where it ranks.

Check it out.

Cheers.
 
Very important, unique high quality content is something definitely worth investing in.
 
I was making websites for a particular niche and only got to slot #4 for the main keyword; then I got penalized for years. I was only doing sales pages.

I went back to the same niche, wrote a lot of unique articles, and I'm now ranked very high for those keywords in a short amount of time. Consistently, too.

It matters a ton.
 
I've never seen it mentioned on here, but on Warrior Forum the difference between 'duplicate' content and 'syndicated' content gets discussed pretty regularly. The fact that it never gets discussed here but does on Warrior Forum suggests it might be BS, but it does explain how news aggregator sites rank: news articles are treated as syndicated content.
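For what it's worth, one documented way a syndicating site signals "this is syndicated, not stolen" is a cross-domain rel="canonical" link pointing back at the original article. The sketch below is a minimal stdlib-only checker (all URLs and the `canonical_is_cross_domain` helper are hypothetical examples, not anything from this thread) that extracts a page's canonical URL and reports whether it points off-domain:

```python
# Sketch: detect whether a page declares a cross-domain canonical, which is
# one documented way syndicated copies point search engines at the original.
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_is_cross_domain(page_url, page_html):
    """Return (canonical_url, is_cross_domain). is_cross_domain is True
    when the canonical points at a different host than the page itself."""
    finder = CanonicalFinder()
    finder.feed(page_html)
    if finder.canonical is None:
        return None, False
    same_host = urlparse(finder.canonical).netloc == urlparse(page_url).netloc
    return finder.canonical, not same_host

# Hypothetical syndicated copy pointing back at the original source:
html = '<head><link rel="canonical" href="https://original-site.com/story"></head>'
print(canonical_is_cross_domain("https://aggregator.com/story-copy", html))
# → ('https://original-site.com/story', True)
```

Whether G actually rewards this is anyone's guess, but it is at least the official mechanism for marking content as syndicated rather than duplicate.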
 
If G considers your site to be an authority, you can syndicate news/content and even rank higher than the content source.
 
Just like the heart in your body, unique content is the heart of your site.
 
Duplicate: short term (2-3 months maximum)
Unique: long term

I have ranked on duplicate content, but problems come up after a few weeks.
 
...it does explain how news aggregate sites rank as news articles are treated as syndicated content.

THANK YOU for actually reading the post and giving an answer. This is what I was looking for.
 
THANK YOU for actually reading the post and giving an answer. This is what I was looking for.

Google's reply to this problem was BRANDS. They hand-tune the algo to privilege brands. How many aggregator sites show up in search results? Not many; a handful. In fact, aggregators are penalized. I recall the netscape.com fiasco: it was completely gone from Google. The few aggregators you do see in Google are hand-dialed in; they're artificially overvalued by Google, like big brands such as microsoft.com, yahoo.com and so on. Try starting your own news aggregator and see if you rank at all. You won't. It's the money, man! That's what's pretty f'd up about Google lately; it's all about the money and brands.
 
I have a theory on this that might be BS or might be true; I am not an 'SEO GURU'.

I have a nice EMD for a medical term that gets X,XXX searches on G a month. I put up a really crap site using a spammy niche website-building host, WhyPark, which I am sure G penalizes for the sheer volume of shite microsites hosted there.

For content I cut and pasted Wikipedia articles on the exact and very similar medical areas. I started at page 16 and over 4-6 weeks got closer to page 1. I uploaded 7-8 high-quality wiki articles; the medical stuff there is decent. I had one backlink, a blog comment, but from a PR8-9 site like Forbes. Can't remember specifically, but it was a top site.

Anyway, Wiki was ranking No. 1 for my keyword and I hit No. 4, then bombed out. My theory is that duplicate content is fine up until the two sites that share the duplicate content come into close proximity in the SERPs; then the least trusted one gets bombed.

If anyone thinks differently, I would love to hear it; you no doubt know better than I do.

XXXDANXXX

I may try an experiment off the back of this thread. Regarding the syndicated content comment: maybe the fact that those sites have a lot of their own unique content means they are able to carry copied content and get away with it. Maybe G sees, for example, 20% copied content as OK. I may try to re-rank the same domain using the copied content again but add 90% unique content, then take away the unique content and see at what point (%) it bombs again.
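If you run that experiment, you need a consistent way to measure what fraction of a page is copied from a source. Word n-gram "shingles" are a standard near-duplicate-detection technique; the sketch below (example texts are made up, and the 20%/90% thresholds in the post are pure guesses, not anything G has confirmed) computes the fraction of a page's shingles that also appear in the source:

```python
# Sketch: estimate the copied-content fraction of a page via word-trigram
# shingles. This is a standard near-duplicate measure; whatever threshold
# G actually uses (if any) is unknown.
def shingles(text, n=3):
    """Set of overlapping word n-grams ("shingles") in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def copied_fraction(page_text, source_text, n=3):
    """Fraction of the page's shingles that also appear in the source
    (containment, not symmetric Jaccard, since we care about the page)."""
    page = shingles(page_text, n)
    if not page:
        return 0.0
    return len(page & shingles(source_text, n)) / len(page)

# Hypothetical example: a page that wraps copied text in some original intro.
source = "the quick brown fox jumps over the lazy dog near the river bank"
page = "my own intro text here the quick brown fox jumps over the lazy dog"
print(round(copied_fraction(page, source), 2))  # → 0.58
```

Re-running this as you dial unique content up and down would give the percentage axis for the bomb-out test, instead of eyeballing it.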
 
My theory is that duplicate content is fine up until the two sites that have the duplicate content come in close proximity in the serps, then the one least trusted gets bombed.

Thanks for sharing your experience. Sounds like a plausible theory.
 
They rank because they are both very high-PR and authoritative sites. Guessing you're thinking of sites like the Guardian, BBC et al. It's a fairly short-term tactic though, because once another news outlet publishes something better and unique, Google will be all over it. Game over. Breaking news being what it is means being there first is often best, even if it's something cruddy and scraped.

As for scraping content as a technique for filling an entire site: nope, not a chance. It's a short-term tactic and a numbers game.

Anyway Wiki was ranking No.1 for my keyword and I hit No. 4 then bombed out. My theory is that duplicate content is fine up until the two sites that have the duplicate content come in close proximity in the serps, then the one least trusted gets bombed.

Think you're kinda right. I'd also wager that the closer a website gets to ranking for something "important", the more of a chance you'll have to pass for a decent read. Could be total tosh I'm talking, of course! I tend to launch new sites with fairly rubbish content and then "upgrade" everything to pure quality as I'm hovering at the top of page 2 for my money k/w. This tends to light a rocket under my site and stick it to page 1 fairly well once indexed.
 