
I made money on day one, and I will not fear the duplicate content penalty because there is none!

Discussion in 'Black Hat SEO' started by seated, Dec 23, 2009.

  1. seated

    seated Regular Member

    Joined:
    Jan 15, 2009
    Messages:
    219
    Likes Received:
    139
    Occupation:
    Production Manager
    Location:
    Sunny South Florida
    I'm new here and want to thank the many folks who really go out of their way to help with straight answers.

    I want to show you the truth about autoblogs and how you can make money with them.

    After reading a ton of WSOs, I saw over and over that there is no real "Google duplicate content penalty," or DCP.

    Well, how can that be? I see DCP warnings on BHW daily. So I did some searching on my own, and here is what I found: the DCP is, in many cases, misinformation.

    The following article was posted on Google's Webmaster Central Blog.
    Please read the entire article.

    For Google's official word on this, please go to:
    http://googlewebmastercentral.blogspot.com/ and search for the word "Duplicate"

    Demystifying the "duplicate content penalty"
    Friday, September 12, 2008 at 8:30 AM

    Duplicate content. There's just something about it. We keep writing about it, and people keep asking about it. In particular, I still hear a lot of webmasters worrying about whether they may have a "duplicate content penalty."

    Let's put this to bed once and for all, folks: There's no such thing as a "duplicate content penalty." At least, not in the way most people mean when they say that.

    There are some penalties that are related to the idea of having the same content as another site—for example, if you're scraping content from other sites and republishing it, or if you republish content without adding any additional value. These tactics are clearly outlined (and discouraged) in our Webmaster Guidelines:

    * Don't create multiple pages, subdomains, or domains with substantially duplicate content.

    * Avoid... "cookie cutter" approaches such as affiliate programs with little or no original content.

    * If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

    (Note that while scraping content from others is discouraged, having others scrape you is a different story; check out this post if you're worried about being scraped.)

    But most site owners whom I hear worrying about duplicate content aren't talking about scraping or domain farms; they're talking about things like having multiple URLs on the same domain that point to the same content. Like www.example.com/skates.asp?color=black&brand=riedell and www.example.com/skates.asp?brand=riedell&color=black. Having this type of duplicate content on your site can potentially affect your site's performance, but it doesn't cause penalties. From our article on duplicate content:

    Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.

    This type of non-malicious duplication is fairly common, especially since many CMSs don't handle this well by default. So when people say that having this type of duplicate content can affect your site, it's not because you're likely to be penalized; it's simply due to the way that web sites and search engines work.
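
    Side note from me, not from the article: here is a tiny Python sketch (purely my own illustration, nothing Google has published) of why those two skates.asp URLs count as one page rather than two - sort the query parameters and the URLs collapse into a single canonical string.

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    def normalize(url):
        # Sort the query parameters so that their order stops mattering.
        parts = urlparse(url)
        query = urlencode(sorted(parse_qsl(parts.query)))
        return urlunparse(parts._replace(query=query))

    a = "http://www.example.com/skates.asp?color=black&brand=riedell"
    b = "http://www.example.com/skates.asp?brand=riedell&color=black"
    print(normalize(a) == normalize(b))  # True: two URLs, one page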

    Most search engines strive for a certain level of variety; they want to show you ten different results on a search results page, not ten different URLs that all have the same content. To this end, Google tries to filter out duplicate documents so that users experience less redundancy. You can find details in this blog post, which states:

    1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.
    2. We select what we think is the "best" URL to represent the cluster in search results.
    3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.
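
    Again, not from the article - just a rough Python sketch of that group / pick-the-best / consolidate idea. The fingerprints and link counts below are made up for illustration; this is nothing like Google's real algorithm.

    from collections import defaultdict

    # Each URL maps to (content fingerprint, inbound link count) - made-up data.
    pages = {
        "http://www.example.com/skates.asp?color=black&brand=riedell": ("skates", 7),
        "http://www.example.com/skates.asp?brand=riedell&color=black": ("skates", 3),
        "http://www.example.com/boots.asp": ("boots", 5),
    }

    # Step 1: group URLs that share a fingerprint into one cluster.
    clusters = defaultdict(list)
    for url, (fingerprint, links) in pages.items():
        clusters[fingerprint].append(url)

    for urls in clusters.values():
        # Step 2: pick a representative URL (here: simply the most-linked one).
        best = max(urls, key=lambda u: pages[u][1])
        # Step 3: consolidate properties such as link counts onto it.
        total = sum(pages[u][1] for u in urls)
        print(best, "stands in for", len(urls), "URL(s) with", total, "links")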

    Here's how this could affect you as a webmaster:

    * In step 2, Google's idea of what the "best" URL is might not be the same as your idea. If you want to have control over whether www.example.com/skates.asp?color=black&brand=riedell or www.example.com/skates.asp?brand=riedell&color=black gets shown in our search results, you may want to take action to mitigate your duplication. One way of letting us know which URL you prefer is by including the preferred URL in your Sitemap.
    * In step 3, if we aren't able to detect all the duplicates of a particular page, we won't be able to consolidate all of their properties. This may dilute the strength of that content's ranking signals by splitting them across multiple URLs.
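
    One more aside from me: the Sitemap tip in the first bullet above just means listing only the URL version you want shown. A bare-bones, purely illustrative way to write such a sitemap.xml in Python (the file name and URL are only examples):

    from xml.sax.saxutils import escape

    preferred = ["http://www.example.com/skates.asp?color=black&brand=riedell"]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in preferred:
        # The "&" in the query string has to be escaped as &amp; inside XML.
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")

    open("sitemap.xml", "w").write("\n".join(lines))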

    In most cases Google does a good job of handling this type of duplication. However, you may also want to consider content that's being duplicated across domains. In particular, deciding to build a site whose purpose inherently involves content duplication is something you should think twice about if your business model is going to rely on search traffic, unless you can add a lot of additional value for users. For example, we sometimes hear from Amazon.com affiliates who are having a hard time ranking for content that originates solely from Amazon. Is this because Google wants to stop them from trying to sell Everyone Poops? No; it's because how the heck are they going to outrank Amazon if they're providing the exact same listing? Amazon has a lot of online business authority (most likely more than a typical Amazon affiliate site does), and the average Google search user probably wants the original information on Amazon, unless the affiliate site has added a significant amount of additional value.

    Lastly, consider the effect that duplication can have on your site's bandwidth. Duplicated content can lead to inefficient crawling: when Googlebot discovers ten URLs on your site, it has to crawl each of those URLs before it knows whether they contain the same content (and thus before we can group them as described above). The more time and resources that Googlebot spends crawling duplicate content across multiple URLs, the less time it has to get to the rest of your content.

    In summary: Having duplicate content can affect your site in a variety of ways; but unless you've been duplicating deliberately, it's unlikely that one of those ways will be a penalty. This means that:

    * You typically don't need to submit a reconsideration request when you're cleaning up innocently duplicated content.
    * If you're a webmaster of beginner-to-intermediate savviness, you probably don't need to put too much energy into worrying about duplicate content, since most search engines have ways of handling it.
    * You can help your fellow webmasters by not perpetuating the myth of duplicate content penalties! The remedies for duplicate content are entirely within your control. Here are some good places to start.

    Posted by Susan Moskwa, Webmaster Trends Analyst

    With this information in hand, I created a blog yesterday.
    Here is the breakdown.

    The blog was one day old and got 85% of its visitors from the 10 backlinks I created yesterday. Here's the proof.

    [screenshot: visitor stats]


    I made $6.40 on this blog and another 30-odd cents on a brand-new blog with no backlinks.

    [screenshot: earnings]

    I hope this helps, now go make some money.
     
    • Thanks x 1
  2. seated

    seated Regular Member

    Joined:
    Jan 15, 2009
    Messages:
    219
    Likes Received:
    139
    Occupation:
    Production Manager
    Location:
    Sunny South Florida
    I know, but I have no idea how to do it.
     
  3. tygrus

    tygrus Supreme Member

    Joined:
    Mar 28, 2009
    Messages:
    1,237
    Likes Received:
    827
    Occupation:
    Engineer
    Location:
    Canada
    You got 600 UVs the first day after building your blog and 10 backlinks? I would say that's pretty amazing and not typically the norm, regardless of whether you have duplicate content or not.
     
  4. ryce889

    ryce889 Regular Member

    Joined:
    Jul 23, 2008
    Messages:
    490
    Likes Received:
    152
    Location:
    NY, NY
    Thanks for the share - I always worried about making my content as unique as possible.
     
  5. peter2002

    peter2002 Senior Member

    Joined:
    Jul 8, 2009
    Messages:
    1,120
    Likes Received:
    24,314
    Occupation:
    Internet Marketer, Degrees in Business and Psychol
    Location:
    USA
    I am also convinced there is no duplicate content penalty. I have done it many times in different niches, using only duplicate content for my blogs. I didn't get penalized even once. On the contrary, I got some very good Google rankings (up to top 5) for certain keywords.
     
  6. seated

    seated Regular Member

    Joined:
    Jan 15, 2009
    Messages:
    219
    Likes Received:
    139
    Occupation:
    Production Manager
    Location:
    Sunny South Florida
    They have dropped, but I will set up 20 more links after Xmas.
     
  7. jtliewim

    jtliewim Junior Member

    Joined:
    Jan 16, 2009
    Messages:
    100
    Likes Received:
    61
    10 backlinks and you managed to get 600 UVs? That's really amazing.

    How many posts do you have on that blog?
     
  8. 13future

    13future Newbie

    Joined:
    May 20, 2008
    Messages:
    31
    Likes Received:
    4
    600 UVs is easy if all of his backlinks are from Digg, Reddit, StumbleUpon, etc. Actually, any one of these can get you 600 pretty easily.
     
  9. richcamp

    richcamp Regular Member

    Joined:
    Oct 5, 2009
    Messages:
    315
    Likes Received:
    119
    Yeah, I'm still on the fence on this one. I believe what Google is referring to is duplicate content that results from technicalities, e.g. messy page link structures or redirects, and not duplicate content as blackhatters mean it, e.g. autoblogs, scrapers, etc.

    That's why even once-big autoblogs such as techchuck (thanks to BHW) and mashget still get deindexed.

    This shouldn't discourage you from autoblogging; it's just that you should take caution in doing it.
     
  10. jokel661

    jokel661 Regular Member

    Joined:
    Sep 26, 2009
    Messages:
    437
    Likes Received:
    495
    Occupation:
    Full Time IM & Developer
    Location:
    NYC & Prague
    Read this PDF and you are going to remember me: "The Mystery of the Duplicate Content Penalty"
     

    Attached Files:

    • Thanks x 3
  11. Blackhat_Boy

    Blackhat_Boy Newbie

    Joined:
    Oct 2, 2009
    Messages:
    48
    Likes Received:
    235
    thanks for the share
     
  12. richcamp

    richcamp Regular Member

    Joined:
    Oct 5, 2009
    Messages:
    315
    Likes Received:
    119
    That is the best report on duplicate content I have read so far. It makes a lot of sense and is very easy to follow, especially with all of those screen caps.

    I kinda believed it all along, but this report confirms my suspicions. Good stuff.
     
  13. rey2k5

    rey2k5 Junior Member

    Joined:
    Nov 16, 2009
    Messages:
    130
    Likes Received:
    27
    Nice share. I also have a few sites that contain lots of duplicate content... but they are ranking fine. It is all about the backlinks, I guess. :)