Is this statement true? Need some SEO pros!!

Discussion in 'Black Hat SEO' started by pussyback, May 25, 2008.

  1. pussyback

    pussyback Regular Member

    Joined:
    Apr 5, 2008
    Messages:
    402
    Likes Received:
    80
    Is this statement true? Please help.
    "No, the duplicate content penalty only applies to duplicate content within a site. For example, if more than one page on a website has exactly the same content. The penalty does NOT apply if there are many sites on different domain names with the same content."
     
  2. Belexandor

    Belexandor Junior Member

    Joined:
    May 4, 2008
    Messages:
    195
    Likes Received:
    194
    Occupation:
    Chairman for the non-profit organization to help h
    Location:
    BHW.
    That's a very interesting question. If it is true, then what is there to stop me (or you) from taking a 15,000-page website, altering it just slightly, and doubling the amount of AdSense ads, affiliate ads, or whatever else you're pushing without changing the content? Hell, I could buy a few more domain names and put myself over the 1 million web pages online mark if I'm not going to be penalized for duplicate content. Would like to hear what Nicho or Clix think about this one....
     
  3. crsk8andsno56

    crsk8andsno56 Senior Member

    Joined:
    Jan 20, 2008
    Messages:
    998
    Likes Received:
    577
    Bel, you answered the question...
    If this were true, EVERYONE would do it..
    Unique content is needed... sorry :-(
     
  4. Pofecker

    Pofecker Senior Member Premium Member

    Joined:
    Apr 4, 2007
    Messages:
    1,135
    Likes Received:
    7,241
    Location:
    In your attic
    It's a false statement. I'm guessing you got ahold of Ad*Sens*e B*ig, since the creator of that software put that statement in his FAQ.txt. Keep in mind he's trying to sell you on the idea that duplicate content is OK, since that's exactly what he's selling.

    Do yourself a favor: if you must use that script, upload your own articles or run the current ones through an article rewriter. I'd also suggest changing the template to avoid footprints. It would probably be easier to build a site from scratch. :p
     
  5. crsk8andsno56

    crsk8andsno56 Senior Member

    Joined:
    Jan 20, 2008
    Messages:
    998
    Likes Received:
    577
    It's SEMI-true, I guess.... Depends on how you look at it..
    You definitely won't rank as well with it, especially if the article is spread across 400 sites.
     
  6. Belexandor

    Belexandor Junior Member

    Joined:
    May 4, 2008
    Messages:
    195
    Likes Received:
    194
    Occupation:
    Chairman for the non-profit organization to help h
    Location:
    BHW.
    Still not sure about this. I mean, fuck "G's" page rank. If you have AdSense sites, why not just go for volume? True, you won't get the kind of SE traffic that you would if you had unique content/articles on every page, but with enough pages out there it would have to net you some money, wouldn't it? Anyone using this volume strategy for AdSense?
     
  7. aftershock2020

    aftershock2020 Senior Member

    Joined:
    Oct 19, 2007
    Messages:
    981
    Likes Received:
    478
    Well, I can tell you from my experience about dealing with both...

    Duplicate content on multiple websites is allowed under very strict conditions. The secret to mastering ' duplicate content ' is managing it in ' unique presentation. '

    Websites: Same content, different template, different presentation, different order of listing of relevant content on the site pages.

    Example:

    Site A has a yellow banner with a cute kitten on it and offers articles about how to train your cat to use a litter box, etc.

    Site B has a black banner with a dog with his tongue hanging out, offering the same articles and content in a different order.

    Articles can be displayed on your website from anywhere, as long as they are the original, unchanged article from the directory you get them from and they are relevant to your site. They will absolutely NOT be counted as duplicate content, because any and all articles posted within a directory fall under the 'public domain' laws of the international world (just like any press release or news broadcast), to be used within their original presentation format, unchanged and unmodified, and clearly posted as the original presenter's content, AS A REFERENCE OF KEY INFORMATION to be used freely for content building of targeted information resource sites, also known as 'authority niches.'

    The legality of it is in leaving the content, INCLUDING the original author information, intact via links through those directory accounts. With that said, you can have as many duplicate sites as you want with the same target content, as long as cosmetically each one is presented differently, both visually and by random navigation order, to differ from the others. Proof: every site that is powered by Mike Filsaime's butterfly marketing server is an exact duplicate, all selling his same exact ads and products with a different main product as the lead... it's duplicating the same information in a unique presentation.

    Note that ALL of them rank for different keywords in the same niche, so that he dominates them all. All of the 'gurus' do this. Joel Comm, John Reese, Frank Kern, Mark Flavin, RUSSELL BRUNSON... the king of cloning content... heh.

    Nothing new, just not made obvious. That's my insider tip of the day, folks. Check it out and compare for yourselves... do a Google search for these guys by name and read the content. It's all the same with a different package for their individual styles.

    It is exactly what they teach... 'The best way to mislead the enemy of your plan is to tell them exactly what you are going to do.' They find something that makes a little bit of money and then duplicate it. This is how they got around the Google slaps.

    That's why I laugh when people say that web 2.0 is so outstanding; it's the same techniques that have always worked, just wrapped in a different wrapper. The ranking issue is touched on above, but here is a more direct checklist... Pay close attention, because it is going to seem too simple to be true.

    1. Duplicate content in a unique presentation allows for repeat use of content you already have.

    2. Articles are 'public domain' and can be used in ANY website over and over, as long as the original copy is not changed in ANY way, including the original author. You can't go reposting someone's article as your own, but you can post their article from an article directory they have submitted it to, with a proper listing on your site.

    They are bringing you traffic, so give them the backlink by posting properly, folks. This way your sites look completely legitimate and gain ranking. In doing this, you offer an indirect win/win, because the writer will stay in business as well and keep publishing content for you to turn viral for him/her.

    3. Target your duplicate pages for individual series of keywords that are alternates for each site. This will cloak your sites from being related in the search engines because they will not show up on the same keyword search page results.

    Examples:

    Page A- keywords; ' unique cat training ',' litter box training ',' kitten house training.'

    Page B- keywords; ' unique dog training ', ' dog house breaking ',' puppy training.'

    Page C- keywords; ' unique pet training ',' house training pets ', ' how-to train your pet.'

    It will look as if every site is completely independent and non-duplicated to the search engines, because each one looks like an original content site, where you can deposit all of the same related articles and never have any of them clash.

    Here's a freebie for you, just to prove I know what the hell I'm talking about...

    Take all of those sites that you generate and do the following.

    BACKLINK THEM TOGETHER!

    You must do it only with your relevant content sites, of course. The way it works is that you must NEVER make the sites have reciprocal backlinks... this will kill your rank.

    What you must do is link them in the following order...

    You have five sites:

    LINKING PROCESS: Site 1 links to Site 2, Site 5.
    Site 2 links to Site 3, Site 4, Site 5.
    Site 3 links to Site 1, Site 4, Site 5.
    Site 4 links to Site 1, Site 5.
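    For what it's worth, that one-way scheme can be sanity-checked with a few lines of code. This is just my own sketch; the link map below is the one from the list above, with site 5 as a pure link sink:

```python
# Hypothetical link map from the scheme above: site -> sites it links to.
links = {
    1: {2, 5},
    2: {3, 4, 5},
    3: {1, 4, 5},
    4: {1, 5},
    5: set(),  # site 5 receives links but gives none back
}

# A pair (a, b) is reciprocal if each site links to the other.
reciprocal = [
    (a, b)
    for a in links
    for b in links[a]
    if a in links.get(b, set()) and a < b
]
print(reciprocal)  # → [] : no reciprocal backlinks anywhere in the scheme
```

    An empty list confirms that no two sites link back to each other, which is the whole point of the ordering.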

    In closing, to answer your question: yes, it is just fine to duplicate content on offsite webpages. Don't be fooled by what you see, as it isn't always what you think it is, but it is guaranteed to be what the person presenting the information is offering as the concept.

    Hope this helps.
     
    Last edited: May 25, 2008
  8. aftershock2020

    aftershock2020 Senior Member

    Joined:
    Oct 19, 2007
    Messages:
    981
    Likes Received:
    478
    Well, it was to cover a lot of ground for the original question. As for the add-on about the IP addresses, it can be worked out very easily, as I just register a private IP address to each site through my hosting managers; it's never been a problem. Good to note nonetheless...
     
  9. scb335

    scb335 BANNED BANNED

    Joined:
    Mar 5, 2008
    Messages:
    272
    Likes Received:
    438
    Something else to remember about dup. content filters (expanding on aftershock2020's points about presentation) is that a programmed algo is only going to compare text vs. text from the source code of web pages, so banner colors really aren't making a difference. Though, the image file names will--so if one header was named cats.jpg and the other dogs.jpg they're going to appear different to the program reading the files.

    Also, since the entire page source code is being compared, any and all text that appears in the site template (the header, sidebar(s) and footer of your page) is being compared too.

    In other words, you can put the same 300-word article on 2 pages, but if there are 400 collective words of different text in the templates of those 2 pages, then less than half of each page will actually be the same.
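    That arithmetic is easy to sanity-check with a toy word-level comparison. This is purely illustrative (distinct placeholder tokens stand in for real words, and real filters are certainly more sophisticated):

```python
from difflib import SequenceMatcher

# Hypothetical pages: the same 300-word article pasted into two
# different 400-word templates.
article = [f"article{i}" for i in range(300)]
template_a = [f"cats{i}" for i in range(400)]
template_b = [f"dogs{i}" for i in range(400)]

page_a = template_a + article
page_b = template_b + article

# Word-level similarity of the full pages (1.0 = identical).
ratio = SequenceMatcher(None, page_a, page_b, autojunk=False).ratio()
print(round(ratio, 2))  # → 0.43, well under 0.5 despite the identical article
```

    So the two pages come out roughly 43% similar, even though the article itself is a 100% copy.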

    It's why good mashup sites (like Technorati) don't suffer dup. content penalties--even though all of their content is duplicated, the fact that it comes in small chunks from so many different sources means the actual page isn't going to look identical enough to any of them to trigger a red flag.

    And since large sites tend to be template driven, dup. content filters have to allow for some margin of dup. content before sending up a red flag, otherwise no large sites would ever have anything but their main pages indexed.

    There's a lot of debate over what that margin percentage might be; my guess, based off my own sites and where I have and haven't suffered apparent penalties, is that it's right around that 50% mark.

    So, if your site has plenty of unique or dynamic content showing up in the template regions, then in theory you should be able to build tons of pages using dup. content and still not get penalized for it; providing the text of your template regions makes up over 50-60% of your total page.

    Though, the problem with doing that is you end up skewing the keyword density of the content you're using and will go Batty trying to SEO the pages LOL
     
  10. freudianslip27

    freudianslip27 Junior Member

    Joined:
    Oct 6, 2007
    Messages:
    109
    Likes Received:
    14
    (duplicate post, sorry!)
     
    Last edited: May 25, 2008
  11. freudianslip27

    freudianslip27 Junior Member

    Joined:
    Oct 6, 2007
    Messages:
    109
    Likes Received:
    14
    I think that promotion and backlinks play a huge part in the rankings of these pages. Look at mashup sites, or even article directories. As long as your site's duplicate content comes from a mixture of places, then there is some uniqueness to the site (like having RSS feeds too).

    That being said, I personally wouldn't put a lot of effort in promoting a site that I knew was a duplicate. I'd much rather go the whole nine yards and have unique content that's optimized the way I want from the start.

    Matt
     
  12. pussyback

    pussyback Regular Member

    Joined:
    Apr 5, 2008
    Messages:
    402
    Likes Received:
    80
    yes you got me there!!
     
  13. engine102

    engine102 Registered Member

    Joined:
    Jan 13, 2008
    Messages:
    93
    Likes Received:
    9
    Wow, some great descriptions and information from you guys! Aftershock...thanks for the best description of dup content that I have read anywhere.

    One thing I guess I'm still not sure about is article directories. I have read in different places the theories that since it is an article directory, Google knows that the articles might be the same. The theory with this is that you can submit the same article, unchanged, to numerous directories because you are going after backlinks and traffic from the article sites.

    The other theory is that you will be penalized for duplicate content (not so much you, but the directory) and your article backlink won't count.

    I can't really tell anything from what I can see; however, one article I wrote and submitted to about 5 article directories is only showing up in the SERPs for EzineArticles.

    What's the opinion of the group on article directory submissions?
     
  14. mightybh

    mightybh Senior Member

    Joined:
    Feb 27, 2008
    Messages:
    1,029
    Likes Received:
    1,717
    Occupation:
    CEO
    Location:
    UK
    After doing a few experiments, I came to the conclusion that in order to get penalised for dup content you pretty much need to copy the whole page. So for example, if I were to post an article here, there would be no penalty, because this page contains a lot of other text and has a different layout in general compared to other sites containing the same article. So the reason someone might assume that the penalty applies only within a site is that you end up with two absolutely identical pages, due to the layout and structure of the website being identical.
     
  15. aftershock2020

    aftershock2020 Senior Member

    Joined:
    Oct 19, 2007
    Messages:
    981
    Likes Received:
    478
    Well, I have always told my clients that article marketing is actually very simple. The 'gurus' tend to over-complicate it so that you don't become competition for them and continue to pay for their product of the month...

    How they work is like this...

    You generate the article and then submit it to all of the directories that you want and get posted, hence giving you a one-way, non-reciprocated backlink. You do this in 20 directories; that gives you authority links, which give you rank... due to the fact that ALL directories for articles, blogs and community pages (hub pages, Squidoo, etc.) are registered authority sites in Google, which is why they are so powerful and effective in the SEO arena.

    The key is that you have to 'submit' only original/unique content to each directory on a regular basis and NEVER duplicate posts of the same article, as they will retract all of your postings and blacklist you for spamming them as a violation of their ToS (terms of use).

    When posting, be sure to archive the article on its own page, with an identifying link within your site...

    example: www.yoursitehere.com/articles/uniquearticletitleshere.

    This is something that neither Google nor the article directories will tell you, because they want you to do the work strictly through them; but the article ends up placed in your website as content, as well as in the directories. Don't underestimate the effect this will have on your site rank and traffic pull, due to the not-commonly-known Google 'relation by relevant' cross-referencing process. This is what makes sites become authorities VERY quickly, which is how you master a niche for its traffic.

    Hope that answers your question.
     
  16. engine102

    engine102 Registered Member

    Joined:
    Jan 13, 2008
    Messages:
    93
    Likes Received:
    9
    Aftershock,

    So what you are saying is something like this:

    I create a post on my site for, let's say, "Quality Dog Care Products".

    I copy that post, rewrite it and submit it to 20 article directories and have the backlink point directly to that article. So all 20 directories have unique content on them that point to the original and unique article/post.

    Am I right?
     
  17. engine102

    engine102 Registered Member

    Joined:
    Jan 13, 2008
    Messages:
    93
    Likes Received:
    9
    Ok, thanks Xcptn. That is what I had been doing. I would write an article/post on my site, and then rewrite it more broadly and send that one out to the article directories with a link back to the main domain and a link to the actual article.
     
  18. deth_by_uv

    deth_by_uv Power Member

    Joined:
    Feb 26, 2008
    Messages:
    521
    Likes Received:
    416
    Engine102, reread aftershocks post...

    You have to submit ORIGINAL/UNIQUE content to the directories as aftershock stated...

    He's talking about the authority sites when submitting your article which gives you authority links...

    So Xcptn must mean that you can use the same article to submit to other non-authority directories for backlinks...

    However if you're submitting to authorities like ezinearticles, squidoo, hub, scribd etc. - then yes, you must use original (20 different variations) articles per site...
     
  19. engine102

    engine102 Registered Member

    Joined:
    Jan 13, 2008
    Messages:
    93
    Likes Received:
    9
    Ok, gotcha now.

    Submit unique stuff to the authority directories, and the rest you can just use a new variation.

    What are the "authority" article directories besides ezine?
     
  20. ipopbb

    ipopbb Power Member

    Joined:
    Feb 24, 2008
    Messages:
    626
    Likes Received:
    845
    Occupation:
    SEO & Innovative Programming
    Location:
    Seattle
    My empirical testing has shown that duplicate content is determined from the human-visible text near the top of the page and the page footprint.

    Google is doing something "like" taking five 13-word hashcodes from the visible text of each page. If you think about it, there are 250,000 words in the English language. The odds of 2 pages having the same 13-digit base-250,000 number near the top of each page are astronomical. In actuality, only 25,000 of those words are commonly used in webpages, but even a 13-digit number in base 25,000 is still astronomical odds. Pages with matching hashcodes are considered dupes. The highest-ranking dupe goes into the main index and the rest get tossed into the supplemental indexes. Domain name doesn't matter. You can force the original into the supplemental by building links and traffic to the dupe.
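    The scheme described resembles word-level shingling. Here's a toy sketch of the idea (my own illustration under those assumptions, not Google's actual algorithm): hash a few overlapping 13-word windows near the top of the visible text, then flag any two pages that share a hash.

```python
import hashlib

def shingle_hashes(text, k=13, n=5):
    """Hash up to the first n overlapping k-word windows of the visible text."""
    words = text.lower().split()
    windows = min(n, max(0, len(words) - k + 1))
    shingles = [" ".join(words[i:i + k]) for i in range(windows)]
    return {hashlib.md5(s.encode()).hexdigest() for s in shingles}

page_a = "how to train your cat to use a litter box in seven easy steps at home"
page_b = "how to train your cat to use a litter box in seven easy steps today folks"

# Any shared shingle hash flags the pages as candidate duplicates.
print(bool(shingle_hashes(page_a) & shingle_hashes(page_b)))  # → True
```

    Both pages open with the same 13-word run, so they collide on at least one hash and would be treated as dupes, even though their endings differ.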

    The footprint is used to penalize duplicated websites at different domains. To calculate your footprint perform the following on your HTML:

    1. Delete all HTML Comments
    2. Delete all Script blocks
    3. Delete all attributes in all HTML tags (i.e. word="value")
    4. Delete all text between tags
    5. Delete all whitespace
    6. Calculate an MD5 message digest for the modified html for fast comparisons to other pages later.

    If your footprint is exactly equal to another site's footprint then the higher ranking site goes into the main index and the dupes go into the supplemental...
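    Those six steps could be sketched like this. It's a rough approximation only; regex-based HTML munging is brittle, and the page samples are made up:

```python
import hashlib
import re

def footprint(html):
    """MD5 of the page's tag skeleton, following the six steps above."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.S)                    # 1. comments
    html = re.sub(r"<script\b.*?</script>", "", html, flags=re.S | re.I)  # 2. script blocks
    html = re.sub(r"<(/?)(\w+)[^>]*>", r"<\1\2>", html)                   # 3. tag attributes
    html = re.sub(r">[^<]+<", "><", html)                                 # 4. text between tags
    html = re.sub(r"\s+", "", html)                                       # 5. whitespace
    return hashlib.md5(html.encode()).hexdigest()                         # 6. digest

a = '<html><body class="cats"><h1>Cats</h1><p>Litter box tips</p></body></html>'
b = '<html><body class="dogs"><h1>Dogs</h1><p>Obedience tips</p></body></html>'
print(footprint(a) == footprint(b))  # → True: same skeleton, same footprint
```

    Note that the two sample pages have completely different text and attributes, yet reduce to the identical tag skeleton, so by this measure they'd count as duplicated sites.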

    Black hats don't care about this, because it is easy to get 100k visitors a day out of the supplemental indexes if you have 250K pages indexed for a bazillion different sets of keywords. In fact, being on the first page of results is one of the worst things a black hatter can do, because it will get too much attention from people who are invested in doing something about it.

    Be on the 4th page of results for 100K pages.... make money not fame. It is easier and faster.
     