
Possible Dupe Content Loophole for Wordpress

Discussion in 'Blogging' started by sn0rt, Jul 4, 2013.

  1. sn0rt

    sn0rt Elite Member

    Joined:
    Jun 12, 2012
    Messages:
    1,705
    Likes Received:
    3,502
    Occupation:
    "Most obstacles melt away when we make up our mind
    Location:
    "Knowing is not enough; we must apply. Willing is
    Anyone in the adult niche knows it may be a little difficult to get good (written) content for their sites.


    Here's the theory I'm working on and I'd like to hear some of your guys' opinions.

    I'm thinking about copying a few stories and adding them to my WP site. As we all know, this would automatically raise a dupe content flag and therefore a possible penalty.

    What's the loophole?


    Well, WP has an option that you can either check or uncheck which basically allows or disallows your content to be indexed.


    In theory, if I paste the content and don't allow it to be indexed, it shouldn't trigger anything.
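    [Editor's note] For context, the setting in question (Settings → Reading → "Discourage search engines from indexing this site") works by adding a robots meta tag to every page; roughly like this (the exact content attribute varies by WP version, so treat this as a sketch):

    ```html
    <head>
      <!-- Emitted by WordPress when "Discourage search engines" is checked -->
      <meta name="robots" content="noindex, nofollow">
    </head>
    ```

    Note this is a request, not a guarantee: search engines honor it at crawl time, so the content may still be fetched.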


    Any thoughts?
     
  2. Repulsor

    Repulsor Power Member

    Joined:
    Jun 11, 2013
    Messages:
    712
    Likes Received:
    267
    Location:
    PHP Scripting ;)
    Well, this doesn't sound like a loophole to me, but more like a feature of WordPress.
     
  3. sn0rt

    sn0rt Elite Member

    I doubt the creators of WP added this feature so that everyone could avoid being penalized for dupe content.
     
  4. scorpion king

    scorpion king Senior Member

    Joined:
    May 2, 2010
    Messages:
    1,157
    Likes Received:
    2,393
    Occupation:
    Entrepreneur
    Location:
    irc.blackhatworld.com
    This trick will work for sure, but you can't rank your site using it. There is a way to overcome this: allow your WP site to be indexed by search engines, post the copied content on your 'blog', and block the blog URL in robots.txt. You can use this in your robots.txt:
    "Disallow: /blog"
    If you want to use both copied articles and original articles, put your copied content in one category and your original content in another, then block the copied-content category using robots.txt. :D
    This way you can still rank in search engines while hosting copied content.
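    [Editor's note] A minimal robots.txt along these lines might look as follows (the category slug "copied" and WordPress's default "/category/" permalink base are assumptions for illustration):

    ```
    User-agent: *
    Disallow: /blog
    Disallow: /category/copied/
    ```

    The `User-agent: *` line is required; `Disallow` rules outside a user-agent group are ignored by crawlers.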
     
    • Thanks Thanks x 1
  5. sn0rt

    sn0rt Elite Member

    Thank you for the input scorpion.

    The intention is not for SEO purposes. The intention behind the content is simply to add more content to the site.

    The homepage is already ranked high in the SERPs; I just want to add a few pages to the website and fill them with content. Those pages don't need to be ranked/indexed because the homepage already is.

    If you can, would you be able to break down that robots.txt method a little further?
     
  6. sn0rt

    sn0rt Elite Member

    I actually noticed that this can only be done for the entire blog and not for select pages (from what I'm seeing).

    I guess we'd be able to create a new blog and link our sites to the non-indexed dupe content blog.. but then users would bounce to the other page..
     
    Last edited: Jul 6, 2013
  7. scorpion king

    scorpion king Senior Member

    You can very well block a specific page or folder using robots.txt:

    Disallow: /page1
    Disallow: /hello-world.php
    Disallow: /foldername

    The above are examples of how to block specific pages. If you put all your content in a specific folder, you can block that folder so that the content located in it won't get indexed/crawled by search engines.

    You can have a look here for more information:
    http://tools.seobook.com/robots-txt/

    Also use this tool to check that you have written your robots.txt correctly:
    http://tools.seobook.com/robots-txt/analyzer/
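    [Editor's note] As a programmatic alternative to the analyzer above, Python's standard-library `urllib.robotparser` can check which paths a given robots.txt blocks. The rules below just mirror the hypothetical examples in this thread:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt rules mirroring the examples in this thread.
    rules = """User-agent: *
    Disallow: /blog
    Disallow: /hello-world.php
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # can_fetch(user_agent, url) reports whether a crawler may fetch a path.
    print(parser.can_fetch("*", "/blog/copied-story"))  # blocked
    print(parser.can_fetch("*", "/hello-world.php"))    # blocked
    print(parser.can_fetch("*", "/about"))              # allowed
    ```

    Note that `Disallow: /blog` is a prefix match, so it blocks `/blog` itself and everything underneath it.
    
    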
     
    • Thanks Thanks x 1
  8. sn0rt

    sn0rt Elite Member

    Outstanding SK. I appreciate all the info, the resources, and how you broke it down. I can't rep you anymore because I already repped you yesterday for your reply to this thread.
     
    • Thanks Thanks x 1