Hiding Duplicate Content on my site from google

Discussion in 'Black Hat SEO' started by fatjack'sblackhat, Aug 22, 2013.

  1. fatjack'sblackhat

    fatjack'sblackhat Power Member

    Jul 16, 2008
    Ok, I have a review blog and it's getting decent traffic. Nothing crazy, but the traffic is all relevant to my niche. I have a friend with a landing page that converts at about 5%, and he said I could use it. I don't want to be penalized by Google for duplicate content...

    Is there a way I can hide the landing page from Google? Better yet, is there a way I can redirect my traffic straight to the affiliate offer without the redirect affecting my rankings? That would be ideal. Has anyone tried this? I'd probably get more conversions if I could redirect every visitor to my offer but not Googlebot when it crawls my page. Don't just say "robots.txt bro" because that's not very helpful. I know it has something to do with robots, but everything I find is vague or outdated. Thanks
  2. Endire

    Endire Elite Member Premium Member

    Mar 27, 2012

    Do you need the landing page to be indexed in search? For example, will it be attached to a PPC ad or some other method that funnels traffic to it? If so, I wouldn't worry about having a duplicate page.

    Even if the idea is to rank it, you could probably get away with just having the same page in search. Is there a lot of copy on it?

    I wouldn't try hiding it from Google, as that will backfire on you. If it isn't supposed to be found in search anyway, go ahead and disallow the page in your robots.txt file.
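    For example, blocking a single page from crawling looks like this in robots.txt (the path here is hypothetical, swap in your actual landing page URL):

    ```
    User-agent: *
    Disallow: /landing-page.html
    ```

    Note that robots.txt only blocks crawling, not indexing. If other sites link to the page, the URL can still show up in search results.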

    Hope that gives you some guidance.

  3. dinkish

    dinkish Power Member

    Apr 19, 2013
    Sounds like he wants it crawled to get search traffic. I wouldn't redirect googlebot either, as Google very likely doesn't identify itself "correctly" every time it visits your site.

    If you want to throw caution to the wind, you can do it with .htaccess after the page has already been crawled, assuming the content actually got indexed despite being duplicate or not unique enough in the first place. Look into .htaccess and the HTTP_USER_AGENT variable for googlebot. You'll find the rewrite conditions you need that way.
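    A minimal sketch of what that .htaccess user-agent match looks like, assuming mod_rewrite is enabled and using hypothetical file names (/go/offer for the visitor-facing redirect, /review.html for what the crawler sees). To be clear, this is cloaking, and it carries exactly the risk described above:

    ```apache
    RewriteEngine On
    # Match Googlebot by user-agent string, case-insensitive
    RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
    # Serve the crawler the review page instead of the affiliate redirect
    RewriteRule ^go/offer$ /review.html [L]
    ```

    Keep in mind this only catches requests that identify themselves as Googlebot in the user-agent header; Google can and does crawl without doing that.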

    If you don't want Google to index this, and you're just getting backlinks from other sites or spamming the link, use a robots.txt file to exclude it. Also add the noindex/nofollow robots meta tag in case a crawler reaches it through an external link anyway. It doesn't sound like you're going this route, though.
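    The meta tag in question goes in the page's head section:

    ```html
    <head>
      <!-- Tells compliant crawlers not to index this page or follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>
    ```

    One caveat: if the page is blocked in robots.txt, Google never crawls it and never sees this tag, so for a guaranteed noindex you'd use the meta tag without the robots.txt block.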