How do you make the index page different for redirected domains and for manual type-ins?

Discussion in 'Black Hat SEO' started by Craig8, Aug 4, 2013.

  1. Craig8

    Craig8 Newbie

    Joined:
    Jul 22, 2013
    Messages:
    32
    Likes Received:
    1
    How do you make it so that the index page on a website (A) will be different for redirected domains (B, C, D...) than if someone typed in the domain manually?

    Say I have a website (A). Can I make the domain load differently depending on whether the person arrived via a specific redirected domain or typed the domain in manually?
     
  2. dinkish

    dinkish Power Member

    Joined:
    Apr 19, 2013
    Messages:
    689
    Likes Received:
    159
    Not really sure I understand what you're saying, to be honest, but you could redirect via the .htaccess file based on referrers.

    My logic is that if someone typed the address in directly, the request wouldn't have an HTTP referrer, so you could load the page accordingly, whereas if they were referred by specific domains (or any other referrers) it would redirect them wherever you like.

    If you were going to do that, you'd want to ensure that Googlebot was indexing the most appropriate domain, though.
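
    The referrer logic described above can be sketched in .htaccess with mod_rewrite. The domain names and filenames here are placeholders, not anything from the thread; this assumes mod_rewrite is enabled and .htaccess overrides are allowed:

    ```apache
    # Sketch: referrer-based switching (hypothetical domains/filenames).
    # Direct type-ins send no Referer header, so they fall through to the
    # normal index; visitors referred from example-b.com see promo.html.
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} ^https?://(www\.)?example-b\.com [NC]
    RewriteRule ^(index\.html)?$ /promo.html [L]
    ```

    Note that a server-side 301/302 redirect from another domain does not necessarily set the Referer header the way a clicked link does, so this only reliably distinguishes link click-throughs from type-ins.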
     
  3. Craig8

    Craig8 Newbie

    Joined:
    Jul 22, 2013
    Messages:
    32
    Likes Received:
    1

    I want Google to index a certain index page on the domain (A), but I want some of my other domains that redirect to the domain (A) to show a different index page that Google won't see.
     
  4. TheeAriGrande

    TheeAriGrande Regular Member

    Joined:
    Jul 14, 2013
    Messages:
    270
    Likes Received:
    151
    Location:
    Candlestick Park
    I'm on my mobile right now so I can't really write everything out, but as soon as I get back I'll tell you how to do it if nobody has beaten me to it.
     
  5. Craig8

    Craig8 Newbie

    Joined:
    Jul 22, 2013
    Messages:
    32
    Likes Received:
    1
    Great. Thank you TheeAriGrande.
     
  6. dinkish

    dinkish Power Member

    Joined:
    Apr 19, 2013
    Messages:
    689
    Likes Received:
    159
    Probably asking for trouble if you do that, quite frankly. I believe that's referred to as cloaking.

    Google "http_user_agent googlebot". I've read that after the first crawl the user agent may not be listed as googlebot. Don't know if that's just BS or not though.

    Check out the cloaking area to see if you find what you're specifically looking for; otherwise query the above and come back with more specifics, and I'm sure someone can give you a more educated response.
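
    The "http_user_agent googlebot" idea mentioned above would look something like this in .htaccess (placeholder filenames; a sketch only). As noted, user-agent strings are trivially spoofed, so this is exactly the detection Google can defeat by crawling with a different agent:

    ```apache
    # Sketch: serve a separate index only to clients claiming to be Googlebot.
    # Unreliable on its own -- the User-Agent header can say anything.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
    RewriteRule ^(index\.html)?$ /index-for-google.html [L]
    ```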
     
  7. bartosimpsonio

    bartosimpsonio Jr. VIP Jr. VIP Premium Member

    Joined:
    Mar 21, 2013
    Messages:
    9,559
    Likes Received:
    8,158
    Occupation:
    ZeMiner.com
    Location:
    ZeMiner.com
    Home Page:
    That's a gray area really, because several legitimate websites do that. Some websites present a version of the page with the search keywords highlighted when you click through from a search result. Others show no ads on the landing page when you come in from referrer X... So there are plenty of legitimate uses for cloaking and selective content delivery.

    Having said that, all you gotta do in that case is detect the referrer or the current host and present selective content based on the data gathered. If you can't do that, I suggest you have a look at the Hire a Freelancer section here, because surely someone is up for the job.
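
    The "detect the current host" route mentioned above applies if the other domains are aliased (parked) onto the same document root rather than HTTP-redirected: the Host header then says which domain the visitor requested. A sketch with placeholder domains:

    ```apache
    # Sketch: host-based switching (hypothetical domains/filenames).
    # Requests for domain-b.com get an alternate index; requests for the
    # main domain fall through to the normal index.html.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?domain-b\.com$ [NC]
    RewriteRule ^(index\.html)?$ /index-b.html [L]
    ```

    This only works for aliased domains; once a domain 301s to domain A, the browser re-requests with domain A's Host header and the distinction is gone.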
     
  8. dinkish

    dinkish Power Member

    Joined:
    Apr 19, 2013
    Messages:
    689
    Likes Received:
    159
    That's where my head is at. My only concern would be if Google crawled from the referring page, and didn't identify itself truthfully.
     
  9. stratocentric

    stratocentric Junior Member

    Joined:
    Mar 12, 2012
    Messages:
    122
    Likes Received:
    35
    I have heard that G will send two different bots and compare the results... no verification, though.
     
  10. dinkish

    dinkish Power Member

    Joined:
    Apr 19, 2013
    Messages:
    689
    Likes Received:
    159
    That's what I was saying. You're referring to the user agent. When it crawls sites it identifies itself as Googlebot. Who's to say it identifies itself truthfully when it comes back to verify, before or after?

    I only knew about user agents because it was very easy to change them in Firefox to gain access to indexed membership areas years back. If I could do that so long ago, without any knowledge of how or why it worked, I'm pretty sure Google would be sneaky enough to mask its own user agent automatically and catch people trying to manipulate its methods.
     
  11. Craig8

    Craig8 Newbie

    Joined:
    Jul 22, 2013
    Messages:
    32
    Likes Received:
    1
    I'm willing to try this out. I'll see what happens with Google and report back.

    So, how exactly do you do this (code, where to put the code, etc.)?
     
  12. dinkish

    dinkish Power Member

    Joined:
    Apr 19, 2013
    Messages:
    689
    Likes Received:
    159
    You'll want to create a file and put it in the root directory of the domain that you're concerned Google will crawl. This is a plaintext file called ".htaccess". You can even use Notepad to save it as this.

    TheeAriGrande said he'd contact you. If he does, and it's not just a simple "block the Googlebot user agent," it would be worth verifying.

    I'm not going to type everything out for you because I'm not certain exactly how to prevent Google from pulling the same shit you're trying to do, because quite frankly I don't trust them. I'm shady, but they're one step ahead of me and shadier.

    Google the following:
    .htaccess
    http_referer
    http_user_agent Googlebot

    By querying those three, you should get a good understanding of what you are wanting to accomplish.
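
    Putting those three pieces together, a combined sketch might look like this (placeholder domains and filenames throughout, and with the caveat already raised in this thread that Google may crawl without identifying itself as Googlebot):

    ```apache
    # Sketch combining .htaccess, HTTP_REFERER, and HTTP_USER_AGENT.
    RewriteEngine On
    # Never rewrite for anything claiming to be Googlebot.
    RewriteCond %{HTTP_USER_AGENT} !googlebot [NC]
    # Only rewrite when the visitor was referred from one of the other domains.
    RewriteCond %{HTTP_REFERER} ^https?://(www\.)?(domain-b|domain-c)\.com [NC]
    RewriteRule ^(index\.html)?$ /alt-index.html [L]
    ```

    Googlebot and direct type-ins (no referrer) fall through to the normal index.html, which is the page that gets indexed; referred visitors see the alternate page.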

    You may be able to simply Allow and Disallow based on a wildcard, differentiating between / and an index file, and only allowing indexing of specific directories from there. I really don't know what the best course of action is. It's not a new idea. I like it a lot, but it's not new. This was conceived the second the '90s search engines gave up on meta keywords and people thought about keyword stuffing on top of that.

    If my money was on the line, I'd really take the advice given, then research this shit to get my own educated opinion, and then weigh whether it was worth it or not.
     
    Last edited: Aug 4, 2013