
Link Cloaking for BlogSense

Discussion in 'Cloaking and Content Generators' started by adbox, Nov 5, 2009.

  1. adbox

    adbox Power Member

    Joined:
    May 1, 2009
    Messages:
    658
    Likes Received:
    107
    Home Page:
    Hey guys,

    I'm in the process of developing link cloaking for BlogSense, and what I have in mind will work like this...

    The original link is replaced with a link like so:
    http://www.domain.com/dir/ref.php?ref=uniquecode

    The server-side script ref.php looks up the code and then redirects to the matching URL.

    Is this sufficient for a basic link cloaking feature? Do you have any other advice?
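    To make the scheme concrete, here is a minimal sketch of the lookup-and-redirect logic, written in Python for illustration (the code-to-URL table and function names are made up; the real ref.php would presumably read the table from a database and emit an HTTP 302 via PHP's header()):

```python
# Illustrative sketch of the ref.php idea: map a unique code to a
# destination URL, falling back to the homepage for unknown codes.
# In practice the table would live in a database or config file.
LINKS = {
    "a1b2c3": "http://www.affiliate-example.com/?aff=123",
    "d4e5f6": "http://www.other-example.com/landing",
}

def resolve(ref_code, fallback="http://www.domain.com/"):
    """Return the destination URL for a unique code, or the
    homepage if the code is unknown."""
    return LINKS.get(ref_code, fallback)

# A real script would then send a redirect, e.g. in PHP:
#   header("Location: " . $url);
```

    Falling back to the homepage for unknown codes also means bad or expired links never expose a raw affiliate URL.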

    adbox
     
  2. adbox

    adbox Power Member

    Will this process get my users in trouble with affiliate programs?
     
  3. Guaji

    Guaji Regular Member

    Joined:
    May 28, 2009
    Messages:
    441
    Likes Received:
    226
    I donĀ“t think it will get in trouble, but how old users will be able to get the announcement of the "Updated version"?

    I use other cloaking method and is fine, if you see one of the biggest site like retailmenot.com they cloak all their aff. links.
     
  4. adbox

    adbox Power Member

    BlogSense has auto-update capability, so if I push an update on my end, users get it.
     
  5. adbox

    adbox Power Member

    The feature was implemented, but as I'm learning a little more about cloaking I want to redirect Google and Yahoo search engine bots back to my homepage. To do this I need to detect the bots, and there are a couple of ways to do it:

    By checking whether the user agent string identifies a real browser, or by recognizing that the visitor's IP is a known spider IP.

    The user agent can be faked, and I believe Google would fake a browser to get the most information/normalcy possible. In fact, Google probably runs spiders faking many different types of browsers.

    Facebook detects browsers and redirects accordingly, and other sites do too. So this is just my guess.

    That being said, IP detection looks like the best method.

    Does anyone have a better answer?
    Where can I find an updated IP list, anyone know?
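    A rough sketch of combining both signals, in Python for illustration. The UA tokens and the IP range are assumptions for the example (66.249.64.0/19 is a commonly published Googlebot range, but a real deployment should load a maintained spider list, or better, verify by reverse DNS the way Google recommends):

```python
import ipaddress

# Assumed example data -- a real setup would load a maintained spider
# IP list rather than hard-coding ranges.
KNOWN_SPIDER_NETS = [
    ipaddress.ip_network("66.249.64.0/19"),  # commonly published Googlebot range
]

BOT_UA_TOKENS = ("googlebot", "slurp", "bingbot")

def looks_like_bot(user_agent, ip):
    """Return True if either the UA string or the IP suggests a spider.
    UA strings can be faked, so the IP check is the stronger signal."""
    ua = (user_agent or "").lower()
    if any(token in ua for token in BOT_UA_TOKENS):
        return True
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in KNOWN_SPIDER_NETS)
```

    The redirect script would call looks_like_bot() first and send spiders to the homepage instead of the affiliate URL.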