
Cloaking idea I had

Discussion in 'Cloaking and Content Generators' started by Cloakster, Aug 4, 2007.

  1. Cloakster

    Cloakster Newbie

    Joined:
    Aug 4, 2007
    Messages:
    10
    Likes Received:
    0
    I have my own cloaker that I'm constantly adding to and revising, and I was thinking about adding a feature where you can choose which search engines to cloak for and which ones to treat normally. Does this feature already exist in other cloakers? I know it didn't in SEC when I tried it a few years ago. I was thinking this would be handy if, for example, you wanted to cloak for all search engines except G*oogle, so you could rake in traffic from all the other SEs while still displaying Adsense ads. I don't know if G*oogle would ever find out, since you'd be indexed normally with them; that's just an example. People may want to selectively cloak for other reasons.
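
    The per-engine toggle being described could be sketched roughly like this. This is just an illustration of the idea, not code from any real cloaker; the engine names, user-agent substrings, and page labels are all made up for the example.

    ```python
    # Sketch of selective cloaking by user-agent: cloak for some engines,
    # treat others (and all humans) normally. Signatures are illustrative.

    CLOAK_FOR = {"yahoo", "msn"}      # engines whose bots get spider food
    # any engine NOT in CLOAK_FOR (e.g. "google") is served the normal page

    BOT_SIGNATURES = {                # crude user-agent substrings per engine
        "google": "Googlebot",
        "yahoo": "Slurp",
        "msn": "msnbot",
    }

    def page_for(user_agent: str) -> str:
        """Decide which page a visitor should see based on user-agent."""
        ua = user_agent.lower()
        for engine, sig in BOT_SIGNATURES.items():
            if sig.lower() in ua:
                if engine in CLOAK_FOR:
                    return "spider_page"   # this engine gets cloaked content
                return "human_page"        # engine excluded from cloaking
        return "human_page"                # ordinary human visitor
    ```

    So a Yahoo or MSN spider gets the spider-food page, while Googlebot and regular visitors both get the normal page, which is the "treat G like any other visitor" behavior described above. A real cloaker would verify the visitor by IP list rather than trusting the user-agent alone, since user-agents are trivially spoofed.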
     
  2. meegwell

    meegwell Registered Member

    Joined:
    May 11, 2007
    Messages:
    57
    Likes Received:
    2
    With SEC you define the IP list, so you can already exclude known G (or whatever) bots with a custom IP list. I don't think you would want to redirect an SE bot (i.e. treat it like a human visitor); is that what you mean by "cloak for all SEs except G"? By default, cloaking means letting the SE bots "in" to the cloaked site with spider-targeted content and sending humans to visually appealing sales sites. So if you choose not to treat one SE like the others, you would be sending that SE's bot to the sales/money site.
     
  3. Cloakster

    Cloakster Newbie

    Joined:
    Aug 4, 2007
    Messages:
    10
    Likes Received:
    0
    Sorry if my explanation was vague. In the example I gave, I would treat G as I would any other visitor while cloaking for all the other SEs. I thought it might be a nice feature for those people who are afraid of getting the infamous G*oogle ban. Of course it could work the same for any major SE.
     
  4. meegwell

    meegwell Registered Member

    Joined:
    May 11, 2007
    Messages:
    57
    Likes Received:
    2
    " I would treat G as I would any other visitor while cloaking for all the other SEs"

    Every cloaking system does this, I think: just remove G's IPs or user agents from your filter list. The bigger question is why. Why would you redirect a G spider to an un-spider-friendly page in the first place, if the whole point of your cloaking is to make spider food for SE spiders?

    Good cloaking apps like Phantom and SEC stay on top of the latest SE bots and keep their lists updated. This is very important so you don't accidentally redirect an SE bot.
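
    The IP-list approach described in this thread boils down to checking the visitor's address against maintained spider ranges, skipping any engine you've excluded from cloaking. A minimal sketch, assuming made-up engine names and placeholder (documentation-only) IP ranges, not real bot addresses:

    ```python
    # Sketch of SEC-style IP-list filtering with a per-engine exclusion set.
    # The networks below are placeholder ranges, NOT real spider IPs; a real
    # cloaker would keep these lists current, as discussed above.
    import ipaddress

    SPIDER_RANGES = {
        "google": [ipaddress.ip_network("203.0.113.0/24")],   # placeholder
        "yahoo":  [ipaddress.ip_network("198.51.100.0/24")],  # placeholder
    }
    EXCLUDED = {"google"}  # engines treated like ordinary human visitors

    def is_cloaked_spider(ip: str) -> bool:
        """True if this IP belongs to a spider we are cloaking for."""
        addr = ipaddress.ip_address(ip)
        for engine, nets in SPIDER_RANGES.items():
            if engine in EXCLUDED:
                continue  # this engine's bot is served the normal page
            if any(addr in net for net in nets):
                return True
        return False
    ```

    With this setup a "Yahoo" spider IP triggers cloaking, while a "Google" spider IP and a regular visitor both fall through to the normal page, which is why keeping the range lists up to date matters: a stale list misclassifies a bot as a human.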