I have a question that may seem stupid to some, but I'll ask anyway.

Let's say you have a white hat site, whitehat.c0m, and you want to create some cloaked pages on another domain, cloaked.c0m. Now, if you just redirect the cloaked page to your white hat site, search engines would be able to tell that you are cloaking, right?

What if you instead redirect the cloaked page through another redirect domain, redirect1.c0m, and put a robots.txt on it?

Code:
User-agent: *
Disallow: /

Then redirect that domain to your white hat site. I know that robots.txt is just a suggestion, but don't most crawlers follow it?

For even more security you could even add a second redirect domain, redirect2.c0m:

cloaked.c0m -> redirect1.c0m -> whitehat.c0m

or

cloaked.c0m -> redirect1.c0m -> redirect2.c0m -> whitehat.c0m

Would that drive in the traffic? Or just get whitehat.c0m delisted from the SEs?
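For what it's worth, the assumption about compliant crawlers can be sketched with Python's standard-library robots.txt parser. This only shows what a rule-following crawler would do with the Disallow rule above; redirect1.c0m is just the placeholder domain from the question, and it says nothing about how a search engine actually handles redirect chains:

```python
from urllib.robotparser import RobotFileParser

# Sketch: feed the proposed robots.txt for redirect1.c0m into the
# stdlib parser and ask whether a compliant crawler may fetch pages.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# A crawler that honors robots.txt would refuse every path on the domain.
print(rp.can_fetch("Googlebot", "http://redirect1.c0m/some-page"))  # False
print(rp.can_fetch("*", "http://redirect1.c0m/"))                   # False
```

Note this models only polite crawlers; nothing stops a search engine from still recording where the redirect ultimately lands.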