Hello guys! I'm using Amazon S3 static hosting to diversify my PBN. It serves static HTML websites. It's dirt cheap to host a site, easy to set up and run, and you can get a different IP and data center for each additional website. Now my only problem is blocking all those nasty crawlers... In WordPress I usually use the .htaccess file, but S3 is a different story since it doesn't support .htaccess at all. Does anyone have a good robots.txt file that works right off the bat? And please tell me where to put it in the bucket. God bless you!
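To give an idea of what I'm after, here's a rough sketch of the kind of robots.txt I mean (the bot names below are just common examples, not a complete list, and keep in mind robots.txt is purely advisory, so only well-behaved crawlers honor it):

```text
# Block common SEO/backlink crawlers (example list only)
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

# Allow everything else
User-agent: *
Disallow:
```

As far as I know, it has to sit at the top level (root) of the bucket, at the same level as index.html, so it gets served at yourdomain.com/robots.txt — crawlers only ever look for it at that exact path.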