Robots.txt on Amazon buckets

Discussion in 'Black Hat SEO' started by Kokander101, Jul 26, 2016.

  1. Kokander101 (Junior Member, joined Aug 2, 2015, The holy land)

    Hello guys!

    I'm using Amazon S3 static hosting to diversify my PBN. It runs static HTML websites. It's dirt cheap to host a website, easy to set up and run, and you can get a different IP and data center for each additional website.

    Now my only problem is blocking all those nasty crawlers... In WP I usually use the .htaccess file, but here it's a different story. Does anyone have a good ready-to-go robots.txt file, and please tell me where to put it in the bucket?
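    Not the OP, but here's a minimal sketch of what people usually do for PBNs: block the common backlink-checker bots by user-agent while leaving everything else (including Googlebot) alone, then upload the file to the root of the bucket so it's served at /robots.txt. The bucket name below is a placeholder; note that robots.txt is only honored voluntarily by well-behaved bots, since S3 static hosting has no .htaccess-style enforcement.

    ```shell
    # Write a robots.txt that blocks common backlink crawlers.
    # The user-agent strings are the bots' published names; extend the list as needed.
    cat > robots.txt <<'EOF'
    User-agent: AhrefsBot
    Disallow: /

    User-agent: MJ12bot
    Disallow: /

    User-agent: SemrushBot
    Disallow: /

    User-agent: *
    Disallow:
    EOF

    # Upload to the bucket ROOT so it is served at http://<your-site>/robots.txt.
    # "my-pbn-site" is a placeholder bucket name; the aws CLI must be configured.
    # aws s3 cp robots.txt s3://my-pbn-site/robots.txt --acl public-read
    ```

    The empty `Disallow:` under `User-agent: *` means "allow everything" for bots not matched above, so search engines can still crawl the site.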

    God bless you!