
Does anyone have a robots.txt file to block competitors from crawling my site?

Discussion in 'Black Hat SEO' started by Holla1, Nov 18, 2014.

  1. Holla1

    Holla1 Regular Member

    Joined:
    Aug 26, 2014
    Messages:
    230
    Likes Received:
    4
Does anyone have one of these? I want to keep competitors from viewing my site's info.
     
  2. BallticklersFC

    BallticklersFC Newbie

    Joined:
    Sep 15, 2014
    Messages:
    35
    Likes Received:
    16
    Location:
    DownUnder
Can't be done via robots.txt, brother... that's only for the good cops.
You need a crash course on .htaccess. It's the only way.
You can implement a whitelist firewall or an IP blacklist through .htaccess.

Mind you, if you are using WordPress, some of the all-in-one security plugins have a block-by-IP list. If you knew who was crawling or scraping your site, you could get their IP or IP range, paste it in, and they would get 403'ed.
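A minimal sketch of the .htaccess blacklist approach described above, assuming an Apache server with Apache 2.2-style access directives enabled; the IP addresses below are placeholders you would swap for the scraper's actual IP or range:

```apache
# .htaccess IP blacklist sketch (Apache 2.2 syntax).
# Matching visitors receive a 403 Forbidden response.
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.7
```

On Apache 2.4 the equivalent uses the newer `Require` directives instead (e.g. `Require all granted` plus `Require not ip 203.0.113.0/24` inside a `<RequireAll>` block), so check which version your host runs.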
     
    Last edited: Nov 18, 2014
  3. ThopHayt

    ThopHayt Jr. VIP Jr. VIP Premium Member

    Joined:
    Jul 25, 2011
    Messages:
    5,996
    Likes Received:
    2,032
Those files are only SUGGESTIONS to robots. Nothing will prevent a robot from crawling your site if it chooses to disregard them. Google obeys, and so do other reputable engines... but spying sites have little reason to honestly identify themselves or to stay out. IMHO all this does is put up a red flag for possible blackhat shenanigans on your site. Unless you have a specific IP you want to block, it's a fool's errand.
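For reference, this is all a robots.txt rule can do: politely ask. Compliant crawlers honor it; scrapers simply ignore it. The bot name and path below are placeholders, not real crawler names:

```
# robots.txt is advisory only; there is no enforcement mechanism.
# "BadBot" is a hypothetical user-agent string.
User-agent: BadBot
Disallow: /

# All other crawlers: stay out of one directory.
User-agent: *
Disallow: /private/
```

Note that listing a path here also advertises its existence to anyone who reads the file, which is part of the "red flag" problem mentioned above.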
     
  4. marcin88

    marcin88 Newbie

    Joined:
    Nov 17, 2014
    Messages:
    18
    Likes Received:
    0
It's a difficult case. Your competitors will always find a way. I doubt it can be done in a way that actually works.