Blocking Bots - Which method is best in htaccess?

Discussion in 'White Hat SEO' started by Werombi, May 6, 2017.

  1. Werombi

    Werombi Newbie

    Joined:
    May 26, 2015
    Messages:
    34
    Likes Received:
    14
    Occupation:
    MD of catering and recruiter
    Location:
    Australia
    Hi All,
    I have seen two methods of blocking bots through .htaccess floating around and cannot see the operational difference.
    Can someone please explain the difference?
    If you know which one is best practice, that would be great too...
    I'm using WordPress themes, if this makes any difference.


    First and currently using:
    # Return 403 Forbidden to any request whose User-Agent starts with "robot-name"
    RewriteEngine on
    RewriteCond %{HTTP_USER_AGENT} ^robot-name [NC]
    RewriteRule ^.* - [F,L]
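
    A note on the [OR] flag that often appears on that RewriteCond: it only makes sense when several conditions are chained, one per bot, and the last condition should carry [NC] alone. A sketch with hypothetical bot names (BadBotOne/BadBotTwo are placeholders, not real crawlers):

    ```apache
    RewriteEngine on
    # One condition per bot; [OR] chains them, the final one omits it
    RewriteCond %{HTTP_USER_AGENT} ^BadBotOne [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^BadBotTwo [NC]
    # Serve 403 Forbidden to any match; [F] implies [L]
    RewriteRule ^ - [F]
    ```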

    Recently I've seen:
    # One line per bot; replace "Enter User-Agent" with the bot's actual User-Agent string
    SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot
    SetEnvIfNoCase User-Agent "^Enter User-Agent" bad_bot

    <Limit GET POST HEAD>
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    </Limit>
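
    Worth noting: Order/Allow/Deny is Apache 2.2 syntax. On Apache 2.4, the same SetEnvIfNoCase approach pairs with the Require directive instead; a minimal sketch, assuming mod_authz_core and mod_setenvif are enabled and using a hypothetical bot name:

    ```apache
    # Flag the request if the User-Agent matches (BadBotOne is a placeholder)
    SetEnvIfNoCase User-Agent "^BadBotOne" bad_bot

    # Allow everyone except flagged requests (Apache 2.4 syntax)
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
    ```

    You can then check the rule from a shell with something like `curl -I -A "BadBotOne" https://your-site.example/` and look for a 403 in the response.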