Way to password protect pages from being crawled but still accessed by users?

Discussion in 'Black Hat SEO' started by jpblackhw, Aug 9, 2013.

  1. jpblackhw

    jpblackhw Newbie

    Joined:
    Jul 19, 2013
    Messages:
    14
    Likes Received:
    0
    I have read that the best way to make sure Google doesn't crawl a page is to put password protection on it. When you do this, is there a way to keep Google out but still let users access the page without having to sign in with a password?

    Also, I have read that there are scripts that go beyond a simple robots.txt file and actually enforce it, physically blocking a spider from crawling where it isn't permitted. Does anyone know how to do this?
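    On the second point, one common approach is to reject crawler requests at the server level, which blocks the spider whether or not it honors robots.txt. A minimal sketch, assuming Apache with mod_rewrite enabled (the bot names in the pattern are just examples, not a complete list):

    ```apache
    # .htaccess — return 403 Forbidden for known crawler user-agents.
    # Regular browsers pass through untouched; no password needed.
    # Caveat: user-agents can be spoofed, so this is not airtight.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
    RewriteRule .* - [F,L]
    ```

    This blocks bots that identify themselves honestly; a crawler that lies about its user-agent would still get through.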
     
  2. seeplusplus

    seeplusplus Power Member

    Joined:
    Aug 18, 2008
    Messages:
    511
    Likes Received:
    163
    I doubt you can force a bot to follow robots.txt directives, since robots.txt is only advisory. The simplest way is to use a .htaccess file. You can make one in cPanel, and it will require the user to enter a username and password.

    Not very efficient, though, if you have lots of users.
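    For reference, a minimal sketch of that setup, assuming Apache with basic auth available (the .htpasswd path is just an example; use a real path outside your web root):

    ```apache
    # .htaccess — password-protect this directory with HTTP Basic Auth.
    AuthType Basic
    AuthName "Members only"
    AuthUserFile /home/youraccount/.htpasswd
    Require valid-user
    ```

    The password file is created with the htpasswd utility, e.g. `htpasswd -c /home/youraccount/.htpasswd someuser`. Note this prompts every visitor, including search bots, for credentials.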
     
  3. jpblackhw

    jpblackhw Newbie

    Joined:
    Jul 19, 2013
    Messages:
    14
    Likes Received:
    0
    But if I did it that way with the .htaccess file, a user would need login info in order to see the page, right?