
how do we hide our pages from search engines? cache?

Discussion in 'Black Hat SEO' started by blackhat50, Apr 20, 2009.

  1. blackhat50

    blackhat50 Regular Member

    Joined:
    Oct 22, 2008
    Messages:
    336
    Likes Received:
    103
    Hello, how do I make it so the search engines don't see our pages and no cache of the page is saved for our blackhat pages?


    thanks in advance
     
  2. lase

    lase Registered Member

    Joined:
    Jan 21, 2009
    Messages:
    51
    Likes Received:
    3
    Location:
    United States of America
    Home Page:
  3. JonesersRX7

    JonesersRX7 Regular Member

    Joined:
    Mar 24, 2009
    Messages:
    201
    Likes Received:
    154
    But... can't people just read your robots.txt file? :confused:
     
  4. davejug1

    davejug1 Guest

    Indeed they can, robots.txt sucks. Place the following code in the header of every page you don't want indexed:

    Code:
    <meta name="robots" content="noindex, nofollow, noarchive">
     
    • Thanks Thanks x 1
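    That tag is easy to verify programmatically. A minimal sketch in Python (the `has_noindex` helper and the sample pages are my own illustration, not anything from this thread):

    ```python
    import re

    # Hypothetical helper: check whether an HTML page carries a robots
    # meta tag that includes the "noindex" directive.
    def has_noindex(html: str) -> bool:
        pattern = re.compile(
            r'<meta\s[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
            re.IGNORECASE,
        )
        match = pattern.search(html)
        return bool(match) and "noindex" in match.group(1).lower()

    hidden = '<head><meta name="robots" content="noindex, nofollow, noarchive"></head>'
    normal = '<head><title>Home</title></head>'
    print(has_noindex(hidden))  # True
    print(has_noindex(normal))  # False
    ```

    Unlike robots.txt, the directive travels inside each page, so there is no single public file listing everything you want hidden.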
  5. blackhat50

    blackhat50 Regular Member

    Joined:
    Oct 22, 2008
    Messages:
    336
    Likes Received:
    103
    Thank you davejug.
     
  6. biznets

    biznets Junior Member

    Joined:
    Jan 24, 2009
    Messages:
    116
    Likes Received:
    20
    Why would you want to? :pcguru:
     
  7. dadrian

    dadrian Newbie

    Joined:
    Mar 12, 2009
    Messages:
    3
    Likes Received:
    0
    View my site for a perfectly readable and understandable example.
    The spiders cannot see anything in the noindex folders.

    Here it is:

    natural-products-review.com/robots.txt

    The * specifies that the instructions apply to all spiders, including Googlebot.
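    Rules in that shape can be sanity-checked with Python's standard `urllib.robotparser`; the domain and paths below are made up for illustration:

    ```python
    from urllib.robotparser import RobotFileParser

    # Rules in the same shape as the robots.txt above: * applies to all spiders.
    rules = """\
    User-agent: *
    Disallow: /noindex/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Googlebot is covered by the * rule, so the disallowed folder is blocked...
    print(rp.can_fetch("Googlebot", "http://example.com/noindex/page.html"))  # False
    # ...while everything else stays crawlable.
    print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
    ```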
     
  8. davejug1

    davejug1 Guest

    Spiders can't, but humans can. The robots.txt file is a widely known flaw: anyone can read it and see exactly which paths you are trying to hide. Even malicious spiders exploit it, crawling the disallowed paths on purpose.
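    One way around that flaw (my own sketch, not something posted in this thread) is the X-Robots-Tag HTTP response header, which Google honors like the meta tag but which never publishes a list of hidden paths. The helper and the prefix list are hypothetical:

    ```python
    # Hypothetical sketch: instead of listing secret paths in a public
    # robots.txt, attach the same directives as a response header.
    NOINDEX_DIRECTIVES = "noindex, nofollow, noarchive"

    def robots_headers(path, hidden_prefixes=("/private/",)):
        """Extra response headers for a request path (prefixes are made up)."""
        if any(path.startswith(p) for p in hidden_prefixes):
            # Equivalent to the meta tag, and works for non-HTML files too.
            return {"X-Robots-Tag": NOINDEX_DIRECTIVES}
        return {}

    print(robots_headers("/private/offer.html"))  # {'X-Robots-Tag': 'noindex, nofollow, noarchive'}
    print(robots_headers("/index.html"))          # {}
    ```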