
How To Protect my Pages from Robots?

Discussion in 'Black Hat SEO' started by madeye32, Jan 8, 2013.

  1. madeye32 (BANNED)

    Hello guys. Here's the story: I just created a product that I want to sell on ClickSure. The site runs on WordPress. The problem is that I don't know how to protect my pages. I read about some programs that can discover and display hidden URLs (URLs not indexed in Google), tried one of them, and found my thank-you page very easily. Is there any way to protect my pages from that kind of software? A WP plugin, or some code in the .htaccess? Thank you.
     
  2. pYraTe (Junior Member)

    Put a robots.txt file in your site root (/).


    Here are some robots.txt samples.

    Following are a few simple examples of what you might put in your robots.txt file. For more examples, read the robots.txt specification (look for the "What to put into the robots.txt file" heading).
    Important: search engines only look for robots.txt at the top level of a domain or subdomain. So this will only help you if typing http://blog.example.com/ or http://example.com brings up WordPress. If you have to type http://example.com/blog/ to bring up WordPress (i.e. it is in a subdirectory, not in a subdomain or at the domain root), a robots.txt there will do you no good; search engines do not look for robots.txt files in subdirectories, only in domain roots and subdomains.

    Ban all robots

    User-agent: *
    Disallow: /


    Allow all robots
    To allow any robot to access your entire site, you can simply leave the robots.txt file blank, or you could use this:

    User-agent: *
    Disallow:


    Ban specific robots
    To ban specific robots, use the robot's name. Look at the list of robot names to find the correct name. For example, Google is Googlebot and Microsoft search is MSNBot. To ban only Google:

    User-agent: Googlebot
    Disallow: /


    Allow specific robots
    As in the previous example, use the robot's correct name. To allow only Google, use all four lines:

    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /


    Ban robots from part of your site
    To ban all robots from the page "Archives" and its subpages, located at http://yourblog.example.com/archives/, use:

    User-agent: *
    Disallow: /archives/
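
    Keep in mind that robots.txt is only a request that well-behaved crawlers honor. The downloader tools you mentioned ignore it, and it actually lists your hidden paths in a file anyone can read. Since you asked about .htaccess, here is a minimal sketch, assuming Apache with mod_headers and mod_rewrite enabled and your thank-you page sitting in its own directory (the user-agent list is illustrative, not complete; adjust everything to your setup). Put it in that directory's .htaccess:

    # Ask compliant crawlers not to index anything here, without
    # advertising the path in a public robots.txt (needs mod_headers)
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, nofollow"
    </IfModule>

    # Return 403 Forbidden to a few common site-downloader user agents
    # (example list only; a real one would be longer; needs mod_rewrite)
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (HTTrack|WebCopier|WebZIP|Wget) [NC]
        RewriteRule .* - [F,L]
    </IfModule>

    A determined scraper can fake its user agent, though, so the only solid protection is to put the page behind something a bot can't guess, e.g. Apache basic auth or a one-time token that your payment processor appends to the redirect URL.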


     
  3. garrido (Supreme Member)

    Use incapsula.com (a cloud WAF/CDN that filters bot traffic in front of your site).
     
  4. madeye32 (BANNED)

    Thanks for the answers... does anyone else have another solution?
     
  5. sgcashcow (Newbie)

    I am new to this. Great info. I shall check it out too.