
Useful Method to Restrict Access to Your Folders! Very Useful for Clickbank Products!!

Discussion in 'Clickbank' started by IamKing, Jan 2, 2010.

  1. IamKing

    IamKing Jr. VIP

    Jan 24, 2009

    We all know search engines can index the files sitting in our folders, so the usual move is to use robots.txt to tell bots to stay out of those folders.

    The catch is that robots.txt is public: anyone can open it, see exactly which folders you listed, and browse straight into them to grab whatever is inside.

    So it's a headache either way: if we don't list the folders in robots.txt, search engines index the files; if we do list them, people can find and access the important files in those folders.


    The fix: use a .htaccess file in the folder that needs protecting.

    1. Open Notepad and enter the text mentioned below.

    2. Save the file as htaccess.txt and upload it to the folder you want to protect.

    3. Rename the uploaded file to .htaccess
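    The snippet itself didn't survive in the quoted post; judging by a later reply in this thread that mentions "options - indexes", it is presumably the standard one-liner that turns off Apache directory listings:

    ```apache
    # Presumed content of the .htaccess file (disables directory listings
    # so visitors hitting the folder URL get a 403 instead of a file list)
    Options -Indexes
    ```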

    Now nobody can browse the folder directly, and if we also use robots.txt the files inside won't get indexed. The files themselves still work fine when you link to them with the full path.

    This is mainly useful for restricting access to your images folder and your thank-you pages folder for ClickBank products.

    Hope this information is useful to everyone.

    Thanks & Reps will be appreciated.

    Thank you
  2. BlackSeng

    BlackSeng Elite Member

    Mar 5, 2009
    That is just one of the ways.

    Note to everyone: You do NOT need your html file or folder to be named "Thankyou". You can name it "blalba652432" or whatever.

    Instead of using robots, simply add this code between the <head> and </head> of your thankyou page: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

    NOTE: If you do not use this meta code nor exclude it in the "robots.txt"... your thankyou page will GET INDEXED.
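    In case anyone's wondering exactly where that tag sits, here's a bare-bones skeleton (the title and comments are just placeholders):

    ```html
    <html>
    <head>
    <title>Your Download</title>
    <!-- stops search engines indexing this page or following its links -->
    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
    </head>
    <body>
    <!-- download links go here -->
    </body>
    </html>
    ```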

    Then for the robots.txt, just add this simple code (you need a User-agent line for the rules to apply, and matching is case-sensitive, so use the same case as your actual file names):
    User-agent: *
    Disallow: /*.pdf$
    Disallow: /*.zip$
    Disallow: /*.doc$
    The * wildcard and $ anchor tell Googlebot not to crawl any pdf, zip, or doc files on the server. (Wildcards are a Google/Bing extension, not part of the basic robots.txt standard.)

    You can even use a .htaccess rule to prevent hotlinking of files. That stops users on other websites from downloading your files via a direct link. I'm too lazy to post it — I did teach someone here before; the thread title is called "Clickbank Protection" or something.
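    For reference, a typical anti-hotlinking rule looks like the following. This is a generic sketch, not necessarily the exact code from that thread, and yourdomain.com is a placeholder for your own domain:

    ```apache
    # Block requests for pdf/zip/doc files whose referer is another site.
    RewriteEngine On
    # Allow empty referers (direct visits, some proxies strip the header)
    RewriteCond %{HTTP_REFERER} !^$
    # Allow requests coming from your own pages
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yourdomain\.com/ [NC]
    # Everything else asking for these file types gets a 403 Forbidden
    RewriteRule \.(pdf|zip|doc)$ - [F,L]
    ```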
    Last edited: Jan 2, 2010
  3. sebamann1

    sebamann1 Junior Member

    Oct 19, 2009
    Thanks guys.
    I knew about both methods, but this is still a very useful share (a reminder for ClickBank veterans and good news for newbs).
  4. hyperlite

    hyperlite Regular Member

    Nov 24, 2008
    I think the easiest way is to just put a blank index.html file in the folder, so nobody can browse it even if they know it's there.
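    On the server's command line that trick is just two commands ("downloads" here is a made-up folder name standing in for your product folder):

    ```shell
    # Create the product folder if it doesn't exist yet
    mkdir -p downloads
    # An empty index.html makes the web server serve a blank page
    # instead of listing the folder's contents
    touch downloads/index.html
    ls downloads
    ```

    Anyone hitting the folder URL now sees a blank page rather than your file list.
    
    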
  5. janets

    janets Newbie

    Oct 4, 2010
    Hi, I'm going to be a first-time vendor soon. I've already added the meta tag part.

    Do you recommend placing thankyou.php in a separate folder instead of at root level?

    I believe I need to place a .htaccess file in the thankyou folder with the Options -Indexes text as above.

    What about robots.txt — is it a separate file that I should place in the thank-you page folder?

    I added this code at the top of my thankyou page and named it with random characters instead of thankyou.php... Is this enough? (Of course replacing all the red info with my own details.)

    <?php // yourdeliverypage.php
    // Standard ClickBank thank-you page check: rebuild the cbpop hash and compare.
    function cbValid()
    {
        $key   = 'Your Secret Key';          // your ClickBank secret key
        $rcpt  = $_REQUEST['cbreceipt'];
        $time  = $_REQUEST['time'];
        $item  = $_REQUEST['item'];
        $cbpop = $_REQUEST['cbpop'];
        // cbpop = first 8 chars, uppercased, of sha1("key|receipt|time|item")
        $xxpop = strtoupper(substr(sha1("$key|$rcpt|$time|$item"), 0, 8));
        if ($cbpop == $xxpop) return 1;
        else return 0;
    }
    if (!cbValid()) {
        // not a valid ClickBank sale -- redirect
        header("Location: anyurlyouwant/");
        exit;
    }