To protect your eBooks and other paid files from being indexed by Google and other search engines, follow these steps:

1. Create a separate folder for the files you want to protect, inside the public_html directory on your web server. You can place eBooks, reports, videos and other downloadable content in that folder. Let us call this folder 'download'.

2. Next, protect the index page of the folder we created. Create a blank HTML file, save it as index.html and upload it to the 'download' folder on our web server. Now, if somebody browses to the 'download' folder directly, all they will see is a blank page instead of a file listing.

3. When uploading books, reports, etc. to that folder, keep the name descriptive but append some random characters to it. E.g., suppose the name of the book is 'Affiliate-Army'. When uploading it to the web server, add some random characters at the end, like 'Affiliate-Army-1xr98.pdf'. The random characters make it practically impossible for thieves to steal the file by guessing the eBook's name in the URL.

4. Similarly, rename the download page so it cannot be guessed. E.g., download-5g6h3.html or thankyou-b125pk.html

5. Add the following code to the source of the download page, right after the <HEAD> tag. The result should look like this:

<HEAD>
<META name="robots" CONTENT="noindex, nofollow"> (this line is to be added)
<TITLE>.......</TITLE>
</HEAD>

This meta tag tells the search engines to stay away from your protected pages.

6. Delete any old download pages from the web server. The only place the eBooks, reports, etc. and the download page should live is the 'download' folder we created earlier.

7. Now, the last part is to add a robots.txt file.
Create a new text file, name it robots.txt and type the following into it:

User-agent: *
Disallow: /download/ (this is the name of the folder we created)

User-agent: Googlebot
Disallow: /*.pdf$ (this is the extension of our eBooks or reports)

Save this file and upload it to the root of your web server, so that it is reachable at:

http://yourdomainname.com/robots.txt

This robots.txt file tells the search engines to stay away from your 'download' folder and your PDFs.

You are done now. Before going live with the new setup, test it by placing a test order to make sure everything runs smoothly.

The above setup prevents most thefts, but it is still not fool-proof. Theft can continue with people sharing your eBooks by uploading them to file-sharing sites or by giving out your download URL. But this is the least we can do to protect our hard work.

Other measures to prevent theft are:
-- Password-protecting download pages.
-- Password-protecting eBooks.
-- Zipping PDFs into zip files, as zip files are not indexed by search engines.

If you haven't followed the above steps, chances are your download page and eBooks will be indexed sooner or later. You can check whether your eBook or download page is already indexed by typing the following into Google:

site:yourdomainname.com

This setup might look very basic to some, but there are many others who need it and will benefit from it.

Edit: If you have some money, buy DLGuard or a similar script. It will make your life much easier and will secure your files much better.

Regards,
Blackhat_Boy
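If you upload a lot of files, the random-suffix renaming from steps 3 and 4 can be automated. Here is a minimal Python sketch; the function name `obscure_name` and the 5-character suffix length (matching the 'Affiliate-Army-1xr98.pdf' example above) are my own choices, not part of the original setup:

```python
import secrets
import string
from pathlib import Path

def obscure_name(filename: str, suffix_len: int = 5) -> str:
    """Append a random suffix before the extension so the URL cannot
    be guessed, e.g. 'Affiliate-Army.pdf' -> 'Affiliate-Army-1xr98.pdf'.

    Uses the secrets module so the suffix is cryptographically random,
    not merely pseudo-random.
    """
    stem, ext = Path(filename).stem, Path(filename).suffix
    alphabet = string.ascii_lowercase + string.digits
    suffix = "".join(secrets.choice(alphabet) for _ in range(suffix_len))
    return f"{stem}-{suffix}{ext}"

print(obscure_name("Affiliate-Army.pdf"))
```

Rename the file with this before uploading; the descriptive stem is kept so you can still recognize the file on the server.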
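The robots.txt from step 7 can also be generated rather than typed by hand, which avoids typos in the Disallow lines. A small sketch, assuming a helper I am calling `robots_txt` with the folder name and extension used throughout this guide as defaults:

```python
def robots_txt(protected_dir: str = "download", ext: str = "pdf") -> str:
    """Build the robots.txt described above: block all crawlers from the
    protected folder, and block Googlebot from the file extension."""
    return "\n".join([
        "User-agent: *",
        f"Disallow: /{protected_dir}/",
        "",
        "User-agent: Googlebot",
        f"Disallow: /*.{ext}$",
        "",
    ])

# Write the file locally, then upload it to the root of the web server.
with open("robots.txt", "w") as f:
    f.write(robots_txt())
```

Note that the `$` wildcard syntax is a Google extension, which is why that rule is placed under the Googlebot user-agent rather than `*`.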