[Article] Google Hacking: Part 1 of 4

Discussion in 'BlackHat Lounge' started by The Scarlet Pimp, May 1, 2008.

  1. The Scarlet Pimp

    The Scarlet Pimp Jr. VIP Premium Member

    Joined:
    Apr 2, 2008
    Messages:
    788
    Likes Received:
    3,120
    Occupation:
    Chair moistener.
    Location:
    Cyberspace
    Demystifying Google Hacks
    by Debasis Mohanty (Orissa, India)
    http://www.hackingspirits.com

    Introduction

    Google is the world's most popular and powerful search engine. It accepts pre-defined commands (advanced operators) as input and can produce remarkably precise results. This enables malicious users such as hackers, crackers and script kiddies to use Google extensively to gather confidential or sensitive information that is not visible through ordinary searches.

    In this paper I shall cover the following points, which administrators and security professionals must take into account to prevent such information disclosure:

    1. Google's Advanced Search Query Syntaxes.

    2. Querying for vulnerable sites or servers using Google's advanced syntaxes.

    3. Securing Servers or sites from Google's invasion.

    Google's Advanced Search Query Syntaxes.

    Discussed below are Google's various special commands. I shall explain each command briefly and show how it can be used to dig up critical information.

    Command: [ intitle: ]

    The "intitle:" syntax helps Google restrict the search results to pages containing that word in the title. For example, "intitle:login password" (without the quotes) will return links to those pages that has the word "login" in their title, and the word "password" anywhere in the page.

    Similarly, if one has to query for more than one word in the page title, then "allintitle:" can be used instead of "intitle:" to get the list of pages containing all those words in their titles. For example, "intitle:login intitle:password" is the same as querying "allintitle:login password".
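
    Since these operators are just part of the query text, they can be combined and URL-encoded like any other search. Below is a minimal, illustrative Python sketch (my own addition, not from the original paper), assuming Google's standard /search?q= endpoint; the operator values are only examples.

    import urllib.parse

    # Combine advanced operators into a single query string (example values only).
    query = 'allintitle:login password'

    # URL-encode it for use in the browser; spaces become '+' and ':' becomes '%3A'.
    search_url = "https://www.google.com/search?" + urllib.parse.urlencode({"q": query})
    print(search_url)  # -> https://www.google.com/search?q=allintitle%3Alogin+password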

    Command: [ inurl: ]

    The "inurl:" syntax restricts the search results to those URLs containing the search keyword. For example: "inurl:passwd" (without the quotes) will return only links to those pages that have "passwd" in the URL.

    Similarly, if one has to query for more than one word in a URL, then "allinurl:" can be used instead of "inurl:" to get the list of URLs containing all those search keywords.

    For example: "allinurl:etc/passwd" will look for the URLs containing "etc" and "passwd". The slash ("/") between the words will be ignored by Google.

    Command: [ site: ]

    The "site:" syntax restricts Google to query for certain keywords in a particular site or domain. For example: "exploits site:hackingspirits.com" (without the quotes) will look for the keyword "exploits" in those pages present in all the links of the domain "hackingspirits.com". There should not be any space between "site:" and the "domain name".

    Command: [ filetype: ]

    This "filetype:" syntax restricts Google search for files on internet with particular extensions (i.e. doc, pdf or ppt etc). For example: "filetype:doc site:gov confidential" (without the quotes) will look for files with ".doc" extension in all government domains with ".gov" extension and containing the word "confidential" either in the pages or in the ".doc" file. i.e. the result will contain the links to all confidential word document files on the government sites.

    Command: [ link: ]

    "link:" syntax will list down webpages that have links to the specified webpage. For Example: "link:www.securityfocus.com" will list webpages that have links pointing to the SecurityFocus homepage. There should not be any space between "link:" and the web page url.

    Command: [ related: ]

    The "related:" will list web pages that are "similar" to a specified web page. For Example: "related:www.securityfocus.com" will list web pages that are similar to the Securityfocus homepage. There should not be any space between "related:" and the web page url.

    Command: [ cache: ]

    The query "cache:" will show the version of the web page that Google has in its cache.

    For Example: "cache:www.hackingspirits.com" will show Google's cache of the Google homepage.
    There should not be any space between "cache:" and the web page url.

    If you include other words in the query, Google will highlight those words within the cached document. For example, "cache:www.hackingspirits.com guest" will show the cached content with the word "guest" highlighted.

    Command: [ intext: ]

    The "intext:" syntax searches for words in a particular website. It ignores links or URLs and page titles. For example: "intext:exploits" (without the quotes) will return only links to those web pages that has the search keyword "exploits" in its webpage.

    Command: [ phonebook: ]

    "phonebook" searches for U.S. street address and phone number information. For Example: "phonebook:Lisa+CA" will list down all names of person having "Lisa" in their names and located in "California (CA)". This can be used as a great tool for hackers in case someone wants to do dig personal information for social engineering.

    Querying for vulnerable sites or servers using Google's advanced syntaxes

    The Google query syntaxes discussed above can really help people make their searches precise and find exactly what they are looking for.

    Google being such an intelligent search engine, malicious users do not hesitate to exploit its ability to dig up confidential and secret information that should have restricted access. I shall now discuss in detail the techniques malicious users employ to find such information using Google as a tool.

    Using "Index of " syntax to find sites enabled with Index browsing

    A web server with index browsing enabled means anyone can browse its directories like ordinary local directories. Here I shall discuss how one can use the "index of" syntax to get a list of links to web servers that have directory browsing enabled.

    This becomes an easy source of information gathering for a hacker. Imagine if they get hold of password files or other sensitive files which are not normally visible on the internet. Given below are a few example queries with which one can gain access to a great deal of sensitive information quite easily (a small self-check sketch follows the query list).

    Index of /admin
    Index of /passwd
    Index of /password
    Index of /mail

    "Index of /" +passwd
    "Index of /" +password.txt
    "Index of /" +.htaccess

    "Index of /secret"
    "Index of /confidential"
    "Index of /root"
    "Index of /cgi-bin"
    "Index of /credit-card"
    "Index of /logs"
    "Index of /config"

    Looking for vulnerable sites or Servers using "inurl:" or "allinurl:"

    Using "allinurl:winnt/system32/" (without the quotes) will list down all the links to the server which gives access to restricted directories like "system32" through web. If you are lucky enough then you might get access to the cmd.exe in the "system32" directory.

    Once you have access to "cmd.exe" and are able to execute it, you can go ahead and escalate your privileges on the server and compromise it.

    Using "allinurl:wwwboard/passwd.txt" (without the quotes) in the Google search will list down all the links to the server which are vulnerable to "WWWBoard Password vulnerability".

    To know more about this vulnerability, see the following link:

    http://www.securiteam.com/exploits/2BUQ4S0SAW.html
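
    To see why an exposed password file of this kind is so dangerous, consider how cheaply a traditional Unix crypt() hash can be dictionary-attacked. The sketch below is my own illustration (not from the paper) and assumes the file stores entries in a "user:crypt-hash" format; it uses Python's POSIX-only crypt module, which has been removed from recent Python releases.

    import crypt  # POSIX-only standard-library module (removed in newer Python versions)

    # Build a demo entry in a "user:traditional-crypt-hash" style (salt "ab").
    entry = "WebAdmin:" + crypt.crypt("letmein", "ab")

    def matches_hash(candidate, crypt_hash):
        """True if the candidate reproduces the hash (DES crypt salt = first two chars)."""
        return crypt.crypt(candidate, crypt_hash[:2]) == crypt_hash

    user, stored_hash = entry.split(":", 1)
    for guess in ("password", "letmein", "admin"):   # tiny stand-in for a wordlist
        if matches_hash(guess, stored_hash):
            print(user, "uses the password:", guess)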

    Using "inurl:.bash_history" (without the quotes) will list down all the links to the Server which gives access to ".bash_history" file through the web. This is a command history file.

    This file contains the list of commands executed by the administrator, and sometimes includes sensitive information such as passwords typed in by the administrator. If this file is compromised and contains encrypted Unix (or *nix) passwords, they can easily be cracked using "John the Ripper".
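
    To gauge this kind of exposure on one's own machines, a shell history file can be scanned for lines that commonly leak secrets. A rough defensive sketch (my own addition; the patterns and the path are only examples):

    import re

    # Command patterns that frequently carry credentials on the command line (examples).
    SUSPICIOUS = [
        r"passwd",                          # password changes
        r"mysql\s.*-p\S+",                  # MySQL password supplied inline
        r"sshpass",                         # password handed to ssh non-interactively
        r"export\s+\w*(PASS|TOKEN|KEY)\w*=",
    ]

    def find_leaky_lines(history_path):
        """Return history lines matching any of the suspicious patterns."""
        hits = []
        with open(history_path, errors="replace") as fh:
            for line in fh:
                if any(re.search(p, line, re.IGNORECASE) for p in SUSPICIOUS):
                    hits.append(line.rstrip())
        return hits

    # Example usage against a copy of a history file (placeholder path).
    for line in find_leaky_lines("/home/someuser/.bash_history"):
        print(line)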

    Using "inurl:config.txt" (without the quotes) will list down all the links to the Servers which gives access to "config.txt" file through web. This file contains sensitive information, including the hash value of the administrative password and database authentication credentials.

    For example, Ingenium Learning Management System is a web-based application for Windows-based systems developed by Click2learn, Inc. Ingenium Learning Management System versions 5.1 and 6.1 store sensitive information insecurely in the config.txt file.

    For more information, refer to the following links:

    http://www.securiteam.com/securitynews/6M00H2K5PG.html

    Other similar search using "inurl:" or "allinurl:" combined with other syntaxs.

    inurl:admin filetype:txt
    inurl:admin filetype:db
    inurl:admin filetype:cfg
    inurl:mysql filetype:cfg
    inurl:passwd filetype:txt
    inurl:iisadmin
    inurl:auth_user_file.txt
    inurl:orders.txt
    inurl:"wwwroot/*."
    inurl:adpassword.txt
    inurl:webeditor.php
    inurl:file_upload.php
    inurl:gov filetype:xls "restricted"
    index of ftp +.mdb
    allinurl:/cgi-bin/ +mailto
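
    Rather than pointing these queries at arbitrary targets, an administrator can scope each one to their own domain with "site:" and review the results by hand. The sketch below (my own addition; the dork list and domain are placeholders) only prints the search URLs for manual review, since automated scraping of Google results tends to get an IP blocked quickly.

    import urllib.parse

    # A few dorks from the list above, restricted to a domain you are responsible for.
    DORKS = [
        'inurl:admin filetype:txt',
        'inurl:passwd filetype:txt',
        'inurl:auth_user_file.txt',
        'intitle:"index of" .bash_history',
    ]

    def audit_urls(domain):
        """Yield Google search URLs that scope each dork to the given domain."""
        for dork in DORKS:
            query = "site:" + domain + " " + dork
            yield "https://www.google.com/search?" + urllib.parse.urlencode({"q": query})

    # Example with a placeholder domain; open each URL in a browser and review the hits.
    for url in audit_urls("example.com"):
        print(url)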

    Looking for vulnerable sites or Servers using "intitle:" or "allintitle:"

    Using [allintitle: "index of /root"] (without the brackets) will list down the links to the web Server which gives access to restricted directories like "root" through web. This directory sometimes contains sensitive information which can be easily retrieved through simple web requests.

    Using [allintitle: "index of /admin"] (without the brackets) will list down the links to the websites which has got index browsing enabled for restricted directories like "admin" through web. Most of the web application sometimes uses names like "admin" to store admin credentials in it. This directory sometimes contains sensitive information which can be easily retrieved through simple web requests.

    Other similar search using "intitle:" or "allintitle:" combined with other syntaxes

    intitle:"Index of" .sh_history
    intitle:"Index of" .bash_history
    intitle:"index of" passwd
    intitle:"index of" people.lst
    intitle:"index of" pwd.db
    intitle:"index of" etc/shadow
    intitle:"index of" spwd
    intitle:"index of" master.passwd
    intitle:"index of" htpasswd
    intitle:"index of" members OR accounts
    intitle:"index of" user_carts OR user_cart

    allintitle:sensitive filetype:doc
    allintitle:restricted filetype:mail
    allintitle:restricted filetype:doc site:gov
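
    For these queries to return anything, the files themselves must be reachable over plain HTTP in the first place. Here is a quick self-check sketch (my own addition; the filename list and host are illustrative) that probes a handful of the filenames above on a server you administer and reports which ones respond with HTTP 200.

    import urllib.request

    # A few of the filenames targeted by the queries above (illustrative list).
    SENSITIVE_PATHS = [".bash_history", ".htpasswd", "passwd", "pwd.db", "master.passwd"]

    def exposed_paths(base_url):
        """Return the URLs under base_url that answer with HTTP 200."""
        found = []
        for path in SENSITIVE_PATHS:
            url = base_url.rstrip("/") + "/" + path
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    if resp.status == 200:
                        found.append(url)
            except Exception:
                pass  # 404s and connection errors raise; treat them as not exposed
        return found

    # Example against a placeholder host you own.
    print(exposed_paths("http://www.example.com"))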

    Other Interesting Search Queries

    To search for sites vulnerable to Cross-Site Scripting (XSS) attacks:

    allinurl:/scripts/cart32.exe
    allinurl:/CuteNews/show_archives.php
    allinurl:/phpinfo.php

    To search for sites vulnerable to SQL Injection attacks:

    allinurl:/privmsg.php


    Conclusion

    Sometimes an increase in the sophistication of systems creates new problems. Google, being so sophisticated, can be used by any Tom, Dick and Harry on the internet to find sensitive information that would normally be neither visible nor reachable.

    The only option left for security professionals and system administrators is to secure and harden their systems against such unauthorized invasion.

    About Me

    To know more about me visit:

    Debasis Mohanty
    www.hackingspirits.com

    I can also be found at:

    http://groups.yahoo.com/group/Ring-of-Fire/
    http://www.hackingspirits.com
    http://www.google.com/remove.html
    http://www.securiteam.com/securitynews/
    http://www.securiteam.com/exploits/2BUQ4S0SAW.html

    ----
    End of Part One.
     
    • Thanks x 3
    Last edited: May 1, 2008
  2. bill040506

    bill040506 Newbie

    Joined:
    Mar 26, 2008
    Messages:
    20
    Likes Received:
    51
    very interesting
    looking forward to part 2
    cheers
     
  3. blogbd1

    blogbd1 Jr. VIP Premium Member

    Joined:
    Apr 19, 2008
    Messages:
    551
    Likes Received:
    353
    Location:
    Undetected
    really good.

    thanks
     
  4. element94

    element94 Junior Member

    Joined:
    Mar 14, 2008
    Messages:
    144
    Likes Received:
    17
    part 2 ?......
     
  5. jockey

    jockey Newbie

    Joined:
    Sep 18, 2008
    Messages:
    2
    Likes Received:
    3
    Be careful with the inurl: string, big G will ban you fast for that.
     
  6. Gwendoleea

    Gwendoleea Junior Member

    Joined:
    Sep 18, 2008
    Messages:
    121
    Likes Received:
    183
    Occupation:
    Everything! but Mom first.
    Location:
    Inside the Matrix
    I'm a bit late in the game as far as replies go but thought i'd thank you anyways. I have been goin out of my mind trying to find a few daycare ebooks with no success! It's makin me nuts. Days now I have been looking.. Hopefully this post will lead me to them. (cross fingers!)

    Thanks again,

    G.
     
  7. Nemath

    Nemath Newbie

    Joined:
    Feb 18, 2011
    Messages:
    6
    Likes Received:
    0
    Cool, interesting... looking forward to part 2
     
  8. IamNRE

    IamNRE Jr. VIP Premium Member

    Joined:
    Aug 18, 2010
    Messages:
    4,663
    Likes Received:
    7,108
    Occupation:
    Generate Leads With FB Ads For Just $1
    Home Page:
    Nvm......


    3 year old bump......
     
    Last edited: Feb 20, 2011