I have seen some talk lately about web pages being in Google's "cache" as opposed to just being "indexed". Scrapebox can check whether a list of URLs is indexed (the "Check Indexed" button), and it works great. Is there a way to use SB (or any other method anyone knows of) to check whether a list of URLs (not just one) is in Google's cache?

As an add-on to this, I would like anyone's take on being "indexed" vs. being "cached". Please answer based on experience, testing, or facts, not on what you "think" or "feel". I don't really care what you "think" (if you know what I mean). I just want the facts!
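For what it's worth, here is a rough sketch of how one could check a list of URLs against the cache without Scrapebox, assuming Google's public `webcache.googleusercontent.com/search?q=cache:` endpoint is still serving cached copies (a 200 response suggests a cached copy exists, a 404 suggests it doesn't). Be warned: Google rate-limits and CAPTCHAs automated requests, so this is illustrative only, not a bulk-checking tool.

```python
# Sketch: look up each URL in Google's cache via the public webcache endpoint.
# Assumption: HTTP 200 = cached copy served, HTTP 404 = no cached copy.
# Google may block or CAPTCHA repeated automated requests.
import time
import urllib.error
import urllib.parse
import urllib.request

def cache_lookup_url(url: str) -> str:
    """Build the webcache.googleusercontent.com lookup URL for a page."""
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + urllib.parse.quote(url, safe=""))

def is_cached(url: str) -> bool:
    """Return True if Google serves a cached copy (HTTP 200)."""
    req = urllib.request.Request(
        cache_lookup_url(url),
        headers={"User-Agent": "Mozilla/5.0"},  # plain urllib UA gets blocked
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False  # 404 = not cached; 429/503 = blocked, treated the same here

if __name__ == "__main__":
    urls = ["https://example.com/", "https://example.org/some-page"]
    for u in urls:
        print(u, "cached" if is_cached(u) else "not cached")
        time.sleep(5)  # pause between lookups; rapid-fire requests get blocked
```

The same `cache:` operator works manually in the Google search box, so you can spot-check a few URLs by hand before trusting any automated results.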