Okay guys, if you get a one-page (more like one-third-page) report on how to index links called "Supreme Site Indexing," it's bullsh*% that tries to get you to download at least a trial version of some crap program built with UBot. The thing is full of bugs and has been updated something like 20 times in the last month on their site (I don't even want to promote that garbage here). However, the idea behind it was good, so here is my twist and what the basics are.

Indexing links by pinging them is fine and all, but it takes a while. Getting them spidered helps a lot more. So the basic idea (which is not mine originally, and is probably old, but I figured it out by studying the crappy software and what it does) is to create a place to paste all of your links HYPERLINKED so that they can be spidered.

What I did was use the Yahoo account creator inside SEnuke (but the creator here would work just fine) and created a blog online. Then I took all of my links and pasted them here

Code: [URL]http://anonym.to/?http://thephantomwriters.com/link-builder.**[/URL]

to get them all hyperlinked at once. Then I created a post on the blog with the links (so when it's published, they are hyperlinked and then spidered). If there are too many links, you'll need more than one post, of course.

The second thing is you have more than one option here. Because stuff like this tends to get reported, you can simply delete the blog posts within 48 hours, once they've been spidered, or you can just create throwaway blogs for this. Another idea is to take this to a higher level and use SEnuke to create more than a few sites, then delete them within 48 hours, or just forget about them. The choice is yours; this is just an idea.

"Supreme Site Indexing" is garbage. You are better off building a UBot to do what I just described yourself.
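If you'd rather skip the online link-builder tool entirely, the "hyperlink everything at once" step is trivial to script yourself. Here's a rough sketch in Python: it takes a plain list of URLs and spits out HTML anchor tags ready to paste into a blog post, split into chunks so no single post gets too long. The chunk size of 50 and the function name are my own choices for illustration, not anything from the tools mentioned above.

```python
# Turn a plain list of backlink URLs into HTML anchors, chunked into
# multiple blog-post bodies (one snippet per post) so huge link dumps
# don't all land in a single post.
import html


def urls_to_posts(urls, links_per_post=50):
    """Return a list of HTML snippets, one per blog post."""
    posts = []
    for start in range(0, len(urls), links_per_post):
        chunk = urls[start:start + links_per_post]
        anchors = [
            # Escape the URL so it is safe inside an href attribute,
            # and use the URL itself as the visible anchor text.
            '<a href="{0}">{0}</a>'.format(html.escape(u, quote=True))
            for u in chunk
        ]
        posts.append("<br>\n".join(anchors))
    return posts


if __name__ == "__main__":
    # Demo with made-up placeholder links.
    links = ["http://example.com/page-%d" % i for i in range(1, 121)]
    for n, body in enumerate(urls_to_posts(links), 1):
        print("--- post %d: %d links ---" % (n, body.count("<a ")))
```

Paste each generated snippet into the HTML editor of your throwaway blog and publish; every link comes out hyperlinked and crawlable, which is all the paid tool is really doing for you.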