OK, so I had an idea and I'm starting a thread to document it. If it helps someone, great. If it has no value, I apologize, but it's better for me to put it in writing as I go. I've been playing with SB for a few weeks now and of course reading as much of BHW as I can.

One thread I read by Bobby Love was really interesting. To sum it up, he theorizes that it's not exactly how many links you have, but how many links you have from unique IP addresses, that really matters. I've seen this in my own niche: one competitor has more links and an older domain, but fewer unique IP links than the site in the top spot. I have to say I agree with his theory. If you want to see how many unique IP links your competitors have, the easiest and cheapest way is to look up BHW member LEWI and buy a Majestic SEO report from him. Getting one or two is far cheaper than buying a subscription to Majestic.

To date I've had better luck getting links approved, and having them stick, on BlogEngine blogs. It seems people use this software to throw up a blog that they quickly abandon and leave on auto-approve. I even often get emails while doing a blast that say something like "I appreciate the comment you left on my blog [insert name of blog here]." LOL, they don't even bother to set up their blogs all the way. Plus everyone else is firing away at WordPress, getting banned right and left by Akismet, and having to buy .info domains to use as HTTP redirects so they can keep firing away at WP blogs. It's like they feel warm water and don't realize it's warm because everyone pissed in the lake.

So the problem I was coming up against was: how do I find as many unique domains as possible powered by BlogEngine? If you use 1-100 keywords and then eliminate duplicate domains, you wind up with very few domains from a huge list of harvested blogs. Earlier today I harvested around 12 million URLs and wound up with only 17,000 unique domains. Not good, and a waste of harvesting bandwidth.
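For anyone who wants to run the dedupe step outside of SB, it's just a matter of normalizing every harvested URL down to its host and keeping one of each. A minimal Python sketch (the sample URLs are made up for illustration):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Reduce a harvested URL list to a sorted list of unique domains."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # treat www.example.com and example.com as one site
        if host:
            domains.add(host)
    return sorted(domains)

harvested = [
    "http://someblog.com/post.aspx?id=1",
    "http://www.someblog.com/post.aspx?id=2",
    "http://otherblog.net/2011/03/hello.aspx",
]
print(unique_domains(harvested))  # ['otherblog.net', 'someblog.com']
```

Three URLs, two domains — same idea as going from 12 million URLs to 17,000 domains, just at scale.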
Plus we're looking for bang for the buck, and typical BlogEngine approval on fast poster is what, 20% anyway? Again, for the purposes of this experiment I need as many unique domains running BlogEngine as possible. I felt I needed more keywords and fewer results per keyword.

The BHW member Maruk was giving away keyword lists to anyone who asked, and if you added together all the keywords he has given out, it came to something like 800,000 keywords. One problem there, though, is that too many people had asked him for similar types of niches. After all, this is an IM forum of sorts; it's only normal that we would have similar interests. Plus the keywords he offered came from the ScrapeBox wonder wheel scraper. When I did my own experiment with that tool, the keyword "Boston" gave me around 27,000 keywords that returned around 14,000 unique domains, if I recall correctly. Maruk is a great guy and his keyword lists are great, but not exactly right for our experiment.

So I was thinking: where in hell do I get a huge list of quality, non-repetitive keywords? I googled at length trying to find a text file that had every word in the dictionary, line after line, with no definitions. The closest I came was a text file that, after removing duplicates, possessives ('s) and other extraneous characters, netted me 528,030 words in the English language. Pretty much everything from A to zoo.

So currently I am harvesting using 95 connections, around 110 good public proxies, and the custom footprint "powered by blogengine.net". I am only looking to get 11 results per keyword. Usually you want as many results as possible, but I'm out for unique domains, and hopefully unique IP links, not 400 links to my site from one domain. Theoretically I should get 11 × 528,030 = 5,808,330 results, call it 5.8 million. I wish I had a larger list of proxies to start with, and a faster internet connection is always better.
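If anyone wants to clean a raw dictionary dump the same way, here's a rough Python sketch of the "remove duplicates, 's, and extraneous characters" step. The sample input is invented, and exactly which junk you strip is a judgment call:

```python
import re

def clean_wordlist(lines):
    """Dedupe a raw word dump and keep only plain alphabetic words."""
    words = set()
    for line in lines:
        w = line.strip().lower()
        if w.endswith("'s"):
            w = w[:-2]  # drop possessives like "boston's" -> "boston"
        # keep plain a-z words only; tosses numbers, hyphens, leftover apostrophes
        if re.fullmatch(r"[a-z]+", w):
            words.add(w)
    return sorted(words)

raw = ["Aardvark", "aardvark's", "zoo", "zoo", "don't", "3rd"]
print(clean_wordlist(raw))  # ['aardvark', 'zoo']
```

One keyword per line, no duplicates — ready to paste straight into the harvester.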
Ideally I should be on a Windows VPS for this experiment, but when I went to purchase one from xsserver tonight they were out of stock. Fuckers. I am harvesting 17 URLs a second, and my calculations show that at that rate I won't hit my 5.8 million results until Monday evening at the earliest. Hopefully at least half of the results are unique domains; let's make the math easy and hope 3 million are (hoping as many people as possible are running that bullshit ASP crap BlogEngine). If 3 million are unique domains and out of that 3 million I only get a 1% stick rate, I'll wind up with 30,000 backlinks, with anchor text. Now, praying that all 30,000 of those sites aren't hosted between FatCow, HostGator and Bluehost, I would have to wind up with 1,000 links on unique IPs minimum. That many unique IP links would put me about three times ahead of my number one competitor in unique IP links. You'll notice I am not setting my goals too high.

I am also thinking of incorporating another user's idea of putting my Google Maps citations into the comments. I forget this user's name, but he claims that when G finds the exact info about your local business anywhere on the net, it is likely to include that info as a review in your G Places listing. I am about to hit 500,000 results, so I am still a long way from starting my comment run. I am willing to hear out any input.
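The back-of-envelope math above can be sanity-checked in a few lines of Python. Note that the unique-domain fraction and stick rate are just the hopes stated above, not measured numbers, and the post's "3 million" and "30,000" are rounded up from what actually comes out:

```python
# Back-of-envelope projections using the numbers from the posts above.
KEYWORDS = 528_030
RESULTS_PER_KEYWORD = 11
HARVEST_RATE = 17          # URLs per second, as observed
UNIQUE_FRACTION = 0.5      # hoped-for share of unique domains (assumption)
STICK_RATE = 0.01          # assumed 1% of posted comments stick

total_results = KEYWORDS * RESULTS_PER_KEYWORD
eta_days = total_results / HARVEST_RATE / 86_400  # 86,400 seconds per day
unique_count = int(total_results * UNIQUE_FRACTION)
backlinks = int(unique_count * STICK_RATE)

print(f"total results:   {total_results:,}")    # 5,808,330
print(f"ETA at 17/s:     {eta_days:.1f} days")  # about 4 days
print(f"unique domains:  {unique_count:,}")     # 2,904,165 (~3 million)
print(f"stuck backlinks: {backlinks:,}")        # 29,041 (~30,000)
```

Roughly four days of harvesting lines up with the "Monday evening at the earliest" estimate.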