Hello! I've been testing out ScrapeBox for a week now and have read countless guides and watched videos on getting the most out of this wonderful software. I have a plan to use SB carefully and effectively and need your input.

I've seen a couple of threads where people claim to get sandboxed, or even hit with a heavy Google dance, after blasting thousands of backlinks at their money site. In my opinion that's very possible, because Google does "see" everything, especially when thousands of backlinks suddenly point at a site (even an old one). But that's not the point here, and I believe one can overcome it with two things:

1. Constantly build backlinks (1k a week) for a couple of months - not too many, makes sense, and looks natural.
2. Keep adding more content to the money site (2-3 articles/week).

BUT! I'm getting paranoid about exposing my money sites (I don't have many) by building backlinks directly to them. Therefore I'm planning to use buffer sites - web 2.0 properties like:

1. EzineArticles
2. HubPages
3. Squidoo
4. eHow
5. other big article directories

and link them to my money sites (with all the anchors and keyword density), then build thousands upon thousands of backlinks to each of those buffers. Such web 2.0 properties don't get sandboxed and offer two advantages:

1. They pass all the backlink juice on to the money sites.
2. They will rank high in the search engines themselves, because the engines love web 2.0 sites!

That's the plan, and I can blast as many backlinks as I want - constantly. BUT! There's a problem: in a few earlier tests I commented manually through proxies on a few blogs, and most of those comments were blocked by Akismet (I hate this fu**** plugin!) - maybe because it filters links to web 2.0 properties; I've read that somewhere as well. I'm not sure what's really going on here, but that's the BIG problem with this plan.

Hoping for your input and ideas on this. Thanks!!