Eli's posts are from around 2008, and what he explained worked at that time. It doesn't work the same today. Having a site with tens of thousands of pages give another site a few links no longer helps much, and having it give many links (thousands) is probably even worse.
While the approach with DB/madlib sites is a good long-term, large-scale strategy, it requires a massive number of sites and has huge costs. I don't think you can have each site cover its own costs by itself. It worked a while ago, but it's probably not a good idea now.
Using AdSense to monetize a network is an epic fail. Maybe not immediately, but when you get hit, all the sites get hit at once. Find other methods to monetize, something that doesn't leave a footprint. That means any affiliate code that is the same on two sites is a potential problem. Maybe not, but I don't like the risk. CB, Nextag, AdSense, YPN, etc., anything that gives you just one tracking code, is potentially dangerous. Amazon works great because you can have as many tracking codes as you wish, though maybe not for the types of sites you have in mind.
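To make the Amazon point concrete, here's a minimal sketch of per-site affiliate link building: one unique tracking ID per domain, so no two sites ever share a code that could be matched up. The domain names and Associates tags below are made up for illustration.

```python
# Toy per-site Amazon affiliate link builder (hypothetical domains/tags).
# The point: every site in the network gets its own tracking ID, so no
# two sites share an affiliate code that could be tied together as a footprint.
from urllib.parse import urlencode

SITE_TAGS = {
    "example-niche-one.com": "nicheone-20",      # made-up Associates tags
    "example-niche-two.com": "nichetwo-20",
    "example-niche-three.com": "nichethree-20",
}

def affiliate_link(site: str, asin: str) -> str:
    """Build an Amazon product link carrying this site's own tag."""
    tag = SITE_TAGS[site]  # unique per site -> no shared footprint
    return f"https://www.amazon.com/dp/{asin}?{urlencode({'tag': tag})}"

print(affiliate_link("example-niche-one.com", "B000EXAMPLE"))
```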
The footprint-avoidance strategy from my link networks thread still works (you asked me by PM), with the update that SEO hosting is problematic. SEO hosting is crap. Massive, dangerous, stinky crap. Google knows those IPs all belong to an SEO host, and even the best of those hosts are crap. The only way to be 100% safe is to buy regular shared hosting (even very cheap accounts at $1-2/mo with very little space and traffic are better). Domains must also have fake WHOIS info. Some can use WHOIS privacy, but most should have realistic-looking info (fake, and different from every other domain).

Obviously, don't use Google Analytics. You can use Piwik, but not in the default mode where you track all sites from one central server, because the tracking link would be the same on all sites and that's a footprint. You have to hack Piwik and build a modified tracking setup: when a user loads a page, instead of the page calling the central server directly, the local host performs a cURL request in the backend to the central server. Because that request happens behind the curtain, it's undetectable and not a footprint.
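Here's a minimal sketch of that backend relay, in Python for readability (on a real network each site's PHP backend would do this with cURL, as described above). It uses Piwik's real HTTP tracking endpoint (piwik.php with idsite/rec/url parameters); the server URL and site ID are assumptions you'd replace with your own install.

```python
# Server-side Piwik tracking relay: a minimal sketch.
# The visitor's browser never sees the central tracking server; the page's
# own host forwards the hit behind the scenes, so no shared footprint
# appears in any site's source code.
# PIWIK_URL and SITE_ID are assumptions -- replace with your own install.
import random
import urllib.parse
import urllib.request

PIWIK_URL = "http://tracking.example-central.com/piwik.php"  # hypothetical
SITE_ID = 7  # this site's ID in the central Piwik install

def track_pageview(page_url: str, user_agent: str) -> None:
    """Forward a pageview to the central Piwik server from the backend."""
    params = urllib.parse.urlencode({
        "idsite": SITE_ID,
        "rec": 1,                          # record this hit
        "url": page_url,
        "apiv": 1,
        "rand": random.randint(0, 10**9),  # cache buster
    })
    req = urllib.request.Request(
        f"{PIWIK_URL}?{params}",
        headers={"User-Agent": user_agent},  # pass the visitor's UA through
    )
    try:
        urllib.request.urlopen(req, timeout=2)  # fire-and-forget
    except OSError:
        pass  # never let tracking failures break the page

# Called from the page handler, e.g.:
# track_pageview("http://some-niche-site.com/article", request_user_agent)
```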
About the site types, I wouldn't take that approach. I would instead build niche-related sites with 20-100 pages each: some on WP, some on Joomla, some on Drupal, some on other CMSs. Vary everything. You need a lot of sites for medium to highly competitive niches; 50 is a good number to start with, but target 100-200 sites per niche (if it's big/competitive), all on different shared hosting accounts. Yes, it is very expensive, but it's the only truly safe way, from Google and from any competitor who might analyze the network.
Ideally, use expired/dropped domains (GoDaddy auctions and the like) that have good backlinks. It is not enough for the domain to have PR; it must have actual links, and those links must be strong enough to make up that PR, otherwise you can expect the PR to drop on the next update. I could write an entire book on choosing expired domains with PR... I've seen courses explaining how to do it and they're kinda crap, because the authors don't understand the fundamentals and don't tell you the important things (they don't know them). Basically, you can find a PR5 on GoDaddy for $200-$300, while a real PR5, with actual links to back up that PR5, would probably be won at auction at over $1,000. Frankly, $1,000 for a real PR5 is not much. You do the math: you can pay $200 for a PR5 that gives you no real juice, just an illusion, or you can pay $1,000 for a real PR5 that has loads of juice. Or you could develop fresh domains from scratch. PR doesn't really matter by itself; it's just an indicator, not the actual value. Depending on a lot of other things, a PR3 site could actually be more beneficial than a PR5, because PR is not the only thing that matters.
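If you're screening auction lists in bulk, here's a toy filter illustrating the "links must back up the PR" rule. It assumes you've already exported each candidate's backlink data from whatever tool you use; the CSV columns and thresholds are invented for illustration, the principle is what matters.

```python
# Toy expired-domain screener (illustrative only). Assumes a CSV export
# like: domain,toolbar_pr,num_backlinks,num_linking_domains,strongest_link_pr
# Column names and thresholds are made up; the idea is just that the
# backlink profile has to plausibly justify the displayed PR, or the PR
# is an illusion that will drop on the next update.
import csv

# Rough minimum profile a given toolbar PR should be backed by (assumed).
MIN_PROFILE = {
    3: {"links": 30,  "domains": 10, "strongest": 3},
    4: {"links": 100, "domains": 30, "strongest": 4},
    5: {"links": 300, "domains": 80, "strongest": 5},
}

def looks_real(row: dict) -> bool:
    """True if the backlink profile plausibly supports the displayed PR."""
    need = MIN_PROFILE.get(int(row["toolbar_pr"]))
    if need is None:
        return False  # only screening PR3-PR5 here
    return (int(row["num_backlinks"]) >= need["links"]
            and int(row["num_linking_domains"]) >= need["domains"]
            and int(row["strongest_link_pr"]) >= need["strongest"])

with open("auction_candidates.csv", newline="") as f:
    for row in csv.DictReader(f):
        if looks_real(row):
            print(row["domain"], "- PR looks backed by real links")
```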
I hope that helps. I would have to write literally 50 pages to explain everything, but I guess this will have to do.

Good luck!