If we use fewer sites per tier, Google has less data to decide whether to slap the site. So I'm researching a slightly different kind of method, see below. In the image, each site gets 2 backlinks, except the leaves. 2 might be very low, yes, but what would happen if we increase the number of tiers a lot? Has anyone heard of similar experiments? With 2 links per site over 10 tiers, the bottom tier alone is 2^10 = 1024 sites (around 2046 across all tiers combined); with 3 links per site it's 3^10 = 59049 in the bottom tier. That many tiers would mean a huge link juice loss, but it should also be much harder to detect, because Google can't afford to trace back 10 tiers for every site. That would require an extremely huge amount of resources, which Google would prefer not to spend.
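A quick sketch of the arithmetic, assuming each site in a tier receives `links_per_site` backlinks from fresh sites in the tier below (money site = tier 0, so tier t has links_per_site^t sites):

```python
def tier_sizes(links_per_site: int, tiers: int) -> list[int]:
    """Number of sites needed in each tier, from tier 1 down to the last tier."""
    return [links_per_site ** t for t in range(1, tiers + 1)]

for branching in (2, 3):
    sizes = tier_sizes(branching, 10)
    print(f"{branching} links/site, 10 tiers: "
          f"bottom tier = {sizes[-1]}, all tiers = {sum(sizes)}")
# 2 links/site, 10 tiers: bottom tier = 1024, all tiers = 2046
# 3 links/site, 10 tiers: bottom tier = 59049, all tiers = 88572
```

Note the total across all tiers roughly doubles the bottom tier for branching 2, so the build cost is dominated by the last couple of tiers either way.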