Case Study: Tier 2 Services from BHW

So in October of last year I came up with my Tier 1 plan for one of my money sites. There were a whole bunch of things I had planned... web 2.0's, microsites (PBN), video/document sharing, social, etc. - all part of my Tier 1 master plan. You can check that out here.

Part of this Tier 1 master plan called for approximately 4k web 2.0's, which had to have authority. I'll admit I scaled this right back, basically because of the cost of content. I don't have a lot of money to spend until I rank better, so 100 Pogiboys (highly recommended) had to do the job.

So having 100 web 2.0's is great - now I had to populate them with content and give them authority. I used Rankwyz to drip out content to them, getting them up to 8 posts (at the largest), with more scheduled to drip.

The next part was the authority and relevance I wanted them to pass on. Here (as in most areas) I had no experience. You can see my post with a shortlist of BST's here, asking for input from BHW on which was best. I got a lot of the usual drivel like "outsource to Fiverr", "read the reviews", etc. - as if either of those really helps me compare between my shortlist of BST's, which was what I was trying to do. Wish I had a button that could delete unhelpful idiots from my thread.

Anyway, after a while user IProvideSEO suggested that I make groups of my web 2.0's, test a different package on each group, and compare the results. Given the answers my thread had endured up to that point, even a simple suggestion like that was awesome. So I decided to take his advice. That's where I came up with this Tier 2 case study, which I'm now sharing with BHW.

That said, I want to point something out. I'm not interested in promoting or dissing a service - I'm just sharing what worked for me for Tier 2 links. If your service is mentioned here unfavorably, I'm sorry -
this is just my personal experience; I'm sure your service works great for others, or if used differently. With that out of the way, here are the juicy details and how I went about this.

Objective: Find out which of my shortlisted services offers the most BANG FOR YOUR BUCK. I emphasize this because I'm sure I could have seen better results by throwing more expensive PR links at them, but I'm looking for the best value links while trying to keep the price down.

Rating Method: Rankings for my money site keywords. I toyed with a few ideas for how to judge my results: Moz's PA, Ahrefs metrics, or maybe TF/CF. Ultimately I decided none of those were what I was after. I remember someone saying that if you want to rank #1 for a keyword, all you have to do is get the top 20 sites for that keyword to link to you. Not the easiest task when they're your competition, for sure, but I felt the point being made was this: a link from a site already ranking for a keyword is highly relevant, and therefore highly likely to move you up for that keyword. Keeping that in mind, I decided my rating method would be this - ranking my web 2.0's for the keywords I want my money site to rank for. If I can push the web 2.0's to pages 1 and 2 and then link them to my money site, it should make a huge difference.

Boring Details:
1. Divide my 100 web 2.0's randomly into 4 different test groups.
2. Make a list of 25 money site keywords I want them to rank for.
3. Find 4 different services to test out, trying to keep the dollar price roughly the same.
4. Buy the services, giving each one a test group of URL's and the same 25 keywords.
5. Sit back for a month and let them do their magic, then analyze and figure out what worked best.

One thing I tried to do throughout the entire process was keep everything the same. This is important to try and isolate the effect of the links purchased, which I have done to MY satisfaction. Which is all that counts.
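For anyone who wants to copy this setup, step 1 of the boring details can be sketched in a few lines. This is just an illustration - the URL pattern is made up, and you'd substitute your own list of 100 properties:

```python
import random

# Hypothetical property list standing in for the 100 web 2.0 URLs.
urls = ["https://web20-{}.example.com".format(i) for i in range(1, 101)]

random.seed(42)       # fixed seed so the split is reproducible
random.shuffle(urls)

# Slice the shuffled list into 4 test groups of 25.
groups = [urls[i * 25:(i + 1) * 25] for i in range(4)]
```

The shuffle-then-slice approach keeps the groups random while guaranteeing each service gets exactly 25 properties.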
So I kept the content the same (articles heavily spun) and didn't do any other linking to these properties. The only thing I think may have affected this is that one of the packages wasn't a 30 day drip - it was a fire-and-forget sort of package. I tried to compensate for this by calculating the serps for that group 30 days from purchase. For the rest of the links I did the calculations 60 days from purchase (30 day drip, then 30 days of time to index and affect the serps). So all in all this little study took me 2 months and a few hundred dollars (Pogiboys, Rankwyz, packages, Ahrefs reports), so if you appreciate the sharing of this, click that little thanks button.

Results: To give a full picture and maybe explain why some of this happened, I'm going to give some background on the research done on each group. After the allocated time, I took 3 random web 2.0's from each group and looked up a few metrics: PA, # of links, # of referring domains, # of anchors, highest % for any anchor text, and country of links. I also looked for anything "bad". This was done with Ahrefs and Moz. I think at this point I should reveal the test groups and services used.

Group 1: Cost $18. 300 high PR blog posts, dripped over 30 days. 300 links / 25 properties should be 12 per property. Not much, but let's see if it did anything.

Group 2: Cost $25. 400 blog posts, PR 1-6, dripped over 30 days. They say 750 links / 25 properties, which should be 30 links per property. A little more than Group 1.

Group 3: Cost $20 + $10 for dripping. 1000 web 2.0's from unique domains, with Tier 2 & 3 links built to those web 2.0's. They say 1k links / 25 properties, which is 40 links per property. Not bad, plus it already has tiers built into it.

Group 4: Cost $20 - no drip available. 1000 web 2.0's plus 500 social bookmarks and 100 AA article directory posts, allegedly. Total of 1600 links / 25 properties = 64 links per property. Nuff said.
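The links-per-property arithmetic above can be sketched in a couple of lines, using the advertised link counts from the sales threads:

```python
# Advertised total links per package, divided across 25 properties per group.
packages = {
    "Group 1": 300,
    "Group 2": 750,
    "Group 3": 1000,
    "Group 4": 1600,
}
properties = 25

per_property = {name: links // properties for name, links in packages.items()}
for name, n in per_property.items():
    print(name, "->", n, "links per property")
```

Worth noting these are the advertised numbers; as the group results show, what Ahrefs actually finds can be a different story.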
Without further ado, let's dive in. Charts being the most efficient way to convey this kind of data, here they are. Courtesy of LibreOffice (way faster than MS Office or OO, by the way - I tried it on calculations with a list of 40k keywords).

Group 1 Results: As you can see from the chart, I didn't find a ton of referring domains, which I would have preferred. The 3 referring domains that I did check out had a maximum of PR 0 - not N/A. Also slightly disappointing was one of those properties (#1) having 28 links, but all from 1 domain. It doesn't explicitly say it on the sales thread, but I was hoping the links per web 2.0 would come from different domains, not all the same as seemed to be the case here.

Group 2 Results: Obviously #1 on this chart hadn't shown up in Ahrefs or anywhere else yet. I'm hoping they built links to it! Assuming they did, it's possible it's just taking some time to show up - Ahrefs doesn't know everything. What I loved about Group 2: I checked some of the links, and there were homepage PR's of up to 5! Obviously not page PR - homepage PR - but still, that's pretty cool. The downside is clearly visible in the low number of anchors, resulting in the really HIGH anchor concentration, especially on #3. I will be diluting those anchors for sure if they haven't changed after a month. The number of links was small. I'm supposed to get 3 links per blog from this service, and that part seems all right. However, the expected 30 links per property haven't shown up in Ahrefs yet. That could be a time thing - not sure.

Group 3 Results: Well, now we're doing better - with the exception of PA, which I hardly care about. Anchor text concentration isn't too high. The number of anchors is nice on #1 - 10 anchors is OK, though I did give a 25 keyword list. The only other note worth mentioning is that the network links came heavily from country specific regions, UNLIKE the last 2 services. UK, Iran, Taiwan, Russia - those are where the links were coming from on these samples.
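The "highest % for any anchor text" metric I keep referring to is just the most common anchor's share of all anchors found for a property. A minimal sketch, using a made-up anchor list in place of a real Ahrefs anchor export:

```python
from collections import Counter

# Made-up anchors standing in for one property's Ahrefs anchor report.
anchors = ["blue widgets", "blue widgets", "blue widgets",
           "cheap widgets", "widget store"]

counts = Counter(anchors)
top_anchor, top_count = counts.most_common(1)[0]
concentration = 100.0 * top_count / len(anchors)
print("{}: {:.0f}% of all anchors".format(top_anchor, concentration))
```

With only 5 anchors total, a single repeated phrase already pushes concentration to 60% - which is exactly why the low anchor counts in Group 2 produced such high percentages.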
Group 4 Results: I REALLY LIKE THIS. PA is up there (though it is useless), # of links is where it should be - 64 each - and more. Anchors are OK - the highest is 20%, with over 12 anchors. The interesting metrics are the country ones... Russia, Iran, Thailand, Venezuela and Slovenia, for whatever that is worth. One thing I didn't like was that somehow I got what look like Arabic anchors on one of these test properties (#3, I believe) - I'm not sure if that is supposed to be LSI or what.

Those are the packages and the metrics from them. That should help everyone make a semi-informed decision. Do with it what you like.

Saving the Best for Last: I mentioned above that I would not be using some set of metrics as the final determination of success. I would determine the success of a package by it ranking for the 25 keywords I gave them. Here's the moment you've all been waiting for...

A couple of comments here. Top to bottom is serp ranking - lower on the chart is better. Left to right is my keywords. Group 4 dominated the serps. I'm not sure if that is due to the volume of links or what; I think it has to do with the fact that Group 4 got the most anchors out there for its links (expected, given the higher count). Group 2 hardly showed up to the party, which is disappointing considering I did find a PR 5 homepage on one of the blogs it was posted on. A homepage isn't a blog post page, I understand, but I thought it would be worth something. As far as the web 2.0 packages went, Group 4 was clearly better, though Group 3 was not totally wasted - it showed up in a few places. As far as the blog post packages go, IF I were to go with one, which I'm not certain I would, then Group 1 would be the better choice - at least it worked better for me.

A few notes might help to give context here. I searched all my keywords on Google.com with exact (in quotations) match. Competition for these exact matches went up to 700k+ results where my web 2.0's were ranking. On some pages I was up against Ask.com, About.com, etc.
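If you want to reproduce the rating method, the per-group scoring boils down to counting how many of the 25 keywords each group's web 2.0's pushed onto pages 1 and 2. A minimal sketch with made-up positions (None meaning the property never appeared in the tracked results):

```python
# Made-up serp positions for one group across 5 sample keywords.
positions = {"keyword 1": 3, "keyword 2": 14, "keyword 3": None,
             "keyword 4": 27, "keyword 5": 8}

def top20_count(ranks):
    """Count keywords ranking on page 1 or 2 (i.e. positions 1-20)."""
    return sum(1 for p in ranks.values() if p is not None and p <= 20)

print(top20_count(positions))
```

Run this once per group with real rank-tracker data and the group with the highest count wins - which in my case was Group 4.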
There is one keyword term where I have 6/10 results on the first page (including positions 1, 2 and 3), for 144k exact match competition. Long story short - it's a good start, I think. Anyway guys, that's all I've got to share. Opening the floor up now for discussion. If you liked this case study, and if I saved you some time and money doing this on your own, feel free to click the thanks button, buy me a beer, or donate millions to my fund.