How good is this linking strategy for a blog farm?

WOW is all I gotta say. I feel like I'm getting a PhD in off-page SEO marketing... I just want to make some niche sites, make some money, and have fun doing it... But seriously, I do understand, but how can anyone simplify all of this...
 
I think all you have to do is this:

1. Create a bunch of websites and link them one after the other, with the last one being your money site (#1 in this case). Something like this:
bubbl.us/view.php?sid=300255&pw=yakFceWqarfPQMTZGMEc3SS5DQ3pyRQ

2. Create a bunch more links among the sites to make it harder to tell that the money site is being favored. Like this:
bubbl.us/view.php?sid=300256&pw=yakFceWqarfPQMTZISDJqQ0FNcHM4TQ

I couldn't eliminate the spaces near the end of those links, but if you take them out the URLs will work
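Roughly, those two steps in code (a Python sketch; the site numbers and the count of extra links are hypothetical):

```python
import random

sites = list(range(1, 8))            # 7 sites; site 1 is the money site
chain = sites[1:] + [1]              # step 1: 2 -> 3 -> ... -> 7 -> 1
links = set(zip(chain, chain[1:]))   # one-way links down the chain

# step 2: extra links among the slave sites only, so it's hard to
# tell from the link pattern that site 1 is the one being favored
for _ in range(6):
    a, b = random.sample(sites[1:], 2)
    links.add((a, b))

print(sorted(links))
```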
 
In general, how much content should go on your Web 2.0 sites? Can I just leave them completely blank except for a link with the correct anchor text?
 
Thanks, everybody, for the useful info I found in here.
I'm finally getting an idea of blog farming.
 
I've read this entire thread, and I've found out some great things. But those things have raised some questions:

Is the main purpose of this technique to take 7 sites that are already established and ranking well, and link them together in a way that makes one of them rank higher?

If so, does that mean that if I wanted to start from scratch, I'd need to set up 7 blogs about "red widgets", SEO each one of them, get a lot of good links, etc., first, before making them into a farm?

What kind of anchor text is ideal here? Surely we couldn't put "red widgets" on every link, on every site... could we?


Is it possible to create these blog farms at the start of sites, and point the "greedy" node to a money site, in order to help it get a good start? Maybe create 7 Web 2.0 properties, create the path (sorry, forgot the word... Ham-something), then link the strongest one to the TLD of the money site?

If so, could we make 10 of these blog farms, and do the same thing?

Would it be good or bad to get links to all the nodes of the blog farm with something like Xrumer, or some auto social bookmarking software?

Please answer these questions, Olden, or anyone else who is knowledgeable about this. Thanks. :D
 
Thanks, oldenstylehats. You've given me the last piece of an enormous puzzle: the puzzle of an all-encompassing SEO strategy, which is really complex yet all of us SEO enthusiasts seem to take for granted.

And amid the complexity, we had forgotten the need for randomization of link paths. I am so much more confident than before because of knowing this.

It's worrying that so many people replying to this thread can't understand the basic concept he has given. Hell, I was getting confused at one point because people kept replying that they didn't understand it, which made me think it was more complex than it really is, but in fact it's quite simple.


This is so important. In terms of link strategy, it's probably second only to making a farm in the first place, and probably third overall in SEO importance, with keywords, I'd say, being second.
 
I apologize. I've written a lot of different responses to the posts in this thread over the last few days and decided not to actually post them until I could figure out a good way of explaining all of this without using misleading language. My response above was by far the most reactionary and mean-spirited. I shouldn't post when I'm grumpy. I also somehow missed your post with the multi-dimensional list and apologize for that.

That said, here is the problem with this method that you outline with this graph:
[attached image: 307303123935cb840771modrh1.jpg, the graph of the method in question]

If I'm reading it correctly (you should really fire up a visualization tool), while it is absolutely multi-dimensional, as you stated above, it does not exhibit anywhere near enough complexity to avoid even the simplest detection algorithms. It also assumes, potentially based on an over-simplification that is entirely my fault for introducing, that weight is evenly distributed at all nodes and that this distribution is temporally linear. It is not. In fact, the even distribution discussed in my initial post assumes "perfect conditions," which never occur in the wild; it was merely done for simple explication.

Also, weight is not distributed on these networks in such a way that a node that received a weight of 0.5 would then distribute a weight of 1.5 if it only had 2 links. This is another place where my initial explanation might have been misleading, but the fact of the matter is that across the majority of the large network, only rarely will we see node connections (edges) that distribute a weight greater than 1. Since PageRank relies on a similar algorithm, presumably this is why an overwhelming number of sites have a PageRank of only 1. This is also why, if we disregard quality, there is no direct correlation between the number of links and the ranking of a page (an old SEO myth). Even the original PageRank algorithm was far richer and more complex than that.
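For anyone who wants to see the division concretely, here is a minimal power-iteration sketch in the spirit of the original PageRank (Python; the damping factor of 0.85 and the tiny example graph are illustrative assumptions, and dangling-node handling is ignored):

```python
def pagerank(links, d=0.85, iters=50):
    """links: dict mapping each node to a list of nodes it links out to.
    Assumes every link target also appears as a key. Each pass, a node
    splits d * (its current weight) evenly across its out-links, so a
    single edge almost never carries a weight greater than 1."""
    n = len(links)
    rank = dict.fromkeys(links, 1.0 / n)
    for _ in range(iters):
        new = dict.fromkeys(links, (1.0 - d) / n)
        for v, outs in links.items():
            for u in outs:
                new[u] += d * rank[v] / len(outs)  # weight divides, never multiplies
        rank = new
    return rank

# Toy graph: three pages in a cycle plus one page feeding in.
print(pagerank({"a": ["b"], "b": ["c"], "c": ["a"], "d": ["a"]}))
```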

This is a separate visual perspective of the funnel system you created:
[attached image: funnel-net.png, the funnel system graphed]


It requires 3 linear passes: from 9 -> 1, 6 -> 1, and 3 -> 1. Due to the linearity of the system, a good greedy network algorithm could identify this within 10 guesses (at most). Increasing the complexity by making each node on this graph representative of a large network, potentially one of the many described throughout this thread, will decrease the potential for discovery considerably, especially if you use one like the one I outlined to obfuscate network bias.

Remember that even the discovery of a small greedy sub-network within the larger network would cause a massive, negative weight redistribution. This is why it is essential for each outward node sub-network to be reasonably complex. Though one can't intuit too much about how Google deals with greedy networks, if they deal with them like anyone else in academia, they only automatically search for a set number of "best guess" passes and shove "more probable" networks off into a separate system for further evaluation.

Path discovery of bias in intentionally complex node networks based on Hamiltonian cycles can, even on the best equipment, take many, many days. It is much easier to develop them than it is to pick them out. Because these directed paths aren't the only paths in the networks, each pass requires the algorithm to do a considerably larger analysis of all other outward edges from a single node. It's easy to see when it's graphed, but much, much harder to posit from a data set.
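To make that concrete, here is a toy backtracking search of the kind a detection pass would have to run (Python; the graph representation and example are hypothetical). Every step has to branch on all remaining outward edges of the current node, and the start node isn't known either, which is what makes discovery so expensive on dense networks:

```python
def find_directed_h_path(adj):
    """Brute-force search for a directed Hamiltonian path.
    adj: dict mapping each node to the set of nodes it links to.
    Worst-case cost grows factorially with the number of nodes."""
    n = len(adj)

    def extend(path, seen):
        if len(path) == n:
            return path
        for nxt in adj[path[-1]] - seen:  # must consider every outward edge
            found = extend(path + [nxt], seen | {nxt})
            if found:
                return found
        return None

    for start in adj:  # the detector doesn't know where the path begins
        found = extend([start], {start})
        if found:
            return found
    return None

print(find_directed_h_path({1: {2, 3}, 2: {3}, 3: {4}, 4: {1}}))
```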

Furthermore, I did outline a multi-dimensional graph in the first post; I just didn't create a graph for it because I figured it was pretty self-explanatory. An example, using 2 separate sub-networks similar to the first one outlined:
[attached image: linknet.png, two sub-networks feeding one target]


This is not an ideal network, as you'd want each sub-network to vary in the number of nodes or (ideally, and) in the nature of the path. Even if no specific bias could be culled out of the data set, a good algorithm can identify sub-system similarity in networks very easily, and that is extremely unnatural for an "organic" network, especially repetitive similarity (site navigation within single sites aside). Please keep that in mind. I'm not going to outline a hundred different scenarios; it isn't in my best interest on so many levels, time being one.
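One way to get that variance is to generate each sub-network with a different size, a freshly shuffled path, and its own amount of noise. A rough sketch (Python; the labels, sizes, and edge counts are all made-up parameters):

```python
import random

def make_subnetwork(label, n_nodes):
    """One sub-network: a shuffled directed path over n_nodes sites,
    plus a random number of cross-links so no two sub-networks match."""
    nodes = [f"{label}-{i}" for i in range(n_nodes)]
    random.shuffle(nodes)                       # vary the nature of the path
    edges = set(zip(nodes, nodes[1:]))          # the directed path itself
    for _ in range(random.randint(n_nodes // 2, n_nodes)):
        a, b = random.sample(nodes, 2)
        edges.add((a, b))                       # obfuscating cross-link
    return nodes[-1], edges                     # the path's end is the greedy node

# Two deliberately different sub-networks feeding one target.
end_a, net_a = make_subnetwork("A", 7)
end_b, net_b = make_subnetwork("B", 11)
all_edges = net_a | net_b | {(end_a, "target"), (end_b, "target")}
```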

Remember that the name of the game is not solely maximizing the amount of juice, but also maximizing the longevity of ANY amount of juice. We'd be dumb to assume that path analysis isn't occurring. Even if they slow their devaluing and de-indexing process to make it harder to get an idea of how they're doing the analysis, the process is still happening over the long term.

I remember reading this thread a long time ago, but not really getting it. I've recently been making blog farms, and I remembered that I had this bookmarked, and it makes so much sense to me now.

With that being said, I do have one suggestion for this method. It's one that I think is being overlooked, and would make the networks look more natural.

In the above example, let's pretend you have 5 networks all pointing to the target. All of these networks are of different sizes and use different linking strategies, but they all use the Hamiltonian path to direct link juice.

Now, you want the "target" to appear relevant to a certain keyword, and you do this by taking the greediest site on the network and directing a link from it to the target site. But if the target site were genuinely relevant, and genuinely an authority site, this is not what its real links would look like.

My proposal: create networks like this, and take the last node of the Hamiltonian path and direct it to your target site. Once that is done, also take a few random nodes from a few random networks and link these to the target. This will make your site appear to be an authority site on the subject. Even if this is not the ideal way to get the most link juice, it abides by a simple principle: if a website is important and relevant, a lot of people will think so, and they will in turn link to it.

Actually, I think that this would make the farms even harder to notice, because the links to the target will throw off algorithms.

Another thing that could help is linking some of the nodes to already established authority sites, but not direct competitors. For example, if your site is about fishing bait, you could link to a site that makes fishing rods. It's relevant, and an authority site, but it won't harm your main site.
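In rough code, the shape of the whole proposal might look like this (Python; the site names, the counts of extra links, and the helper itself are all hypothetical, just to show the idea):

```python
import random

def link_network_to_target(path_nodes, target, authority_sites):
    """path_nodes: one network's Hamiltonian path, in order.
    Returns the extra edges: the path's end node plus a couple of
    random nodes link to the target, and a couple of nodes link out
    to established, non-competing authority sites to look natural."""
    edges = {(path_nodes[-1], target)}              # the greedy end node
    for node in random.sample(path_nodes[:-1], 2):  # a few random extras
        edges.add((node, target))
    for node in random.sample(path_nodes, 2):
        edges.add((node, random.choice(authority_sites)))
    return edges

print(link_network_to_target(
    [f"site{i}" for i in range(1, 8)], "money_site",
    ["rod-maker.example", "fishing-news.example"],
))
```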

So, Olden, what do you think?
 
There is definitely a ton of room for modifications. This is more an abstraction of a larger process than a method in and of itself, if that makes any sense. Distributing links to your target sites and to sites outside of your purview is probably wise, as long as it isn't overdone. Wikipedia, popular news sites, etc. are really good examples of targets for outbound links that would be expected, given the broad range of topics they cover.

Experimenting is the key to success with this sort of stuff. Try a few different types of link networks and see what works best. Evolve your process by implementing the best parts of each network type. It is a constant process, just as it should be. As Google's ability to separate the wheat from the chaff improves (and it is improving every day) these sorts of operational methodologies will be what differentiate good blog networks from bad ones.

I hope that helps.
 
Wow. This is an amazing thread. I currently have about 11 sites up, 10 of which I want to point to one site. I was thinking about making a link wheel, but after reading this thread, it seems a link wheel might be too noticeable. Instead of arranging those 10 sites in a wheel, all pointing to one site, I think I should follow olden's method here. Thanks, olden, for the great posts.

One question, though: with people trying to make their link wheels more abstract, and besides the fact that they usually use Web 2.0 properties, what else is different between a link wheel and a blog farm?
 
I'd actually suggest NOT using olden's link diagram; it's merely an example. Instead, learn how to create the H-path, make your own diagrams, and keep them different, always. Leave no footprint.

Thanks for the response, olden, that's pretty much what I expected to hear.
 
Since I'm no math pro, would this be a good strategy to come up with a good H-path without using olden's exact diagram? Take my sites, let's say 8:
1) Connect the slave sites (one-way links) randomly, starting from site 1, through all the sites, ending up at the money site.
2) Put 1-2 links on all slave sites going to other slave sites (randomly?)

Is that about right?
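In code terms, a quick sketch of what I'm picturing (Python; the site names are made up):

```python
import random

slaves = [f"slave{i}" for i in range(1, 8)]   # 7 slaves + 1 money site = 8 sites
random.shuffle(slaves)                        # 1) a random one-way order each time
links = list(zip(slaves, slaves[1:])) + [(slaves[-1], "money_site")]

for s in slaves:                              # 2) 1-2 extra slave-to-slave links each
    others = [t for t in slaves if t != s]
    for t in random.sample(others, random.randint(1, 2)):
        links.append((s, t))
```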

Also, a related question, if someone could help me out: where can I get good free web hosting? I know of 000webhost; are they any good? Any other recommendations? Thank you.
 
I read this thread a long while back and I remember it was the thread which affirmed to me that BHW was more than just another SEO forum. Thank you oldenstylehats.

I have one question that needs clarification regarding the effectiveness of the Hamiltonian network. To discuss it, one thing needs to be clarified first: the purpose of creating one or more of these networks.

Either:

a) To get the target to rank only
b) To dominate one or more keywords with the target site and other nodes in the networks.

From reading the thread, it looks like (a) is the intention.

Now, in most competitive niches, the top players tend to have a great many links. Sure, a quality link is worth 10 low-quality links (or more), but I really feel that if I built 10 of these networks and pointed them at the target, I would have to build between 70 and 100 slave sites just for 10 links. Good links, sure, but with only 10, I can't see it being enough of a return for the work done.

Clearly, other link sources will be required. However, the very essence of these networks is subtlety and evading detection. By supplementing these networks with other, more obvious link sources and networks, would the concealing effect of the Hamiltonian network be weakened?

I'm dabbling in all sorts of networks at the moment, some subtle and complex, others completely blatant. To be honest, at the moment it seems to me that hammering a new site with comment spam and waiting 3 months for it to blossom out of the sandbox is easier than trying to build like this and creating all the content. I've had several "blatant" link network target sites emerge from the sandbox and keep great SERPs for many months, more than long enough to pay for the effort put in.

I'd really like to know what you think about these issues.

Micallef
 
To discover your final-destination juice point, all Google has to do is follow the linked list to the end.

In olden's example, he has an irregular graph containing several websites. The aim in creating your initial graph is to make it look as natural as possible. Once you've created a graph/blog farm whose links look as natural as possible...

...the next question is: "How do I direct the link juice somewhere (we'll call it site X) without Google knowing that I have this juice generator?"

NP-complete problems are easy to VERIFY but hard to DISCOVER. This means we can create them easily, but Google can't find them easily.
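A toy illustration of that asymmetry (Python; the graph format is assumed): checking a proposed path is one linear pass, while finding one from scratch can mean trying every ordering of the nodes:

```python
from itertools import permutations

def is_h_path(adj, path):
    """VERIFY: one linear pass. Every node exactly once, every hop an edge."""
    return (set(path) == set(adj) and len(path) == len(adj)
            and all(b in adj[a] for a, b in zip(path, path[1:])))

def discover_h_path(adj):
    """DISCOVER: worst case tries all n! orderings of the nodes."""
    for cand in permutations(adj):
        if all(b in adj[a] for a, b in zip(cand, cand[1:])):
            return list(cand)
    return None
```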

This is why olden's method of creating a blog farm is the best option.


FINALLY -- The crux of this post -- ahhh... duh!!!

How to link and provide "link juice" to a site (usually a money site) without 'G' knowing about it...


WELL NOW!!! It's now quite obvious that some graphs/graphics that were provided (usually containing 9 squares ;) ) provide NO camouflage whatsoever.

"I see," said the blind man as he picked up his hammer and saw! :tee:
 



Wow, BHW really is a great forum... what an awesome thread. I really appreciate your input, OSH (and everyone else's).

I didn't realize we had so many REALLY REALLY smart people on here; most of the math stuff is way over my head. But I think I get the gist of the whole thing.

BUT (and correct me if I'm wrong here), looking at the diagram above, all we have really achieved is two good-quality backlinks to our main money site (properties 6 and 13)? And, to the best of our ability, we have prevented the big G from finding out we created these two links ourselves, and for ourselves.

Now I understand that we need a random "network" for each link, with random "interlinking" from random sites or Web 2.0 properties. But all of this basically just improves ONE of our properties, which we then use to link to our main money site, thus creating one good link. So the entire concept here is creating our own good-quality links??

I might be ignorant here, guys, please enlighten me. But this is a HELL OF A LOT OF WORK considering you physically have to go and build each link to your site. So if I wanted 100 links to my money site, I would have to build 100 of these networks, all randomly. That might end up being 600 or 700 properties I would have to build... and all for just 100 backlinks?? And not only that, but to really ensure the properties used in each of these networks provide the "quality effect" we're looking for, they would each need unique content as well??

Granted, these are quality, relevant backlinks... but wouldn't you get the same effect by just creating Paul-and-Angela-type links, writing relevant articles, leaving comments on relevant high-PR blogs, and social bookmarking? All these links would come from high-PR and very relevant pages. The only difference is that you don't have to personally ensure that the pages your links come from are good quality or have a high PR; that is already done for you. Each of these pages already has its own backlinks. And not only that, you would have a far more diversified source of links, and it would not take you half the time.

As I said, guys, I really don't mean to step on anyone's toes or offend anyone, and I really appreciate this kind of thread. I might just be totally ignorant here... but is all this work really necessary to build quality backlinks?
 
Excellent thread.

But how useful could it possibly be to create only one, or only a couple, of these powerful backlinks for your site to make it look like an authority site in your niche? With countless factors involved, only tests can show how useful these efforts actually are. For small niche sites, creating a couple of these links might be useful, I guess. But otherwise, it could just end up being a pure pain in the ass, creating tons of properties even if you use spun/rewritten articles and only those Web 2.0 sites with the quickest registration processes. At least the network might increase traffic too, and not only SERP rankings.

If you're trying to rank for highly competitive keywords and have enough time and money to actually go for outranking extremely old, well-optimized, huge authority sites... then it could possibly be useful to create tons of these networks.

The 'easiest' alternative to the whole process of building your own network of thematically related Web 2.0 properties would simply be creating backlinks on already existing, relevant authority sites. The whitehat route might actually take less effort in this case.

Or 'standard' link building with a mix of blog comments, article backlinks, high-PR backlinks, .edu links, .gov links, web directories, social bookmarking, videos, RSS submission, etc.

But I might test a single network using a Hamiltonian path and a bunch of properties in order to see how effective this could be for an AdSense/affiliate niche site with easy to moderate competition.

This is one of the best threads on BHW I've read so far, and I spend countless hours around here. Very, very nice. I'll read more about Hamiltonian paths now, and maybe I'll build a network using one of the sample graphs I find.

If I test it and get any obvious results, I'll let you guys know.
 
So, how effective is this? What is more effective?

A lot of questions come to mind; the only thing to do is try these things out and see the results over the long term.

I think I'm going to work with oldenstylehats' linking method; it looks more natural than the first one.

I will also get some backlinks for each site from forums, blogs, partner sites, etc., so there aren't just the blog-farm backlinks.

Everything on different free hosts for different IPs, and CaffeteinContent for some of the articles as well.
 
I love this thread so much. It was one of the very first threads I read when I first visited this site, and it has shaped so much of my work. I honestly must tip my hat to oldenstylehats for this amazing insight. I only hope that by posting something new to this wonderful thread, newer members can benefit from this wisdom, and maybe, just maybe, a mod sees this thread and makes it a sticky.

BTW, just thought I'd add:
the info in this thread is why the personal blog network I build for my clients has not suffered from the various Panda updates,
and disregarding this info, being unaware of it, or implementing it poorly is why blog networks are getting de-indexed.
 