# how good is this linking strategy for blog farm?

#### red1eaf

##### Registered Member
Blog-farm-experienced guys,

Below is a linking strategy for a blog farm. The current diagram is for a 3x3 matrix; the same structure can be scaled to any size.

At present I am planning for a 5x5 matrix (meaning the blog farm will have 25 Blogger sites).

I am testing this blog farm on free hosting using Blogger instead of my own domains. The reason being: it's free, and Google loves Blogger.

Pros:

1. Simple but beautiful; not sure if it is effective
2. Can be scaled to a blog farm of any size

Cons:

1. Each node has the same number of incoming and outgoing links. The big G might detect the footprint.

2. The link structure completes loops. Each link is one-way, but the loop closes at the 4th/5th level. Big G might very well detect it; he is not all that stupid.

You're looking at this situation from the perspective of graph theory. That's a very, very good first step. As you've already mentioned, in your example the linking pattern is highly regular and the number of node-to-node connections is homogeneous. Regardless of what specific technology Google uses to identify greedy communities, one can approach this issue with detection avoidance in mind. Specific methodology is hard to discuss in real terms without getting technical and whipping out Mathematica, but, as I think you've discovered for yourself, graphs can be illuminating.

For example:

Imagine that each arrow on this 7-node network represents a link between two different sites. As you can plainly see, there are a variety of different linking relationships between the seven nodes, but none that appear regularly. If we imagine that this network represents a network of "fresh blogs," the total possible linking weight of each blog is equal to 1. We assume that the weight of a single node (1.0) must be evenly distributed across all of its outbound links. For example, Node 1 has three outbound links, so each link carries roughly 0.333 of Node 1's total weight. Node 2's links each carry 0.5, Node 3's 0.5, and so on.
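The even-split rule above is easy to sketch in code. The edge targets below are hypothetical (the post only specifies the outdegrees: Node 1 has three outbound links, Nodes 2 and 3 have two each); the weight calculation is the part that matters.

```python
# Sketch of the even weight-distribution rule described above.
# Edge targets are made up; only the outdegrees match the example.
links = {
    1: [2, 4, 6],   # assumed targets
    2: [3, 5],
    3: [6, 7],
}

def link_weights(outbound):
    """Each node's total weight of 1.0 is split evenly across its outbound links."""
    return {node: round(1.0 / len(targets), 3) for node, targets in outbound.items()}

print(link_weights(links))  # {1: 0.333, 2: 0.5, 3: 0.5}
```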

We must ask ourselves a single question: does this network exhibit bias? Or, in linking terms, does one site get more juice than the others?

Network bias is simply weighted preference distributed across the network. In this scenario, we want the balance to be "tipped" (so to speak) in a single site's direction. To do this, we must find whether our schema has a deliberate path wherein all nodes are touched once and only once. This path is called a Hamiltonian path.

Luckily for us, in this example it does:

Follow the red arrows from Node 1 to Node 4 to Node 7 to Node 2 to Node 3 to Node 6 and finally to Node 5. Specific preference has been established in a relatively non-regular network. If all sites are weighted evenly and their juice is distributed evenly, Node 5 would have more juice to give than any other node on the network.
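For anyone who would rather see the path search than read the theory, here is a minimal backtracking sketch. The adjacency list is an assumption, since the actual diagram isn't reproduced here; it simply contains the red-arrow path 1 → 4 → 7 → 2 → 3 → 6 → 5 plus a few extra edges.

```python
# Backtracking search for a Hamiltonian path: a path that visits
# every node exactly once. Graph edges are hypothetical.
graph = {
    1: [2, 4, 6],
    2: [3, 5],
    3: [6, 7],
    4: [7],
    5: [],
    6: [5],
    7: [2],
}

def hamiltonian_path(graph):
    nodes = list(graph)

    def extend(path):
        if len(path) == len(nodes):
            return path
        for nxt in graph[path[-1]]:
            if nxt not in path:        # each node may be visited only once
                result = extend(path + [nxt])
                if result:
                    return result
        return None                    # dead end; backtrack

    for start in nodes:                # try every possible starting node
        path = extend([start])
        if path:
            return path
    return None

print(hamiltonian_path(graph))  # -> [1, 4, 7, 2, 3, 6, 5]
```

Backtracking is exponential in the worst case, which is fine for a 7-node (or 25-node) farm but not for large graphs.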

In terms of detection, as the number of nodes increases, so does our ability to obscure this directionality. One method of obscuring unnatural bias is to mimic actual networks. Not too surprisingly, bias shows up in real networks as well, but generally toward single nodes linked from many different networks. Imagine an even larger graph than the example: a graph where Node 5 is linked to by 3 or 4 other networks similar to the one in our example. If it were linked to by 3 other networks with two links each (as is the case in our example), a site with only 6 inbound links would carry considerably more weight as a node than any other node on its whole network.
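A rough way to see that cross-network bias numerically: if three made-up networks each aim two of their evenly-split outbound links at one shared target, the target's accumulated inbound weight dwarfs every other node's. All node names and edges here are invented for illustration.

```python
# Three separate hypothetical networks, each pointing two links
# at one shared "target" site.
networks = [
    {"a1": ["a2", "target"], "a2": ["a3", "target"], "a3": ["a1"]},
    {"b1": ["b2", "target"], "b2": ["target"]},
    {"c1": ["c2", "target"], "c2": ["c1", "target"]},
]

def inbound_weight(networks):
    """Sum the evenly-split outbound weight arriving at each node."""
    weight = {}
    for net in networks:
        for node, targets in net.items():
            for t in targets:
                weight[t] = weight.get(t, 0.0) + 1.0 / len(targets)
    return weight

weights = inbound_weight(networks)
print(weights["target"], max(weights.values()))  # target collects the most juice
```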

I'm not going to lay it all out here, because this is not a simple concept and I don't have the time nor the presence of mind to break it down completely. However, it is absolutely possible to develop graphs of networks like these without knowing a lick of graph theory. If you're interested in exploring these ideas further, look into Hamiltonian paths, adjacency matrices, and the work of Dr. Robert Tarjan.

Good luck!

Note: I'll try and put something a little more comprehensive and clear on our blog in the next few weeks.

So, how effective is this? What is more effective?

A lot of questions come to mind; the only thing to do is try these things out and see the results in the long term.

I think I'm going to work with oldenstylehats' linking method; it looks more natural than the first one.

Also, I will get some backlinks for each site from forums, blogs, partner sites, etc., so there aren't just the blog-farm backlinks.

Everything on different free hosts for different IPs, with CaffeteinContent for some articles as well.

How should I link from these sites? As a link in the sidebar?

The linking schema I've described absolutely is effective, though many people believe that applying that much theory is overkill. While I don't completely disagree with them, I do personally find empiricism comforting. The effectiveness of using separate IPs is hotly debated, but I would err on the side of caution and try to keep each site as discernibly separate as possible. I would also personally either write the content of the blogs myself or pay for it to be written, rather than using a Markov-process content "spinner."

Since someone using a similar method might have only a few outbound links on each site, I would put the links into the content of the blog itself, surrounded by contextually relevant content to increase the strength of the links. Sidebar links are very easy to identify. They're good for getting content indexed because of their prominence in many blog themes, but not necessarily as well received as links in actual content.

Oldenstylehats,

Believe me, it took some time to go back to my fundamentals and understand all of your post. That was great. I appreciate your time.

Now, looking at the graphs, and assuming the entire strategy holds good:

it is best to create 3-4 independent blog farms: for example, one network on Blogger, a 2nd network on WordPress, and a 3rd network on HubPages, without linking them at first.

As a last step, link the blog networks in such a way that the linking strategy gives biased link juice to our target website.

Does this work??

Instead of making each "farm" a collection of nodes hosted on a single provider, you'd be better off spreading the links out across multiple domains/hosts in a non-regular manner. In other words, you ideally want each of your 4 networks to consist of something like:
• One WordPress.com blog
• One HubPages.com blog
• One Blogger.com blog
• One Blogetery.com blog
• One 100WebSpace.com blog
• One Blogr.com blog

It also might be helpful to have the end node ("juiced node") be a privately hosted blog, though that is, once again, personal preference. You should have the most control over the central node as it is essential to your larger campaign's success.

Wow OSH. Great post as usual. I'll be anxiously awaiting the full post.

Quite a bit of graphics and breakdown, oldenstylehats... Keep up the good work of spending a lot of time educating us with a lot of eye candy.

oldenstylehats -

Interesting post. I feel like I just stepped into an episode of "The Big Bang Theory"! For us mathematically challenged people, is there a website that will graph a "Hamiltonian path" when you enter different numbers of points? Or maybe a primer that explains this a little bit?? I read the wikipedia articles on the three subjects you mentioned, and I have to admit, I'm as confused as ever. ANY assistance would be gratefully accepted and appreciated!

Hmm, is this what they call a blog farm? This is one hell of a strategy. Can Google easily detect this?

> oldenstylehats -
>
> Interesting post. I feel like I just stepped into an episode of "The Big Bang Theory"! For us mathematically challenged people, is there a website that will graph a "Hamiltonian path" when you enter different numbers of points? Or maybe a primer that explains this a little bit?? I read the wikipedia articles on the three subjects you mentioned, and I have to admit, I'm as confused as ever. ANY assistance would be gratefully accepted and appreciated!

I wrote a reply earlier today, but either BHW or my browser borked and I lost it. In a nutshell, path generation is probably a little too processor-intensive for most web servers, and most predictive models lack the richness necessary to give any real insight. With regard to the math, your best bet is to take college math courses. In graph theory, even the primers need primers if you don't have a firm understanding of the foundational axioms.

I'll write more later.

EDIT:
> hmm, is this what they call blog farm? this is one hell of a strategy. can google easily detect this?
Lots of different things fall under the moniker of "blog farm." This is one potential strategy for one; there are many. Making it hard for Google to detect is the whole idea behind doing it this way to begin with.

When I'm link building I like to do a bit of chaining in several different ways:

- My network ==> On a given site I might surround the link with good, highly relevant, properly keyworded text. I believe this helps especially when your site is NOT originally relevant. By that I mean you can go out of your way to maintain your site as usual, but add a "niche" to it, so to speak.

EX) www.cars.com [original site]

Link at the bottom of the page: "I love fishing, fishing is the greatest sport in the world." The link goes to: www.cars.com/fishing/fishing.html
The body of that page has a minimum of 250 words, and somewhere in the body the word fishing is linked out to external site 1.

Other tactics are relevant blogs + a relevant comment with an outbound link to the destination site where possible. Some nofollow blogs are OK as well. You might not get ranking directly, but it will help you get indexed faster. More backlinks = more chances of getting crawled.

I also tightly control which links and how many go outbound on my site. Additionally, I use nofollow tags where appropriate to help direct ranking flow.

... Well, those are just a few techniques, but hopefully it helps you get down some basics.

Really good post, will have to bookmark this there is some good information here!

Even though I still don't understand this pattern of 1 and 0.33 and 0.5 in PR theory (eigenvectors are supposed to be used for stress analysis, not for Google =]), I like these link strategies.

I was tired of hearing that link bait was the best thing you could do to get better links. =]

Thanks, oldenstylehats. (I don't see you around here much.)

> When I'm link building I like to do a bit of chaining in several different ways:
>
> - my network ==> On a given site I might surround link with good super relevant and properly keyword filled text. I believe this helps especially when your site is NOT origonally relevant.... By that I mean you can go out of your way to maintain your site as usual but add a "niche" to it so to speak.
>
> EX) www.cars.com [original site]
>
> Link at bottom of page: "I love fishing, fishing is the greatest sport in the world." Where the link goes to: www.cars.com/fishing/fishing.html
> and the body of the page has minimum 250 words and somewhere in the body the word fishing is linked out to external site 1.
>
> Other tactics are relevant blogs + relevant comment and an outbound link to destination site where possible. Some nofollow blogs are ok as well. You might not get ranking directly but it will help you get indexed faster... More backlinks = more chances and possible occurrences of getting crawled.
>
> I also tightly control which links and how many go outbound on my site. Additionally, I use nofollow tags where appropriate to help direct ranking flow.
>
> ... Well those are just a few techniques but hopefully it helps you get down some basics.
I totally agree. Relevance is absolutely an important part of creating effective link networks. With respect to what I wrote in my first post, contextual semantic relevance is very commonly one of the major "mitigating factors" used when assigning weight (the value of a link) to an outbound link with respect to its recipient. I'll try to make this distinction a little clearer later, but it's important to remember that link weight is not a unitary constant; it shifts with other on- and off-site variables.

vinniffa said:
> Even though I still don't understand this pattern of 1 and 0.33 and 0.5 in PR Theory (eigenvectors are supposed to be used to stress analysis and not with Google =]) I like this link strategies.
If you're familiar with eigenvalues/vectors, you might find the subject of adjacency very interesting.
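To make that connection concrete: a PageRank-style score vector is the dominant eigenvector of the damped link matrix, and power iteration finds it without any linear-algebra library. The 7-node graph below is hypothetical (Node 5 is given an outbound link so that no node is a dead end).

```python
# Toy PageRank-style power iteration. The stationary score vector it
# converges to is the dominant eigenvector of the damped link matrix.
graph = {
    1: [2, 4, 6], 2: [3, 5], 3: [6, 7],
    4: [7], 5: [1], 6: [5], 7: [2],
}

def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    rank = {node: 1.0 / n for node in graph}      # start uniform
    for _ in range(iterations):
        new = {node: (1 - damping) / n for node in graph}
        for node, targets in graph.items():
            share = damping * rank[node] / len(targets)  # even split, damped
            for t in targets:
                new[t] += share
        rank = new
    return rank

scores = pagerank(graph)
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

With no dangling nodes the total score stays at 1.0 every iteration, which is why the per-link values (0.333, 0.5, ...) in the earlier post behave like entries of a probability distribution.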

olden, this is giving me all sorts of evil coding ideas

edit: we all know that on-topic content links are better, but would it matter in any major sense if it were only blogroll links?

From old Monty Python skit: Mr. Gumby "My brain hurts!!!"

> olden this is giving me all sorts of evil coding ideas
>
> edit: we all know that on topic content links are better but would it matter if it was only blogroll links in any major sense?
All the research we've done leads us to believe that the context of a link does matter, but only to a degree. To what degree, no one is exactly sure. We have a generalized idea, and I'll get into what that general idea is in a little detail at some later date. It's important to keep in mind that for a search engine to serve its purpose effectively, almost nothing within its retrieval algorithms can be static. The reason for this has little to do with concerns about "gaming" and a lot to do with how information is stored and retrieved.

> It's important to keep in mind that for a search engine to serve its purpose effectively, almost nothing within its retrieval algorithms can be static.

Couldn't agree any less. :yield:

> Couldn't agree any less. :yield:
What do you mean? You don't agree because it makes it harder for us or you don't agree because you don't think that's true?