
distributed link network

Discussion in 'Black Hat SEO' started by tommy777, May 10, 2010.

    Is this blackhat SEO?

    This post is going to be pretty unusual for me and may well annoy many people, but it's a lazy Sunday morning and, following a few conversations I have had this week, I thought I would explain the theory behind building a scalable and distributed link network.

    Sound useful? OK, settle in: this is going to be a long and rambling stream of thoughts.

    OK, so before we start, let's lay out why and what we are looking to achieve here. If you take a look at the link profiles of any competing set of sites, it's almost always obvious that if they compete on terms beyond the home page, they will be doing so through some manipulation of the anchor text pointing at their internal pages.

    To manipulate this internal anchor text you need to be able to control what text people use when linking into your site and this means we have to: -

    A: Find places that allow us to control the anchor text
    B: Buy links
    C: Have lots of sites under our control

    Now, sites that allow us to control the anchor text are either hard to come by or easy for any search algo to discount.
    Buying links is costly and frowned upon (Oh, grow up: it's Google's fault you have to do it, so get on with it.)
    Having lots of sites under your control is obviously the better option (we will do the maths as we go to show you that).

    So how would one go about building a network of sites that we can have under our control?

    Funny you should ask. I will share how I would do it (not that I do this sort of thing...)

    We DO NOT want any of these sites to be traced back to us in any way, either by the search engines' algorithms OR by the manual spam teams.

    We are going to be able to control everything about these sites from our central control panel.
    We are going to utilise very cheap hosting
    We are going to outsource the content creation

    Let's get started.
    We will first need to create a database on our central server.

    We need this database because we are going to send all the content out to the remote sites in our network as each page is called on the remote server. This will allow us to have complete control over hundreds or thousands of sites without having to log in and out or maintain CMSs across lots of remote hosting accounts.

    Here are the fields you'll need for each table.
    First, the table that holds our content for each page on a site:

    Pagemetadesc (optional)
    Live (y/n)

    This table holds our sites and lets us know when each one is due for renewal, and also what fake whois we used to register them with (come on, grow up).


    Live (y/n)


    OK you get the idea here – we are creating a database that will hold all our remote site details and also act as a database of the content on these sites as well as the links we will insert into them.
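    To make that concrete, here is a rough sketch of those two tables. Only the Pagemetadesc and Live fields come from the notes above; every other column name is my own guess at what you would want, not a fixed spec:

    ```sql
    -- Hypothetical schema sketch. Only pagemetadesc and live appear in the
    -- notes above; all other columns are illustrative guesses.
    CREATE TABLE pages (
        id            INT AUTO_INCREMENT PRIMARY KEY,
        site_id       INT NOT NULL,          -- which remote site this page belongs to
        pageurl       VARCHAR(255),          -- e.g. about_our_company.html
        pagetitle     VARCHAR(255),
        pagecontent   TEXT,
        pagemetadesc  VARCHAR(255),          -- optional
        live          CHAR(1) DEFAULT 'y'    -- y/n
    );

    CREATE TABLE sites (
        id            INT AUTO_INCREMENT PRIMARY KEY,
        domain        VARCHAR(255),
        registrar     VARCHAR(100),
        fake_whois    TEXT,                  -- the whois details used at registration
        renewal_date  DATE,                  -- when the domain is due for renewal
        template_id   INT,                   -- which template this site uses
        live          CHAR(1) DEFAULT 'y'    -- y/n
    );
    ```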

    So now what we need to do is to get some sites.
    We can buy some if you like, or we can register some. Bear in mind, though, that if we register them this becomes a longer-term plan than buying them.
    If you are registering them, then please also make sure you give fake info for the whois (ON EACH DOMAIN); we do not want the whole network traced back to us via who owns the damn things. It is fine to register using the same credit card, as this isn't public information (unless there is a court case one day, but that's not going to happen again…)

    We will also want to vary the registrar we use, but that's more for paranoia reasons than practicality.

    So let's assume you now have a bunch of domains we can use and you have added their details to the database. NB: the database will be held on our central control site (don't make this the same as your main domain; some site that is anonymous is great – think of it like a fake shop front on a gang boss's hideout).

    Now we need somewhere to host them. We are going to use very cheap hosting. This makes our cost base low but means we will have to be creative on how we serve the sites.

    Here's a search for cheap hosting on G

    Loads of companies with prices like £5 a year or £1 a month. Most of these will be fine for us.

    The minimum requirement is that the hosting deal we buy for each site must be capable of: -

    FTP access
    Ability to call a remote file via file_get_contents (we have other options but…)

    Most hosts will allow that lot though.

    Got that? OK we are ready to go!

    On each remote site we will upload one index.php file and a .htaccess file.
    The index.php file will be our ‘handler’ and will respond to and deal with every request the remote site gets. The .htaccess gives index.php the ability to do that.
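    For reference, a minimal .htaccess that does this (assuming Apache with mod_rewrite; the exact rules here are my sketch, not a file from the original system) might be:

    ```apache
    # Send every request that isn't a real file or directory to index.php,
    # which then acts as the handler for the whole site.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
    ```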

    The way the flow works is this: -
    A visitor comes to someremotesite1234.com
    They request someremotesite1234.com/about_our_company.html
    The .htaccess steps in and tells the server that index.php handles all requests for pages on this site.
    The index.php file takes the full URL and passes the domain and requested page to the central content server on our secret domain.

    Now the central site takes over and pulls from the central DB: -
    The content for this page, which it passes through the link functions on the central server that parse the content and add any links we specify centrally for this page.
    It also checks and finds the template that we use for the someremotesite1234.com domain and puts the content into this template.

    The central site then returns all this as a finished page to the remote site that serves the page to the user.
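    Put together, the whole remote-side handler can be tiny. The sketch below is one way to do it; the central hostname, the serve.php script name and the query parameter names are all illustrative assumptions, not the original code:

    ```php
    <?php
    // index.php - the only script uploaded to each remote site. Every request
    // is routed here by the .htaccess file; we forward it to the central
    // server and echo back whatever finished page comes back. The hostname,
    // script name and parameter names below are made up for illustration.

    // Build the URL on the central (secret) server that assembles the page.
    function central_url($centralHost, $domain, $page)
    {
        return 'http://' . $centralHost . '/serve.php'
             . '?domain=' . urlencode($domain)
             . '&page='   . urlencode($page);
    }

    // Only run the handler when we are actually serving a web request.
    if (isset($_SERVER['HTTP_HOST'])) {
        $domain = $_SERVER['HTTP_HOST'];                  // someremotesite1234.com
        $page   = isset($_GET['page']) ? $_GET['page'] : 'index.html';

        // Pull the fully templated, link-inserted page from the central server.
        $html = file_get_contents(central_url('our-secret-central-site.example', $domain, $page));

        if ($html === false) {
            header('HTTP/1.1 404 Not Found'); // fail quietly if central is down
        } else {
            echo $html;
        }
    }
    ```

    Keeping the remote file this dumb is the point: every new site is the same two-file upload, and all the logic lives centrally.
    
    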

    Phew – still here?
    That may sound very confusing, but it really isn't if you have been doing this type of thing for a while. Bear with me; we might have an opportunity later to learn it in more detail, but for now this is just an outline of the system.

    So you may have noticed that we passed the content for this page on the central server through some functions that added the links, if we had any.

    The way that works: the total string of content we had for the requested page is passed to a function that looks in the links table and checks for any text that matches a record in there.

    So on our example the about_our_company.html page may have a paragraph like: -

    At Johnson Brothers we have been serving the Coventry area for nearly 30 years, providing excellent customer service for anyone looking for holiday insurance. Contact us today if you would like us to arrange a quote.

    This gets passed to the links function, which looks to see if it has any matches.
    In the links table we have specified that if on this page we see “for holiday insurance. ” we replace it with the same phrase wrapped in our anchor link.

    This allows us to control all the content AND all the links for these remote sites through one central site.
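    The link-insertion step itself can be little more than a string replace driven by the links table. A sketch, with a hard-coded example row and a hypothetical target URL standing in for the real table lookup:

    ```php
    <?php
    // Sketch of the central link-insertion step. $links would really come
    // from the links table for this domain/page; here it is a hard-coded
    // example row, and the target URL is hypothetical.
    function insert_links($content, array $links)
    {
        // Each row maps a phrase already present in the content to the same
        // phrase with our anchor wrapped around the keyword we care about.
        foreach ($links as $find => $replace) {
            $content = str_replace($find, $replace, $content);
        }
        return $content;
    }

    $content = 'At Johnson Brothers we have been serving the Coventry area for '
             . 'nearly 30 years, providing excellent customer service for anyone '
             . 'looking for holiday insurance. Contact us today.';

    // Hypothetical links-table row for this page.
    $links = [
        'for holiday insurance. ' =>
        'for <a href="http://target-site.example/">holiday insurance</a>. ',
    ];

    echo insert_links($content, $links);
    ```

    Because the match string includes surrounding words, the anchor only lands where we intended it, not on every occurrence of the keyword.
    
    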

    The template system simply allows us to create the site templates and layouts on our central server, and therefore always upload only one standard index.php file to any new site we create.

    All that happens with the templates is that when the page is called, the matching template is pulled from the DB, and the template functions simply replace tokens such as @@content@@ with the content from the DB (the sidebar is just a loop through all the pages we have for this domain).

    This way we can have a unique template for every site (NOT JUST UNIQUE CSS FFS) and we can mimic all coding styles – so we can have sites that look like they are WordPress powered, sites that are table based, sites that are plain CSS.
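    The token replacement behind all this is equally simple. In this sketch the @@title@@, @@content@@ and @@sidebar@@ token names and the example template are my assumptions about the convention, not the original code:

    ```php
    <?php
    // Sketch of the central templating step: each site has its own HTML
    // shell stored in the DB, and we swap tokens for this page's data.
    // Token names (@@title@@, @@content@@, @@sidebar@@) are illustrative.
    function render_template($template, array $tokens)
    {
        foreach ($tokens as $name => $value) {
            $template = str_replace('@@' . $name . '@@', $value, $template);
        }
        return $template;
    }

    // A per-site template as it might be stored in the templates table.
    $template = '<html><head><title>@@title@@</title></head>'
              . '<body><div id="main">@@content@@</div>'
              . '<div id="side">@@sidebar@@</div></body></html>';

    echo render_template($template, [
        'title'   => 'About Our Company',
        'content' => '<p>At Johnson Brothers we have been serving...</p>',
        'sidebar' => '<ul><li><a href="/">Home</a></li></ul>', // really a loop over this domain's pages
    ]);
    ```
    
    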

    OK, so that's the real basics of how we serve the pages.

    How do we get the content?
    We outsource it, of course; we pay for paraphrased content.
    It's important to note that these sites aren't intended to rank well on their own, and we aren't creating content for traffic reasons. We are merely creating content for the purposes of having a real-looking organic site that will get indexed and can provide us with a link.

    So we might have sites that talk about hobbies, pets, holiday guides, fake engineering companies, fake organisations, etc.

    We will also link build to the network as we go, from sites NOT in the network itself.

    Do not create sites with the same whois
    Do not create sites on the same host
    Do not create sites too close to any other site
    Unique template on each domain
    Unique content on each page
    Do not interlink (unless you really know what you are doing)
    Do NOT leave footprints

    Footprints are any way in which the search teams or algo can tie your network together.
    Don’t put the same analytics on each domain (If you add Google analytics to any of these I will personally come round and shake you)
    Don’t re-use the same templates
    Don’t add the same structure or footer info

    Trust me, once you have had several thousand sites burnt all at once by Google's spam team, you'll learn that lesson.

    Don’t add lots of links to each site

    Bear in mind that it probably costs you somewhere north of £40 to place a paid link on a site, and often at this price you get something in a sidebar or footer, or added in a way that is easy for the search engines to detect and kill.

    If you control it all, you get links in the BODY of the content, on sites that aren't likely to be found easily and shouldn't be able to be tied together easily by the search teams.

    It's also worth noting that the cost of these links is approx: -
    Hosting / Year = £7.50
    Domain / Year = £7.00
    Content / Site = £15.00
    Updates and Management = £5.00

    £34.50 ish total per site

    I am thinking of arranging a chat session somewhere for this if anyone wants to learn it in more detail.

    Happy spamming.
