
Massive Link Farm...should I use robots.txt to prevent google?

Discussion in 'Black Hat SEO' started by turokman555, Jan 21, 2010.

  1. turokman555

    turokman555 Newbie

    Jan 8, 2010

    I have a new site with a resources page full of links. I wrote some custom software that automatically exchanges links with servers on different IP addresses.

    Suddenly I have thousands of links pointing to me (and from me) in just a few days. Obviously the kinks still need to be worked out, but do you think I can use robots.txt to block my outgoing links page, to reduce the risk of Google deciding my resources page is part of a link farm?
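    For reference, blocking a single page from crawlers in robots.txt would look something like this (the /resources.html path is just an example; swap in whatever your links page is actually called):

    ```
    User-agent: *
    Disallow: /resources.html
    ```

    Note this only asks crawlers not to fetch the page; it doesn't stop the URL itself from being indexed if other sites link to it.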
  2. seorebel

    seorebel Junior Member

    Aug 15, 2008
    Nope... robots.txt is only guidance for Google, and it can take them a while to adjust their indexing of a site based on it.

    How about doing some cloaking instead, and releasing x amount of links per day?
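    The "release x links per day" idea above could be sketched roughly like this: only render the first N links from the full exchange list, where N grows by a fixed amount each day since launch. This is just an illustrative sketch, not anyone's actual implementation; the LAUNCH date, PER_DAY rate, and the example URLs are all made up.

    ```python
    # Drip-feed sketch: reveal PER_DAY more links for each day since LAUNCH.
    # All names and values here are hypothetical.
    from datetime import date

    LAUNCH = date(2010, 1, 21)   # assumed go-live date
    PER_DAY = 5                  # links revealed per day

    def visible_links(all_links, today=None):
        """Return the slice of links that should be shown as of `today`."""
        today = today or date.today()
        days = max((today - LAUNCH).days, 0)
        return all_links[: days * PER_DAY]

    links = [f"http://example{i}.com" for i in range(100)]
    # 4 days after launch at 5 links/day -> 20 links visible
    print(len(visible_links(links, date(2010, 1, 25))))
    ```

    The page template would then loop over visible_links() instead of the full list, so the count of outgoing links ramps up gradually rather than appearing all at once.
    
    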