
multi-language website - question about robots.txt & SEO

Discussion in 'White Hat SEO' started by joegodfrey86, Jun 2, 2012.

  1. joegodfrey86

    joegodfrey86 Newbie

    Joined: Aug 23, 2011
    Messages: 48
    Likes Received: 22
    Right, I have a website that is in 7 languages.

    I have submitted each language to Webmaster Tools as a separate site, geographically targeted to its country, except the root, which is English and targeted worldwide.

    Each site is hxxp://-------------/en/, hxxp://-------------/bg/, etc.

    I was trying to block Google from viewing the directories for all the languages except English, so that I can track each language's results separately.

    I figured a separate robots.txt in each directory would work, but the robots.txt from the root domain took over for all of them.
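
    From reading around, it sounds like crawlers only honour the robots.txt at the root of the host, which would explain why my per-directory files got ignored. So I'm guessing all the rules have to live in the one root file, something like the sketch below - only /bg/ is from my actual setup, and /xx/ and /yy/ are just made-up placeholders for the rest of my language folders:

        User-agent: Googlebot
        # block each non-English language folder (example codes only)
        Disallow: /bg/
        Disallow: /xx/
        Disallow: /yy/
        # the English pages at /en/ and the root stay crawlable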


    Any suggestions on how to do this, or another alternative?

    I think I should have set them up as subdomains, but this would require a lot of effort to change, as I designed the site fully myself and there are 120+ pages - no WordPress :(
     
  2. kimomalcolmx

    kimomalcolmx Regular Member

    Joined: Apr 11, 2011
    Messages: 271
    Likes Received: 77