What is the best way to prevent Google from crawling / indexing certain pages?

Discussion in 'Black Hat SEO' started by tonyp, Aug 5, 2014.

  1. tonyp

    tonyp Regular Member

    Wondering what the best way to block certain pages is... I was thinking of loading them with AJAX, but I know Google is able to crawl / index some AJAX content.

    Maybe sending an X-Robots-Tag: noindex response header for those JS files?
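    Something like this is roughly what I mean (an untested sketch, with Flask only standing in for whatever backend the site actually runs, just to show where the header would go):

    Code:
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        # A normal page; this one stays indexable.
        return "<html><body><script src='/static/app.js'></script></body></html>"

    @app.after_request
    def add_noindex_to_js(response):
        # Any response whose path ends in .js (including Flask's built-in
        # /static/ route) gets an X-Robots-Tag: noindex header.
        if request.path.endswith(".js"):
            response.headers["X-Robots-Tag"] = "noindex"
        return response

    if __name__ == "__main__":
        app.run()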
     
  2. Apricot

    Apricot Administrator Staff Member Moderator

  3. satyawrat

    satyawrat Jr. VIP

    A robots.txt file? Or what Mr Apricot suggested above.
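    For example (just a sketch, and the Disallow paths are placeholders), you can sanity-check the rules with Python's built-in robotparser before relying on them:

    Code:
    from urllib.robotparser import RobotFileParser

    # Placeholder rules; swap in whatever paths should be off-limits.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "Disallow: /ajax/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # Quick check that Googlebot is blocked where intended.
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
    print(rp.can_fetch("Googlebot", "https://example.com/"))                   # True

    Keep in mind robots.txt only stops crawling; a URL that is linked from elsewhere can still show up in the index without a snippet.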
     
  4. tonyp

    tonyp Regular Member

    But what if the pages are loaded with AJAX? Then they don't have a physical URL I can block via meta tags or robots.txt.
     
  5. SEO Power

    SEO Power Elite Member

    Then create physical URLs and stop using AJAX. Besides, if they don't have actual URLs, I don't think Google will be able to find and index them. Problem solved.
     
  6. tonyp

    tonyp Regular Member

    AJAX is needed for this specific site.
    According to this, http://www.klikki.com/blog/google-able-index-content-fetched-using-ajax , Google can index content that is fetched via AJAX (Google can process the relevant JS).

    Does sending X-Robots-Tag: noindex on the JS files make sense?
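
    If I go that route, this is roughly what I have in mind: tag not just the .js file but also the AJAX response itself, since that is the content Google would render. Untested sketch again, Flask is just a stand-in and /ajax/content is a made-up endpoint:

    Code:
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/ajax/content")
    def ajax_content():
        # The fragment the page pulls in via AJAX, served with a noindex header.
        response = jsonify({"html": "<p>Content loaded via AJAX</p>"})
        response.headers["X-Robots-Tag"] = "noindex"
        return response

    if __name__ == "__main__":
        app.run()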