
Should I block this in robots.txt?

Discussion in 'White Hat SEO' started by DO067, Oct 26, 2016.

  1. DO067

    DO067 Registered Member

    Joined:
    May 4, 2016
    Messages:
    60
    Likes Received:
    2
    Hi there,

    I've started an audit of the new version of a site for one of my customers, and the tool reports a lot of errors for pages such as http://www.mywebsite.fr/advancedmedia/index/ajax?id=124/

    The errors are mainly duplicated titles or empty metas, which is not so important for such URLs in my opinion.

    Do you think I should block those URLs via the robots.txt file?
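    If it helps, here is the kind of rule I had in mind, as a sketch only: it assumes all of those AJAX endpoints live under the /advancedmedia/index/ajax path shown in my example URL.

    ```
    # Hypothetical robots.txt rule; the path is taken from my example URL above
    User-agent: *
    Disallow: /advancedmedia/index/ajax
    ```

    Note that robots.txt only stops crawling; URLs that are already indexed or linked elsewhere can still show up in search results.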

    Thanks
     
  2. altec

    altec Regular Member Premium Member

    Joined:
    May 31, 2013
    Messages:
    328
    Likes Received:
    55
    Why don't you just remove them if they are useless?
     
  3. DO067

    DO067 Registered Member

    Joined:
    May 4, 2016
    Messages:
    60
    Likes Received:
    2
    I don't really know whether they are useless. Any idea how to check that?
    FYI, I am only responsible for the SEO of the site, and I don't have any contact with the agency that made the new site.
     
  4. growsocialworld

    growsocialworld Junior Member

    Joined:
    Jun 2, 2012
    Messages:
    179
    Likes Received:
    29
    Gender:
    Male
    Location:
    Still Searching
    You just block the bots from crawling...
     
  5. DO067

    DO067 Registered Member

    Joined:
    May 4, 2016
    Messages:
    60
    Likes Received:
    2
    Is that what you suggest, or is it irony?
     
  6. growsocialworld

    growsocialworld Junior Member

    Joined:
    Jun 2, 2012
    Messages:
    179
    Likes Received:
    29
    Gender:
    Male
    Location:
    Still Searching
    You can block it....
     
  7. webspero

    webspero Newbie

    Joined:
    Sep 6, 2016
    Messages:
    33
    Likes Received:
    1
    Gender:
    Male
    Occupation:
    Development
    Location:
    Chandigarh, India
    If those pages are of no use, you can remove them; if they are of use, you can restrict the crawler using the robots.txt file.
     
  8. Juneja

    Juneja Supreme Member

    Joined:
    Jun 12, 2016
    Messages:
    1,441
    Likes Received:
    191
    Gender:
    Male
    Remove them if they are of no use.
     
  9. DO067

    DO067 Registered Member

    Joined:
    May 4, 2016
    Messages:
    60
    Likes Received:
    2
    Thanks all
     
  10. Neoterz

    Neoterz Jr. VIP Jr. VIP

    Joined:
    Jun 25, 2015
    Messages:
    1,134
    Likes Received:
    195
    Gender:
    Male
    Occupation:
    SEO - Freelancer
    Location:
    At where you in
    I think your link is not an SEO-friendly one, because it has reserved characters like ? and =. They may cause problems with the links, so it's better to change the link structure to an SEO-friendly one.
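    For example, if the site runs on Apache with mod_rewrite, a clean path could be mapped onto that query-string URL. This is just a sketch; the /media/ prefix is made up for illustration and would need to match the site's real setup.

    ```
    # Hypothetical Apache rewrite: expose a clean URL like /media/124/
    # and internally serve the existing ajax endpoint
    RewriteEngine On
    RewriteRule ^media/([0-9]+)/?$ /advancedmedia/index/ajax?id=$1 [L,QSA]
    ```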