
About using canonical and robots.txt block

Discussion in 'White Hat SEO' started by dontgo, Jun 9, 2017.

  dontgo (Newbie)
    Hi again, gurus.

    This time I want to ask: when, or under what circumstances, should I use canonical tags for duplicate pages, and when should I use robots.txt to block them instead?

    As an example, assume I have two versions of a URL:
    A: /shop/page***
    B: /sale/page***

    The pages have nearly identical content, and A is the main version I want indexed.
    So should I use rel="canonical" on B pointing to A, or should I simply block /sale/ in robots.txt?
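    For reference, here is a minimal sketch of what each option looks like. The domain example.com and the page name page123 are hypothetical placeholders, not my real URLs.

    The canonical approach goes in the HTML head of the duplicate page (B), pointing search engines at the preferred version (A):

    ```html
    <!-- On /sale/page123 (hypothetical), declare /shop/page123 as the preferred version -->
    <link rel="canonical" href="https://example.com/shop/page123" />
    ```

    The robots.txt approach blocks crawling of the whole /sale/ path instead:

    ```
    # robots.txt at the site root — stops compliant crawlers from fetching /sale/ pages
    User-agent: *
    Disallow: /sale/
    ```
    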

    I'm not sure I've expressed this clearly enough, so please forgive me if not.

    Thanks in advance