
Why do backlink crawlers give a unique user agent to websites?

Discussion in 'White Hat SEO' started by jamie3000, Jul 23, 2015.

  1. jamie3000

    jamie3000 Supreme Member

    Joined:
    Jun 30, 2014
    Messages:
    1,409
    Likes Received:
    652
    Occupation:
    Finance coder looking for semi-retirement
    Location:
    uk
    Ok, so this might be a dumb question, but why do Ahrefs, Moz, etc. tell every website they crawl that they're a bot by sending a unique user agent that can then be blocked? Is there some sort of legal requirement to do this or something? If one of them stopped doing it and just sent a normal IE 10 user agent, wouldn't that mean everyone's nice hidden PBNs get busted?
     
  2. Shree

    Shree Jr. VIP

    Joined:
    Jan 5, 2014
    Messages:
    508
    Likes Received:
    178
    They can use fake user agents as well. A user agent is just a string a piece of software uses to identify itself, and spam bots fake theirs all the time. Bots are also supposed to follow a site's robots.txt - but not all do. Many bots are coded to go exactly where they're not supposed to, because that's where the juicy stuff is.
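
    For example, here's a minimal sketch (Python with the requests library; the URL, the user agent string, and the "MyCrawler" name are just placeholders) showing how trivially a crawler can present a browser-style user agent, and how a well-behaved one would check robots.txt first:

        import requests
        from urllib.robotparser import RobotFileParser

        # The user agent is nothing more than a header string the client chooses;
        # here we claim to be IE 10 instead of announcing ourselves as a crawler.
        fake_headers = {
            "User-Agent": "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; Trident/6.0)"
        }
        response = requests.get("http://example.com/", headers=fake_headers)
        print(response.status_code)

        # A well-behaved bot would instead check robots.txt under its real name first.
        rp = RobotFileParser("http://example.com/robots.txt")
        rp.read()
        print(rp.can_fetch("MyCrawler", "http://example.com/some-page"))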

    I don't know of any legal requirement that every piece of software has to identify itself on a network.
     
    Last edited: Jul 23, 2015
  3. itz_styx

    itz_styx Power Member

    Joined:
    May 8, 2012
    Messages:
    669
    Likes Received:
    343
    Occupation:
    CEO / Admin / Developer
    Location:
    /dev/mem
    Home Page:
    There are people closely watching what spider bots do. If you want to stay a "friendly" bot, you have to obey the rules; otherwise the bot gets labelled as "bad" and put on blacklists. Of course Ahrefs etc. want to avoid that, so they follow the rules by identifying themselves and giving webmasters the opportunity to opt out (block them).
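
    The standard opt-out is a couple of lines in the site's robots.txt. Ahrefs, for instance, publishes "AhrefsBot" as its crawler's user agent token, so a webmaster who wants it gone can add something like:

        User-agent: AhrefsBot
        Disallow: /

    A friendly bot honours that; a bot that ignores it is exactly the kind that ends up blacklisted.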
     
    • Thanks x 1