Why do backlink crawlers give a unique user agent to websites?

Discussion in 'White Hat SEO' started by jamie3000, Jul 23, 2015.

  1. jamie3000

    jamie3000 Elite Member Premium Member

    Joined:
    Jun 30, 2014
    Messages:
    1,958
    Likes Received:
    915
    Occupation:
    Owner of BigGuestPosting.com
    Location:
    UK
    Ok so this might be a dumb question, but why do Ahrefs, Moz etc. tell every website they crawl that they're a bot by sending a unique user agent that can then be blocked? Is there some sort of legal requirement to do this or something? If one of them stopped doing it and just sent a normal IE 10 user agent, wouldn't everyone's nice hidden PBNs be busted?
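
    For reference, this is roughly what blocking them looks like on the webmaster's side. robots.txt is only a polite request that well-behaved bots honor; the .htaccess rule actually refuses the request. The bot names are the publicly documented ones for Ahrefs, Moz and Majestic, not a complete list:

        # robots.txt - polite opt-out, honored only by well-behaved crawlers
        User-agent: AhrefsBot
        Disallow: /

        User-agent: rogerbot
        Disallow: /

        # .htaccess (Apache mod_rewrite) - hard block on the user-agent string
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|rogerbot|MJ12bot) [NC]
        RewriteRule .* - [F,L]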
     
  2. Shree

    Shree Jr. VIP

    Joined:
    Jan 5, 2014
    Messages:
    512
    Likes Received:
    186
    They can use fake user agents as well. A user agent is just a string that a piece of software uses to identify itself, and spam bots fake theirs all the time. Bots are also supposed to follow a site's robots.txt, but not all do - many are coded to go exactly where they're not supposed to, because that's where the juicy stuff is (see the sketch below).

    I don't know of any legal requirement that mandates every piece of software identify itself on a network.
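
    To show how trivial faking it is, here's a minimal sketch in Python - the site URL is a placeholder, and the user agent is the real IE 10 string mentioned above:

        # Check robots.txt the way a polite crawler would, then fetch the page
        # with a spoofed browser user agent anyway. SITE is a placeholder.
        import urllib.robotparser
        import urllib.request

        SITE = "http://example.com"
        FAKE_UA = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0)"

        # What a well-behaved bot does first:
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url(SITE + "/robots.txt")
        rp.read()
        print("Allowed for MyCrawlerBot:", rp.can_fetch("MyCrawlerBot", SITE + "/"))

        # What a misbehaving bot does: ignore the answer and lie about who it is.
        req = urllib.request.Request(SITE + "/", headers={"User-Agent": FAKE_UA})
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.headers.get("Content-Type"))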
     
    Last edited: Jul 23, 2015
  3. itz_styx

    itz_styx Power Member

    Joined:
    May 8, 2012
    Messages:
    684
    Likes Received:
    357
    Occupation:
    CEO / Admin / Developer
    Location:
    /dev/mem
    There are people closely watching what spider bots do. If you want to stay a "friendly" bot, you have to obey the rules; otherwise your bot gets labeled "bad" and put on blacklists. Of course Ahrefs etc. want to avoid that, so they follow the rules by identifying themselves and giving webmasters the opportunity to opt out (block them).
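
    For what it's worth, a common way webmasters catch the "bad" ones is a honeypot: disallow a trap URL in robots.txt, then blacklist any client that requests it anyway. A minimal sketch in Python - the trap path and the access log location are assumptions:

        # Any client requesting /bot-trap/ has ignored robots.txt and can be
        # blacklisted. Assumes Apache/nginx combined log format, where the
        # client IP is the first field of each line.
        TRAP_PATH = "/bot-trap/"  # hypothetical path, disallowed in robots.txt

        def offending_ips(access_log_path):
            """Collect client IPs that requested the disallowed trap path."""
            offenders = set()
            with open(access_log_path) as log:
                for line in log:
                    if TRAP_PATH in line:
                        offenders.add(line.split()[0])
            return offenders

        if __name__ == "__main__":
            for ip in sorted(offending_ips("access.log")):
                print(ip)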
     
    • Thanks x 1