Implications of crawl delay

Discussion in 'White Hat SEO' started by Wardy, Aug 22, 2016.

  1. Wardy

    Wardy Newbie

    Joined:
    Aug 22, 2016
    Messages:
    1
    Likes Received:
    0
    Hi, I am planning on adding a crawl delay of 2 seconds to my site's robots.txt and registering with Search Console, as I am seeing a number of crawl errors related to timeouts.
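
    For reference, the directive I have in mind would look like this (a minimal sketch; the Crawl-delay value is in seconds, and support varies by crawler):

    ```
    # robots.txt — ask compliant crawlers to wait 2 seconds between requests
    User-agent: *
    Crawl-delay: 2
    ```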

    Are there any implications on SEO by doing this?

    Is there anything else I need to be aware of?

    Thanks.
     
  2. validseo

    validseo Jr. VIP Jr. VIP Premium Member

    Joined:
    Jul 17, 2013
    Messages:
    910
    Likes Received:
    527
    Occupation:
    Professional SEO
    Location:
    Seattle, Wa
    Google and others don't follow that directive. If you want to slow Googlebot down you have to do it from inside GSC (Google Search Console).

    If your website is timing out from bot activity you need to spend the next several weeks performance tuning your website. If your average page load times are generally very fast but you are seeing periodic bursts of 500 errors in the GSC crawl errors, then you might be getting a Slowloris negative SEO attack: http://www.seoclarity.net/the-secret-world-of-negative-seo-13244/
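
    One rough way to spot those bursts outside GSC is to bucket 5xx responses from your access log by minute. A sketch, assuming a common combined-log-format timestamp like `[22/Aug/2016:10:15:32 +0000]` (your log format and threshold will differ):

    ```python
    from collections import Counter
    import re

    # Matches the date+hh:mm prefix of the timestamp and the HTTP status code.
    LINE = re.compile(r'\[(\d+/\w+/\d+:\d+:\d+):\d+ .*?\] ".*?" (\d{3}) ')

    def burst_minutes(lines, threshold=50):
        """Return {minute: count} for minutes whose 5xx count meets the threshold."""
        counts = Counter()
        for line in lines:
            m = LINE.search(line)
            if m and m.group(2).startswith('5'):
                counts[m.group(1)] += 1  # key is e.g. '22/Aug/2016:10:15'
        return {minute: n for minute, n in counts.items() if n >= threshold}
    ```

    Steady, isolated 500s usually point at a bug; tight clusters in a few minutes are more consistent with a hammering bot or a Slowloris-style attack.
    
    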

    BingBot is a very inefficient bot... I HIGHLY recommend dialing back bing's crawl rate and setting it to crawl in off hours in Bing Webmaster Tools.

    A large number of bots are for SEO platform tools... block the shit out of those... that's just intel for your competitors... the SEO bots never deliver customers...
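
    Unlike Crawl-delay, the big SEO crawlers do honor Disallow rules. A sketch of the blocks (these are commonly seen SEO-tool user agents; check your own access logs for which bots are actually hitting you before relying on this list):

    ```
    # robots.txt — block common SEO-platform crawlers
    User-agent: AhrefsBot
    Disallow: /

    User-agent: SemrushBot
    Disallow: /

    User-agent: MJ12bot
    Disallow: /
    ```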