
google crawling error: losing 100 bucks a day

Discussion in 'HTML & JavaScript' started by eforestal55, Dec 28, 2009.

  1. eforestal55

    eforestal55 Registered Member

    Joined:
    Oct 21, 2009
    Messages:
    96
    Likes Received:
    1
    I hope someone can help me; I'm losing cash out of my behind. Okay, I haven't made any major changes to my site, and suddenly Google stopped crawling it. I submitted a sitemap and this is the error I get. The problem is, I don't have a robots.txt file, so I don't know how they're finding one. Here's the error message:

    Network unreachable: robots.txt unreachable
    We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.
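
    In case it helps, this is the kind of quick check I can run to see what a crawler actually gets back (rough Python sketch; example.com just stands in for my real domain):

    Code:
    # Rough sketch: fetch robots.txt the way a crawler would and show what comes back.
    # "example.com" is a placeholder -- swap in the real domain.
    import urllib.request

    url = "http://example.com/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)                # 200 means it is reachable
            print(resp.read().decode(errors="replace"))  # whatever the server actually serves
    except Exception as err:                             # 404, timeout, DNS failure, firewall block, ...
        print(url, "-> failed:", err)

    If that hangs or fails when run from outside my network, that points at the host blocking the request rather than anything on Google's end.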
     
  2. showboytridin

    showboytridin Regular Member

    Joined:
    Sep 5, 2009
    Messages:
    348
    Likes Received:
    714
    Location:
    127.0.0.1
    I had that error before. After a lot of research, the solution I found was to change hosts.
     
  3. cagefighter

    cagefighter Junior Member

    Joined:
    Nov 13, 2009
    Messages:
    109
    Likes Received:
    101
    Occupation:
    Retired Professional Fighter, Now Trainer and Web
    Location:
    New York/Atlanta
    So I'm guessing you're using some sort of FTP program like Ipswitch?

    And when you're checking your root folder, you're not seeing the robots.txt file?

    Can you access your .htaccess file?

    "I submitted a sitemap" do you mean you just uploaded a sitemap.php page or did you submit with google sitemap?
     
  4. eforestal55

    eforestal55 Registered Member

    Joined:
    Oct 21, 2009
    Messages:
    96
    Likes Received:
    1
    Submitted the sitemap to Google.
     
  5. SpiderWebMaster

    SpiderWebMaster Power Member

    Joined:
    Jan 24, 2009
    Messages:
    617
    Likes Received:
    519
    Occupation:
    I don't have a job...
    Location:
    /dev/null
    This might be a stupid question or not... are you sure the permissions on your robots.txt let it be read by everybody? Just to be sure, try to CHMOD it to 777, then take permissions away gradually while checking in Google Webmaster Tools whether it's still readable. If it's not readable even with 777 permissions, then I'd say that's the strangest thing I've ever heard of.
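
    Something like this rough sketch is what I mean; the path and domain are placeholders, and it checks over plain HTTP instead of waiting for Webmaster Tools to recrawl (run it on the server if you have shell access, otherwise do the CHMOD in your FTP client and just re-run the HTTP part):

    Code:
    # Rough sketch: loosen robots.txt to 777, then tighten it step by step,
    # checking after each change that it can still be fetched over HTTP.
    # The file path and domain are placeholders -- use your own.
    import os
    import urllib.request

    ROBOTS_PATH = "/home/youruser/public_html/robots.txt"   # placeholder server path
    ROBOTS_URL = "http://example.com/robots.txt"             # placeholder domain

    for mode in (0o777, 0o755, 0o644, 0o600):
        os.chmod(ROBOTS_PATH, mode)
        try:
            with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
                print(oct(mode), "-> HTTP", resp.status)     # still readable at this mode
        except Exception as err:
            # 600 may break it if the web server runs as a different user
            print(oct(mode), "-> not readable:", err)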

    If it's a dumb suggestion I'm making, please just ignore it.

    :D