
I lost 80 pages from Google

Discussion in 'White Hat SEO' started by bleach, Feb 17, 2009.

  1. bleach

    bleach Senior Member

    Joined:
    Oct 12, 2008
    Messages:
    934
    Likes Received:
    82
    Location:
    New York
    hey, I want to ask.
    My site had around 3,000 pages indexed, but I lost about 80 of them.
    Now I have around 2,900 indexed pages.
    Is that okay?
     
  2. mightybh

    mightybh Jr. VIP Premium Member

    Joined:
    Feb 27, 2008
    Messages:
    1,029
    Likes Received:
    1,714
    Occupation:
    CEO
    Location:
    UK
    You really need to find better things to do lol

    It's normal. This number fluctuates all the time. Don't pay any attention to it unless you get completely de-indexed.
     
    • Thanks x 1
  3. bleach

    bleach Senior Member

    Joined:
    Oct 12, 2008
    Messages:
    934
    Likes Received:
    82
    Location:
    New York
    thanks anyway :D
     
  4. secretboy08

    secretboy08 Jr. VIP

    Joined:
    Apr 2, 2008
    Messages:
    2,369
    Likes Received:
    877
    Occupation:
    Internet marketer
    Location:
    Global Citizen
  5. ipopbb

    ipopbb Power Member

    Joined:
    Feb 24, 2008
    Messages:
    626
    Likes Received:
    844
    Occupation:
    SEO & Innovative Programming
    Location:
    Seattle
    Home Page:
    Make sure your site is working fine... One time I introduced a bug in my web app that fubared navigation for bots, and I saw a slow but steady decline in indexed pages. It was a huge site, but I learned a lot from it... One lesson is that it takes 24 hours to get indexed, but it can take weeks for those URLs to be updated in the SERPs and months for a site to organically go away. Up and down a little is normal; down slowly and consistently over time is a sign that Googlebot is having trouble with your site.
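
    A quick way to catch that kind of bot-breaking navigation bug is to crawl your own site the way a bot would and flag anything that doesn't come back 200. This is just a minimal Python sketch -- the https://example.com start URL and the 200-page cap are placeholders, swap in your own domain:

        # Fetch pages with Googlebot's user agent, follow same-site links,
        # and report any URL that errors out, so navigation bugs surface
        # before the indexed-page count starts sliding.
        from html.parser import HTMLParser
        from urllib.parse import urljoin, urlparse
        from urllib.request import Request, urlopen

        START = "https://example.com/"   # placeholder: your site here
        UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
        MAX_PAGES = 200                  # keep the test crawl small

        class LinkParser(HTMLParser):
            def __init__(self):
                super().__init__()
                self.links = []
            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        seen, queue, errors = set(), [START], []
        while queue and len(seen) < MAX_PAGES:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                resp = urlopen(Request(url, headers={"User-Agent": UA}), timeout=10)
                html = resp.read().decode("utf-8", errors="replace")
            except Exception as exc:     # 4xx/5xx, timeouts, bad TLS, etc.
                errors.append((url, str(exc)))
                continue
            parser = LinkParser()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                # stay on the same host so the crawl doesn't wander off-site
                if urlparse(absolute).netloc == urlparse(START).netloc:
                    queue.append(absolute.split("#")[0])

        print(f"crawled {len(seen)} pages, {len(errors)} errors")
        for url, err in errors:
            print(" ", url, "->", err)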

    Another issue is that if you introduced anything that makes your page load times slower, Googlebot will likely crawl less of your site with each visit. That could cause a one-time noticeable drop in indexed pages, and then it will stabilize once it finds equilibrium again.

    If you are creating a large app in PHP, JSP, ASP, etc., benchmark everything and stay up to date on performance tuning for your language of choice. With careful development I was able to get about 250K pages indexed in 4 weeks, but you really have to measure everything and tune away the milliseconds to convince Googlebot that the sky is the limit on crawl speed... after all, it wants to be polite and not turn into a denial-of-service attack. It knows your page load times, but do you?
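
    If you want actual numbers on your page load times, even something as simple as this little Python timing loop will do (the URL list is illustrative -- point it at your own heaviest pages):

        # Request each URL a few times and report average / worst response
        # time in milliseconds. Slow pages here mean a slower, shallower crawl.
        import time
        from urllib.request import Request, urlopen

        URLS = [
            "https://example.com/",                    # placeholders: use your own pages
            "https://example.com/category/widgets",
            "https://example.com/product/123",
        ]
        RUNS = 5

        for url in URLS:
            samples = []
            for _ in range(RUNS):
                start = time.perf_counter()
                urlopen(Request(url, headers={"User-Agent": "load-check"}), timeout=10).read()
                samples.append((time.perf_counter() - start) * 1000.0)
            print(f"{url}: avg {sum(samples) / len(samples):.0f} ms, worst {max(samples):.0f} ms")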

    Hope that is helpful.
     
    • Thanks x 1
  6. ipopbb

    ipopbb Power Member

    Joined:
    Feb 24, 2008
    Messages:
    626
    Likes Received:
    844
    Occupation:
    SEO & Innovative Programming
    Location:
    Seattle
    Home Page:
    One more thought... if it isn't too "black hat" for you, register the site in Google's Webmaster Tools. It shows you Googlebot's error output if the bot is running into errors or slow page load times on your site.
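
    If you'd rather not rely only on Webmaster Tools, much of the same information is sitting in your server access logs. Here's a rough Python sketch that tallies Googlebot responses from a combined-format log -- the log path and regex are assumptions, adjust them for your server:

        # Count Googlebot requests by HTTP status and list the URLs that
        # returned 4xx/5xx errors.
        import re
        from collections import Counter

        LOG_PATH = "/var/log/nginx/access.log"   # hypothetical path
        LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<url>\S+) [^"]*" (?P<status>\d{3})')

        status_counts = Counter()
        error_urls = Counter()

        with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
            for line in log:
                if "Googlebot" not in line:
                    continue
                match = LINE.search(line)
                if not match:
                    continue
                status = match.group("status")
                status_counts[status] += 1
                if status.startswith(("4", "5")):
                    error_urls[match.group("url")] += 1

        print("Googlebot responses by status:", dict(status_counts))
        print("Most common error URLs:")
        for url, hits in error_urls.most_common(10):
            print(f"  {hits:>4}  {url}")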