
Importance of Google Index Footprint

Discussion in 'White Hat SEO' started by kakakhase, Mar 22, 2011.

  1. kakakhase

    kakakhase Newbie

    Joined:
    Mar 22, 2011
    Messages:
    4
    Likes Received:
    0
    Hi,

    The Google index footprint of my site has been dropping for the last 3 months, then this month it went back up again. Any thoughts why?

    I would also like to ask whether the Google index footprint is still important and relevant in today's SEO world, i.e. type site:yourdomain.com into Google and the footprint is the number of your site's pages that Google has indexed. What are your thoughts and advice?

    Thank you.

    A Newbie
     
  2. Autumn

    Autumn Elite Member

    Joined:
    Nov 18, 2010
    Messages:
    2,197
    Likes Received:
    3,041
    Occupation:
    I figure out ways to make money online and then au
    Location:
    Spamville
    If your number of pages indexed is continually going down then you definitely have some problems - it could be dupe content, shitty content, poor internal linking or just not enough external backlinks to your content pages. Obviously a page that's not indexed can't get traffic, so you generally want as many good content pages indexed as you can.

    FYI, normally when people talk about "footprints" they are talking about easily identifiable factors shared between sites that make it easy to identify them as belonging to the same network or owner, or as having been generated by the same scripts etc. Not many people use the word "footprint" to refer to the number of a site's pages that are indexed.
     
  3. kakakhase

    kakakhase Newbie

    Thank you Autumn!! :cool:

    Yes, I understand what you mean about using the word "footprint" - it could lead to misunderstandings, especially with Google's new algorithm cleaning up sites that share duplicate footprints. Thank you for your thoughts.

    Mine is a legit directory site, but there might be duplicated content, i.e. Page A is a brief write-up and Page B is a more detailed write-up of the same entry, with about 30% similar content.

    Can I ask another newbie question: how does G check for duplicate content? The same paragraph repeating? Or the old way our essays were checked in school - 7 of the same words in a row and you're busted!! :p Thank you!
     
  4. kakakhase

    kakakhase Newbie

    Now reading this forum's threads on duplicate content - very interesting!!
    Are there any hard rules or percentages for how many similar words count as duplicate content?

    Thank you again Seniors!!
     
    Last edited: Mar 22, 2011
  5. Autumn

    Autumn Elite Member

    It's quite a common situation for hierarchical directory style sites to drop the really deep pages if they don't have enough backlinks. Even though your page Bs have more content, G will be dropping them because the As get more link juice as it trickles down from the home page.

    I would strongly consider getting rid of your page Bs and consolidating their content into your As. 301 redirect the Bs to the relevant As so any link juice they have gets transferred to the new and improved As. Decent backlinks to your content pages (not just your home page) are also essential.
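    If you go the consolidation route, the redirect itself is a one-liner per page. A minimal Apache .htaccess sketch - the domain and paths here are placeholders, not your actual URLs:

```apache
# .htaccess - permanently (301) redirect each thin "page B" to its
# consolidated "page A" so any link juice the B had passes along.
# Uses mod_alias's Redirect directive; swap in your real paths.
Redirect 301 /listings/widget-detail.html http://www.example.com/listings/widget.html
```

    One line like this per retired page is enough; Google treats a 301 as a permanent move and, over time, drops the old URL in favor of the new one.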

    Consider making a sitemap to help G spider your whole site, and make sure you've got thorough internal linking so every page gets found and gets a share of the link juice.
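    For the sitemap, the format G accepts is a plain XML file you submit in Webmaster Tools or list in robots.txt. A minimal sitemap.xml sketch (domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per content page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/listings/widget.html</loc>
  </url>
</urlset>
```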

    Dupe content isn't the death sentence that some people make it out to be, but it doesn't help matters either. Avoiding it really starts with your site design - plan for unique pages right from the start. All things considered, a smaller number of unique pages can be beneficial: if you go out and get 100 backlinks, you will get more bang for your buck spreading them over 10 pages than over 100.

    No one outside Google really knows how G measures or detects dupe content, and you will see thresholds anywhere from 10-80% bandied around. However, with enough backlinks you can still get good traffic to the shittiest scraped or mashed-up content. As for actual detection, it's probably something like word n-grams, but no one knows for sure.
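    To make the n-gram idea concrete: the standard textbook approximation is "shingling" - break each page into overlapping word n-grams and compare the sets. A rough sketch in Python (the function names and the 7-word window are my own illustration, echoing the "7 same words" guess above - Google's real method is unknown):

```python
def shingles(text, n=7):
    """Split text into the set of overlapping n-word 'shingles' (word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=7):
    """Jaccard similarity of two texts' shingle sets: 0.0 (no overlap) to 1.0 (identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

    Under a scheme like this, a page B that copies 30% of page A's sentences verbatim would share a chunk of its shingles with A and score well above zero, while a genuine rewrite in different words would score near zero - which is roughly why spinning/rewriting beats straight copying for dupe detection.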
     
    • Thanks x 1
  6. kakakhase

    kakakhase Newbie

    Thank you Autumn!! Have just thanked you.

    BTW, your profile pic is nice and very alluring!