Blocking Googlebot from images folder

Discussion in 'White Hat SEO' started by droona, Dec 2, 2014.

  1. droona

    droona Newbie

    Joined:
    Apr 5, 2011
    Messages:
    37
    Likes Received:
    10
    Hello,

    I would like some advice from the experts on whether the following setup will raise flags or affect ranking.

    I'm about to publish 6 sites, in the same niche, with this setup:

    - Somewhat different HTML / template (basically the same template, with the HTML moved around, changed, etc.)
    - Different CSS
    - Similar JS (jquery, etc)
    - Different top Image Header
    - Different Content
    - Similar sections
    - Identical Images
    - Different Image filenames
    - Different Internal URLs
    - Different Title, Meta Description tags
    - No Meta Keywords tags (is this still necessary?)
    - Different ALT, TITLE Img Tags

    The only flag I see is using the same images across the 6 sites. I was thinking of saving the images at a different compression level, which technically would create "different" image files. But I use JPG, GIF and PNG, so that would only work for the JPGs; I would still need to go and change the GIFs and PNGs by hand, and we are talking around 200 images, which would mean creating 1,200 files. Tons of work. Plus I very much like the ones I created.
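
    For what it's worth, the JPG part at least would be scriptable rather than manual work. A rough sketch, assuming Python with Pillow installed; the folder names are hypothetical and would need adjusting:

        import os
        from PIL import Image

        SRC = "images"        # hypothetical: folder holding the master images
        DST = "site1_images"  # hypothetical: per-site output folder
        os.makedirs(DST, exist_ok=True)

        for name in os.listdir(SRC):
            ext = name.lower().rsplit(".", 1)[-1]
            if ext not in ("jpg", "jpeg"):
                continue  # as noted, the re-compression trick only really works for JPGs
            img = Image.open(os.path.join(SRC, name))
            # A different quality setting yields a byte-level different file
            img.save(os.path.join(DST, name), quality=82)

    That still leaves the GIFs and PNGs, though.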

    Will this fall into the "cookie cutter" approach Google frowns upon? The sites compete with each other, offering similar services at different prices and terms. I don't see why that would be wrong.

    I plan to basically block the images folder in the robots.txt file.
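
    As a sketch, assuming the images live in a folder named /images/ (the path would need adjusting), the robots.txt entry would look like:

        User-agent: Googlebot-Image
        Disallow: /images/

    Googlebot-Image is the crawler Google uses for Google Images; a User-agent: * group instead would keep all well-behaved crawlers out of the folder.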

    Please let me know if you think this is a "no-no" or basically fine. My reason for blocking would simply be that I don't want my images showing up in Google Images searches.

    Many thanks!
     
  2. silentbauer

    silentbauer Newbie

    Joined:
    Oct 19, 2011
    Messages:
    15
    Likes Received:
    6
    Location:
    LA
    Not sure about images specifically, but I will say that I had a site get blasted in the last Panda update for blocking all CSS & JS (basically all page elements). Google wants to be able to crawl everything so they can make sure there's no malicious code, no overly aggressive advertising above the fold, etc. Once I unblocked, I eventually saw a rebound. I'm of the mindset to just let them see everything if you're not blackhatting any scripts; if you are, then keep it blocked.
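
    For reference, the kind of robots.txt rules that caused my problem looked something like this (the paths are just examples):

        User-agent: *
        Disallow: /css/
        Disallow: /js/

    The fix was simply removing those lines. You can also explicitly allow the assets, which Google's robots.txt parser supports:

        User-agent: Googlebot
        Allow: /*.css$
        Allow: /*.js$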

    It also depends on how many duplicate images you're talking about and whether they sit in the same spots. People use identical images ALL the time (that's why stock image sites exist). If your sites are different enough, just having the same images probably isn't a huge deal, especially since you have different content. To be safe, I would leave them unblocked.