
Need a robots.txt file made for site

Discussion in 'Hire a Freelancer' started by roomoo22, Sep 9, 2010.

  1. roomoo22

    roomoo22 Registered Member

    Joined:
    Sep 15, 2008
    Messages:
    72
    Likes Received:
    2
    Hello,

    Anyone good at making these files to specifications?

    Please let me know and quote me a price.

    Thanks,
    roomoo22
     
  2. bzy39

    bzy39 Regular Member

    Joined:
    Jan 15, 2009
    Messages:
    434
    Likes Received:
    239
    Just use Google Webmaster Tools.
     
  3. nufaman

    nufaman Elite Member

    Joined:
    May 29, 2009
    Messages:
    1,697
    Likes Received:
    1,185
    Save your money, dude. robots.txt files are not that important. They might be if you have some folder you don't want indexed, but if not you can just move on.
     
  4. Fwiffo

    Fwiffo Power Member

    Joined:
    Apr 7, 2010
    Messages:
    562
    Likes Received:
    325
    Occupation:
    Starship Captain
    Location:
    Pluto / Spathiwa
    Code:
    http://www.robotstxt.org/
    or

    Code:
    http://petercoughlin.com/robotstxt-wordpress-plugin/
    or

    Google Webmaster Tools



    (not necessarily in that order)
     
    • Thanks x 1
  5. roomoo22

    roomoo22 Registered Member

    Joined:
    Sep 15, 2008
    Messages:
    72
    Likes Received:
    2
    Hi all,

    It is a bit more complex than that. I need to be able to disallow Googlebot from accessing certain pages on the shop (like add-to-cart links, etc.) so that I can get on Product Search and stay there. Be invisible in plain sight, so to speak. There are several other shops doing it, so I know it can be done; I just need someone technical who is completely devoted to achieving it for both of us. The money is AWESOME when I am seen and selling, and I have always shared my profits 50/50 with whoever is my tech support. I am definitely not technically inclined. I handle the customer and supplier support and own the business, but I have no business unless I have superb tech support behind me.

    Anyone interested?

    Thanks!!!
     
  6. bzy39

    bzy39 Regular Member

    Joined:
    Jan 15, 2009
    Messages:
    434
    Likes Received:
    239
    You can use a sample like this:

    Code:
    User-agent: *
    Allow: /
    Disallow: /cgi-bin
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-content/plugins
    Disallow: /wp-content/cache
    Disallow: /wp-content/themes
    Disallow: /feed
    Disallow: /*/feed
    Disallow: /comments
    Disallow: /2010/*
    Disallow: /2011/*
    Disallow: /2012/*
    Disallow: /iframes
    Disallow: */trackback

    User-agent: Googlebot
    Disallow: /*.js$
    Disallow: /*.inc$
    Disallow: /*.css$
    Disallow: /*.gz$
    Disallow: /*.wmv$
    Disallow: /*.cgi$
    Disallow: /*.xhtml$
    Disallow: /*.xlsx$
    Disallow: /*.doc$
    Disallow: /*.pdf$
    Disallow: /*.zip$

    User-agent: *
    Allow: /images
    
    User-agent: Mediapartners-Google
    Allow: /
    
    User-agent: Adsbot-Google
    Allow: /
    
    User-agent: Googlebot-Image
    Allow: /
    
    User-agent: Googlebot-Mobile
    Allow: /
    
    #User-agent: ia_archiver-web.archive.org
    #Disallow: /
    
    Sitemap: http://www.your-blog-name-here.com/sitemap.xml
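    For the add-to-cart and checkout pages mentioned above, you would add shop-specific rules on top of a generic sample like that. Something along these lines, although the exact patterns here are only a guess and depend on what URLs your cart software actually generates:

    Code:
    # Example patterns only - replace with the real cart/checkout URLs your shop uses
    User-agent: Googlebot
    Disallow: /*?add-to-cart=
    Disallow: /cart/
    Disallow: /checkout/
    Whoever builds the file would need to swap in the actual URL patterns from your shop.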
     
    • Thanks x 1
  7. roomoo22

    roomoo22 Registered Member

    Joined:
    Sep 15, 2008
    Messages:
    72
    Likes Received:
    2
    That's great, but like I said, I am just on the customer/supply side. I don't even know how to set up the websites or anything technical. I have a couple of guys working on it, but they have come up with nothing so far.

    So I came on here looking for anyone who has the time and inclination to figure out how the other sites are staying on Product Search, and I will pay well for anything that works. I can send screenshots of money made in the past if that gives incentive. From Jan to Feb of 2010 we made $28K, and then we got booted off. We haven't been able to get back on since. I've been living off my share of that all this time, so I can vouch that the money is good.

    Thanks for all the help and suggestions, but I really need someone on board who can do all this stuff.
     
  8. GreyWolf

    GreyWolf Executive VIP Jr. VIP

    Joined:
    Aug 17, 2009
    Messages:
    1,930
    Likes Received:
    5,387
    Gender:
    Male
    Occupation:
    Artist / Craftsman
    Location:
    sitting at my PC
    You might be better off just using the robots meta tag on the individual pages you want to block. If the pages are generated by a script, then include it in the script that writes the particular pages you want to disallow.

    The guys you're using to write or update your scripts should be able to do this easily enough. Just include this line in the head section of the pages you want to disallow.
    Code:
    <meta name="robots" content="noindex, nofollow">
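    If the pages come out of a script, the tag only needs to be printed on the pages you want hidden. As a rough sketch of the idea (Python used just for illustration; the URL prefixes are made-up placeholders your developers would replace with your shop's real cart/checkout paths):
    Code:
    # Hypothetical sketch: emit the noindex/nofollow tag only for cart-style pages.
    NOINDEX_PREFIXES = ("/cart", "/checkout", "/add-to-cart")  # placeholder paths

    def robots_meta_tag(request_path):
        # Return the meta tag for pages that should stay out of the index,
        # or an empty string for everything else.
        if request_path.startswith(NOINDEX_PREFIXES):
            return '<meta name="robots" content="noindex, nofollow">'
        return ""
    The page template would then call that function inside its head section for every page it writes.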
    If you want to take it one step further, also add a nofollow attribute to any links leading to those pages.
    Code:
    <a rel="nofollow" href="http://www.example.com/">Anchor Text</a>
    That should take care of your problem without having to create some complex criteria in your robots.txt file to deal with all the variable names that your script might be creating.

    If you don't already have one, though, you should go ahead and add at least a blank robots.txt file to your site root, just to stop getting a 404 error every time a spider or crawler requests it.
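    A minimal "allow everything" file is just two lines; an empty Disallow value means nothing is blocked:
    Code:
    User-agent: *
    Disallow: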
     
    • Thanks x 1
  9. roomoo22

    roomoo22 Registered Member

    Joined:
    Sep 15, 2008
    Messages:
    72
    Likes Received:
    2
    Okay, thanks GreyWolf. I will give it to them and see what they can do.

    I appreciate it!
     
  10. risefromdeath

    risefromdeath Power Member

    Joined:
    Jul 1, 2009
    Messages:
    650
    Likes Received:
    107
    Please PM me with the details and everything, and I will get it done!
    Thanks