Need a robots.txt file made for site

roomoo22 (Registered Member, joined Sep 15, 2008)
Hello,

Anyone good at making these files to specifications?

Please let me know and quote me a price.

Thanks,
roomoo22
 
Save your money, dude. robots.txt files are not that important. They might be if you have some folder you don't want indexed, but if not, you can just move on.
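
If you ever do need one, the file is tiny. A minimal sketch, assuming a hypothetical /private/ folder you want kept out of the index:

Code:
# Hypothetical folder name; blocks all compliant crawlers from /private/
User-agent: *
Disallow: /private/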
 
Code:
http://www.robotstxt.org/

or

Code:
http://petercoughlin.com/robotstxt-wordpress-plugin/

or

Google Webmaster Tools

(not necessarily in that order)
 
Hi all,

It is a bit more complex than that. I need to be able to disallow Googlebot from accessing certain pages on the shop (add-to-cart links, etc.) so that I can get on Product Search and stay there. Be invisible in plain sight, so to speak. There are several other shops doing it, so I know it can be done; I just need someone technical who is completely devoted to achieving it for both of us. The money is AWESOME when I am seen and selling, and I have always shared my profits 50/50 with whoever is my tech support. I am definitely not technically inclined; I handle the customer and supplier support and own the business. But I have no business unless I have superb tech support behind me.

Anyone interested?

Thanks!!!
 
You can use a sample like this:

Code:
User-agent: *
Allow: /
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /feed
Disallow: /*/feed
Disallow: /comments
Disallow: /2010/*
Disallow: /2011/*
Disallow: /2012/*
Disallow: /iframes
Disallow: */trackback

User-agent: Googlebot
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*.xlsx$
Disallow: /*.doc$
Disallow: /*.pdf$
Disallow: /*.zip$

User-agent: *
Allow: /images

User-agent: Mediapartners-Google
Allow: /

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Image
Allow: /

User-agent: Googlebot-Mobile
Allow: /

#User-agent: ia_archiver-web.archive.org
#Disallow: /

Sitemap: http://www.your-blog-name-here.com/sitemap.xml
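
One thing to watch with a combined file like this: each User-agent group has to be separated from the one before it by a blank line, and Googlebot will follow only the most specific group that names it, so the Googlebot rules above apply to Google's main crawler instead of the * group, not in addition to it.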
 
That's great, but like I said, I am just on the customer/supply side. Don't even know how to set up the websites or anything technical. Have a couple of guys working on it, but they have come up with nothing so far.

So I came on here looking for anyone who has the time and inclination to figure out how the other sites are staying on Product Search, and I will pay well for anything that works. Can send screenshots of money made in the past if that gives incentive. From Jan to Feb of 2010 we made $28K and then got booted off. Haven't been able to get back on since. Been living off my share of that all this time, so I can vouch the money is good.

Thanks for all the help and suggestions, but I really need someone on board who can do all this stuff.
 
Hi all,

It is a bit more complex than that. I need to be able to disallow Googlebot from accessing certain pages on the shop (add-to-cart links, etc.) so that I can get on Product Search and stay there. Be invisible in plain sight, so to speak. There are several other shops doing it, so I know it can be done; I just need someone technical who is completely devoted to achieving it for both of us.
...
You might be better off just using the robots meta tag on the individual pages you want to block. If the pages are generated by a script, then include the tag in the script that writes the particular pages you want to disallow.

The guys you're using to write or update your scripts should be able to do this easily enough. Just include this line in the head section of any page you want to disallow:
Code:
<meta name="robots" content="noindex, nofollow">

If you want to take it one step further, also add rel="nofollow" to any links leading to those pages:
Code:
<a rel="nofollow" href="http://www.example.com/">Anchor Text</a>
That should take care of your problem without having to build some complex set of rules in your robots.txt file to deal with all the variable URLs your script might be creating.
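
If you do want to try the robots.txt route anyway, something like the sketch below might work. The /cart, /checkout, and add-to-cart patterns here are just hypothetical examples; you would substitute whatever URL structure your cart script actually generates.

Code:
# Hypothetical cart URL patterns; adjust to match your shop's real URLs.
User-agent: Googlebot
Disallow: /cart
Disallow: /checkout
Disallow: /*?add-to-cart=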

If you don't already have one, though, you should go ahead and add at least a blank robots.txt file to your root, just to stop getting a 404 error every time a spider or crawler requests it.
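
A blank file works, and the conventional allow-everything version is only two lines:

Code:
# Allows every crawler to access everything; the file exists only to stop the 404s.
User-agent: *
Disallow: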
 
Okay, thanks Grey Wolf. Will give it to them and see what they can do.

I appreciate it!
 