Discussion in 'Black Hat SEO' started by J0kerz, Apr 13, 2010.
In your opinion, what is the best way to make Google crawl your website as often as possible?
Constantly changing content is very important.
Yup, just make a blog.
By having a huge amount of links pointing at it. The more links that point to your site, the more often GoogleBot will follow them and crawl your pages.
In a nutshell, it is all about information and fresh, related content.
eBay and Facebook are the two best examples of this. Google crawls them every hour, due to their constant updates of user posts and content.
eBay - every time someone bids on or posts something, that is new content.
Facebook - every time someone posts or changes something, that is new content.
Make your site SEO efficient and then post as much content as possible, as often as possible. The more frequently you update with relevant, keyword-related content, the more often you will be crawled.
The proper term for having people post on your site, like in this forum for example, is 'user-generated content'.
Hope this helps.
As everyone else has suggested - fresh content.
But that doesn't mean you need to add 100s of articles a day. You can do that, but it isn't necessary. It doesn't even need to be consistent to get crawled a lot. I have an entertainment news blog: some weeks I might add a few articles a week, other weeks a few articles a day, and sometimes I let it slide for a couple of weeks while I'm involved in other projects. But when I check the logs, the Googlebots are there multiple times a day, every day. So are the other SE spiders and crawlers.
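If you want to check this in your own logs, here's a quick sketch that counts Googlebot hits per day from a combined-format access log (the log format and the `googlebot_visits_per_day` helper name are my assumptions, not anything from this thread - adapt it to whatever your server writes):

```python
import re
from collections import Counter

def googlebot_visits_per_day(log_lines):
    """Count log lines mentioning Googlebot, grouped by the date field
    of a combined-format access log (e.g. [13/Apr/2010:06:25:24 +0000])."""
    visits = Counter()
    date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")
    for line in log_lines:
        if "Googlebot" in line:
            m = date_re.search(line)
            if m:
                visits[m.group(1)] += 1
    return visits
```

Run it over a few days of logs and you'll see straight away how often the bot is really coming by, without guessing.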
The cool thing about it is that when I start a new site I will either put up an ad link in the sidebar like it's a paid advertisement, or if I can spin it to fit within the content of my site I will post an article along the lines of "hey, we just came across this great new website today that is about...". Whenever I do that, the new site will always be indexed within hours of making the post. I don't even bother submitting the new site directly to the search engines, because it happens faster doing it this way. It also seems to get the other sites crawled more often as well, even if they are static content pages. I guess it's because, while they are crawling the blog site and finding new content most of the time, they just follow the link to the other sites as well.
What I want to do next though for another project is to have the user-generated content like what aftershock mentioned. I just need to decide how to do it, and make sure it doesn't become flypaper for spam advertisers. (lol, I know that is most of us in here, but we only want to do it on others blogs, we don't want it done to ours. lol)
But the bottom line is that if you have new content posted at least fairly often, even just one new page, then the crawlers will keep coming to see if you've added it yet.
edit - I just looked back at what aftershock said, and he seems to think the more often you add content the better. I'd definitely take his advice over mine, but I think when he says "as much as possible" he doesn't mean 100s a day. A few good relevant articles a day should do you better than a bunch of autogenerated crap articles. The entertainment news niche doesn't interest me so I'm pretty inconsistent with it, but even so I'm apparently adding enough, often enough, to keep it crawled every day. Now keep in mind I'm only talking about getting crawled; you will need to be a lot more consistent if you want to get a lot of returning traffic, or to help with page rank, etc.
Thanks for your replies guys!
Do you think domain age has an influence on how often Google crawls your site?
Let's say your site has been indexed for the last ten years.
I personally don't think that domain age has any influence on how often Google crawls your site. As long as it's indexed, it shouldn't matter. And I 100% agree with all the previous posts: incoming links and changing content are the two best ways to make the G bots crawl your site.
I use a lot of aged domains and I would say that they nearly always get indexed quicker than a new one.
huohuo, Get the f*ck out
Nope, it doesn't have anything to do with how often your site gets crawled. However, it does matter in how your site is indexed into the SE, and it gives you a more stable foothold in staying indexed when you screw up with garbage articles or content.
The content and rank of your site are what determine how often you get crawled. The content, because it is 'content', which Google recognizes as 'information'. The rank, because it is based on the 'authority' aspect.
It has nothing to do with the age factor, as it used to. Proof of that: all of the old domains out there would be stomping every new site into the ground if age weighed in as a substantial factor. It is their authority and page ranking that give them pull, not their age. Anyone can buy a domain today, link it up with as many quality backlinks, posts, and authority links as a site that has been around for 20 years, and take its seat for a keyword that has been around just as long.
hope that helps clear up the issue.
The reason for that is that they have been indexed before, which ear-marks them as an 'established domain' in the system. They won't get indexed any differently than a brand new domain, though, as it is the content that does it. It has been my experience with older domains that you are more liable to be de-indexed due to an old existing link from a previous owner coming back to bite you in the arse.
Beware of this, as it can sometimes do more damage than good. I'm not saying that every domain is this way, by any means, just noting that I'd rather take a fresh property and build it up from the foundation than do a remodel and find everything needs to be replaced or repaired.
Make an internal blog on your site
Submit the sitemap of that blog to Google
Add the RSS feed of that blog to RSS aggregators
Change the writing settings and add a huge ping list (use Google if you don't know what this is)
Social bookmark your blog
Build tons of deep links
LINK TO HIGH PR SITES EVEN IF THEY DO NOT LINK BACK TO YOU
Constantly change your site content. For huge traffic, anyone can submit to SBS and article directories; then Google will crawl your site very quickly.
Building tons of deep links off the bat doesn't look natural... it's usually best to just submit a sitemap to Google. I mean, it takes longer, but it doesn't do the damage that having 1000s of links appear overnight does. Unless, of course, you are constantly deep linking 1000s of links per day.
I have a site whose content doesn't really change a lot, but I do have a lot of links pointing to it. It has been getting crawled every 5 or 6 days. I suppose that is one way to get the bot to crawl your site - with links.
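If you want to put a number on "every 5 or 6 days", take the dates of each Googlebot visit from your logs and average the gaps between them. A sketch - the helper name and the example dates are made up for illustration:

```python
from datetime import date

def average_crawl_interval(visit_dates):
    """Average gap in days between successive crawler visits.
    Returns None if there are fewer than two visits to compare."""
    ds = sorted(visit_dates)
    if len(ds) < 2:
        return None
    gaps = [(b - a).days for a, b in zip(ds, ds[1:])]
    return sum(gaps) / len(gaps)
```

Track this number over time and you can see whether your link building or content updates are actually pulling the bot in more often.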