yowies
Regular Member
- Apr 8, 2008
- 473
- 147
I have searched through the forum for a rough-and-ready answer to this query. I have read that throwing up a large site (1,000+ pages?) instantly signals to Google that there might be something fishy.
But unless you have submitted a sitemap, G wouldn't know how many pages there were.
It could only tell the number of pages after spidering/indexing.
1/ So... what is the largest site that you could throw up without attracting attention, using a sitemap (I find it better for getting pages indexed)?
2/ What empirical methods/results has anyone seen for instant/large sites being sandboxed/de-indexed/penalised etc. quicker than a smaller site?
Thanks
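
For anyone following along: a sitemap is just an XML file listing your URLs, so submitting one does hand Google an explicit page count up front. A minimal sketch of the standard format (the domain and paths here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Standard sitemap protocol format; each <url> entry is one page.
     example.com and the paths below are placeholders, not real URLs. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2008-04-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page-2.html</loc>
    <lastmod>2008-04-01</lastmod>
  </url>
  <!-- ...one <url> entry per page, so the file's length reveals site size -->
</urlset>
```

Without a submitted sitemap, Google only discovers that count gradually as it crawls, which is the distinction the question turns on.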