Discussion in 'Black Hat SEO' started by mathdoc, Jun 4, 2010.
What's the best way to get 20,000 pages of a new website indexed?
Bookmark all the pages, blast them with Xrumer, and submit articles to article directory sites.
And wouldn't that take, like... 100 years?
Hi, are you willing to pay?
I can get your work done.
How much to index a 57,000-page site?
Why don't you just ping them all?
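In case it's not obvious what pinging actually involves: it's just an XML-RPC weblogUpdates.ping call to services like Ping-O-Matic. A rough Python sketch of looping over your pages (the site name and URL list are placeholders, and rpc.pingomatic.com is only one of many ping services you could point this at):

    import time
    import xmlrpc.client

    # Placeholder URL list; in practice you'd load all 20k pages from a file.
    pages = [
        "http://example.com/page1.html",
        "http://example.com/page2.html",
    ]

    # Ping-O-Matic speaks the standard weblogUpdates XML-RPC interface.
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")

    for url in pages:
        try:
            # Standard method signature: weblogUpdates.ping(site_name, site_url)
            result = server.weblogUpdates.ping("My Site", url)
            print(url, "->", result.get("message", result))
        except Exception as exc:
            print(url, "failed:", exc)
        time.sleep(1)  # be polite to the service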
Submit a sitemap through Google Webmaster Tools?
Try to get your main category pages (the ones that link out to all the other pages) indexed first; the crawler will then follow those links down through the rest of the site.
Blog comments. Easy and safe.
Would pinging and submitting a sitemap actually work? Anybody know of a free sitemap creator that can handle 20,000 pages?
How many blog comments per page? What software do you recommend to do this?
You'd have to give me more information about your service before I can say whether I'd pay.
What CMS are you using? They all have free sitemap generators, and volume isn't an issue: they're just wrapping the result of a SQL query in some XML.
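To show how little there is to it, here's a rough sketch, assuming (purely for illustration) a SQLite database with a pages table that holds one url per row:

    import sqlite3
    from xml.sax.saxutils import escape

    # Assumed schema for the example: table "pages" with a "url" column.
    conn = sqlite3.connect("site.db")
    urls = [row[0] for row in conn.execute("SELECT url FROM pages")]

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            # escape() keeps characters like & legal inside the XML
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")

    print("wrote %d URLs" % len(urls))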
I'm not using a CMS.
Then what are you using? Surely you didn't create a 20k-page HTML site by hand?
Well, yes and no. I do have over 20K html pages with no CMS. But I didn't manually create them.
I'm wondering how to get my large list of profiles indexed. I have about 60k profiles that need to be indexed, but I have no idea how to get it done at that scale.
I don't think you want to submit a sitemap with 20k pages. If there's a hierarchy in your pages, I'd submit a few of the top-level pages first and let Gbot find the rest. If a couple of weeks go by and Gbot hasn't found all your top-level pages, then I'd add the rest of them to the sitemap.
If there isn't a clear hierarchy but the pages are linked together somehow, then I'd just do a semi-random sampling for the sitemap, up to maybe 200-300 pages. Too big a sitemap will get you into trouble, and if the pages are navigable from one another you'll be OK submitting just a few.
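And if you do decide to submit everything after all: the sitemaps.org protocol allows up to 50,000 URLs per sitemap file, and big sites usually split their URLs into chunks tied together by a sitemap index. A rough sketch, assuming (for the example) a urls.txt with one URL per line and example.com as your domain:

    from xml.sax.saxutils import escape

    # Assumed input for the example: one URL per line in urls.txt.
    with open("urls.txt", encoding="utf-8") as f:
        urls = [line.strip() for line in f if line.strip()]

    CHUNK = 10000  # comfortably under the protocol's 50,000-URL-per-file cap
    names = []

    for i in range(0, len(urls), CHUNK):
        name = "sitemap-%d.xml" % (i // CHUNK + 1)
        names.append(name)
        with open(name, "w", encoding="utf-8") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + CHUNK]:
                out.write("  <url><loc>%s</loc></url>\n" % escape(url))
            out.write("</urlset>\n")

    # The index file is what you'd actually submit; it points at each chunk.
    with open("sitemap-index.xml", "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            out.write("  <sitemap><loc>http://example.com/%s</loc></sitemap>\n" % name)
        out.write("</sitemapindex>\n")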