Discussion in 'Black Hat SEO' started by genotip, Jun 4, 2011.
Can you share a bit about how you got all those pages indexed? How much time did it take? And so on...
I am also waiting on replies; I want to know how they got them indexed.
Wondering, did you take a look at your own signature?
I had one site with ~990,000 pages ;> I used Analytics and Webmaster Tools (added sitemaps with 50k links each)
on a main domain that was a few years old, plus a new subdomain on a free server. The main domain had 1,800,000 pages indexed in total.
After 2 weeks I had ~50,000 pages indexed by Google and about 1,600 uniques/day of traffic... but the administrators deleted the site... they had the right to, of course ;]
All of these pages had only a few unique words each. I have some experience with creating websites with a huge number of pages ;> If you want to know anything, just ask.
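If it helps, here is a rough sketch (not my exact code, just the idea, and the urls.txt filename is only a placeholder) of how you could split a big URL list into sitemap files of 50k links each:
Code:
<?php
// Rough sketch only: split a plain list of URLs into sitemap files of
// 50,000 URLs each (the maximum a single sitemap file may hold).
// "urls.txt" (one URL per line) is a placeholder input file.
$urls   = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$chunks = array_chunk($urls, 50000);

foreach ($chunks as $i => $chunk) {
    $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($chunk as $url) {
        $xml .= '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
    }
    $xml .= '</urlset>' . "\n";
    file_put_contents('sitemap-' . ($i + 1) . '.xml', $xml);
}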
omg, what kind of site has 990,000 pages?
A spam website ;> Just put up some trash and it will give you some results... but for this the best route is very good programming skills (PHP, MySQL, Perl) and of course HTML/CSS ;>
Did you do anything to speed up the indexing, or just Webmaster Tools?
Pinging is great for getting your backlinks indexed, but pinging 200,000 pages of the same site is not such a great idea.
Use interlinked sitemaps instead.
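By interlinked sitemaps I mean a sitemap index file that points at every 50k-URL sitemap, roughly like this (example.com and the filenames are placeholders):
Code:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://example.com/sitemap-1.xml</loc></sitemap>
  <sitemap><loc>http://example.com/sitemap-2.xml</loc></sitemap>
  <!-- ...one entry per 50k-URL sitemap file... -->
</sitemapindex>
Submit the index file in Webmaster Tools and the spiders will follow it down to every child sitemap.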
The strange thing is that I have 20 subdomains, and one of them has great indexing speed while all the others are very, very slow.
They have the same code, same design, same sitemaps - everything the same.
But still, 1 subdomain already has 100K pages indexed, while all the rest have only a few hundred.
And I don't understand WHY.
Definitely use some sort of sitemap indexing structure.
I mean that building backlinks to the index page of a sitemap directory will get the search engine spiders crawling through the directory and finding the links to your webpages.
But you will have to construct the sitemap directory properly or it will look like spam.
My money site is a real estate site with roughly 50,000 listings, each with pagination to show the listing as well as a map and other stuff, so in total my website has roughly 350k pages. I use sitemaps to get them indexed; currently I have 8 sitemaps set up in a sitemap directory, with 100 links per page. Six of the 8 sitemaps hold the listing URLs (6 x 50k links), and at 100 links per page that works out to 500 sitemap pages per sitemap.
This is completely whitehat, and Google has some good technical information about sitemaps: they say to keep the number of links at 100 on each page and to break sitemaps into 50k URLs or less.
My other two sitemaps are for my normal content pages and a video sitemap.
I have 308k pages indexed right now, and it's due to the strength of my site + link network I have built up over the last 3 years.
As to your original question, how to get that many pages indexed: Sitemaps.
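For anyone wondering what the sitemap directory looks like in practice, here's a rough PHP sketch - not my actual code, and getListingUrls() is just a stand-in for however you pull the URLs out of your database. It writes pages of 100 links each plus an index page linking to all of them, which is the page you would point backlinks at:
Code:
<?php
// Rough sketch of an HTML sitemap directory: pages of 100 links each plus
// an index page that links to every directory page.
// getListingUrls() is a hypothetical stand-in - in reality the URLs would
// come out of the MySQL database behind the site.
function getListingUrls() {
    return file('listing-urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
}

$urls  = getListingUrls();
$pages = array_chunk($urls, 100);   // 100 links per directory page
$index = "<html><body><h1>Sitemap</h1><ul>\n";

foreach ($pages as $i => $page) {
    $n    = $i + 1;
    $html = "<html><body><ul>\n";
    foreach ($page as $url) {
        $u     = htmlspecialchars($url);
        $html .= "<li><a href=\"$u\">$u</a></li>\n";
    }
    $html .= "</ul></body></html>\n";
    file_put_contents("sitemap-page-$n.html", $html);
    $index .= "<li><a href=\"sitemap-page-$n.html\">Page $n</a></li>\n";
}

$index .= "</ul></body></html>\n";
file_put_contents('sitemap-index.html', $index);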
Thanks all. I get the feeling that only 1 subdomain's pages were indexed fast because Google was giving all its effort to that subdomain; now that it's done, I can see that all the other subdomains are starting to be crawled.... I thought subdomains were like different domains in Google's eyes, but maybe that is not 100% true.
I've got a site with 155K pages indexed. It's all scraped content, but each page pulls in content from various sources so it looks pretty unique.
Site structure and unique content make a huge difference - backlinks obviously help as well.
If your pages frequently update their content that also helps a lot.
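If you're running XML sitemaps on top of that, the protocol also has optional lastmod/changefreq fields you can set per URL to hint that a page changes often - a rough example with a placeholder URL:
Code:
<url>
  <loc>http://example.com/some-page.html</loc>
  <lastmod>2011-06-04</lastmod>
  <changefreq>daily</changefreq>
</url>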