How to make Google forget 100,000 pages

jamie3000

I'm in the early stages of a large price comparison website. I submitted a sitemap with 100,000 pages to Google; it started indexing my pages and everything was great.

Then I completely changed all the content and pages on my website. I now need Google to forget the 100,000 pages I asked it to index and instead index 30,000 new, correct pages with different URLs.

I've removed the old sitemaps and submitted the new ones. So far I've been serving a 410 to Google for anything that's no longer there (the old pages). I can see it crawling the new pages in my server logs, but it's only requesting a new page about 1 in every 20 requests; the other 19 are for old pages that now return a 410.
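For anyone curious, the 410 part looks roughly like this; a minimal standard-library sketch, with a hypothetical /old/ prefix standing in for my real URL structure:

```python
# Minimal sketch using only the standard library; the /old/ prefix is a
# hypothetical stand-in for the real old-page URL structure.
from http.server import BaseHTTPRequestHandler, HTTPServer

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/old/"):
            # 410 Gone is a stronger signal than 404 that the page was
            # removed on purpose and should be dropped from the index.
            self.send_response(410)
            self.end_headers()
        else:
            # New pages are served normally (placeholder body here).
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>new page</body></html>")

if __name__ == "__main__":
    HTTPServer(("", 8000), GoneHandler).serve_forever()
```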

How can I make Google forget all those old pages? Or do I just have to let them eat my crawl budget until Google gives up on them?

Thanks
 
I reindex them one by one in the URL Inspection tool, but you'd need an army of VAs for that.

I tend to have trouble getting Google to forget even 10 pages without that manual one-by-one reindexing.

I'd suggest emailing support, but then again, in my experience they're not human.

Anyway, following this thread because I'd like to know the answer too...
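One sanity check that doesn't need VAs: confirm that a sample of the old URLs really do return 410, since a misconfigured 200 or soft 404 would explain slow forgetting. A quick sketch; the URL list is a placeholder (in practice you'd read it from the old sitemap):

```python
# Spot-check that the old URLs really return 410. The list below is a
# placeholder; in practice it would come from the old sitemap.
import urllib.error
import urllib.request

old_urls = [
    "https://example.com/old/page-1",
    "https://example.com/old/page-2",
]

for url in old_urls:
    try:
        with urllib.request.urlopen(url) as resp:
            status = resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises HTTPError for 4xx/5xx; the status is on the exception.
        status = e.code
    flag = "" if status == 410 else "  <-- expected 410!"
    print(f"{url} -> {status}{flag}")
```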
 
I think you'll have to wait for them to be recrawled.

An interesting experiment would be to redirect the old pages to the new ones; that might speed things up, since you'd be deindexing and indexing at the same time.
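If anyone wants to try that, a rough sketch of the old-to-new mapping; the paths are hypothetical, and a real comparison site would build the table from its old and new sitemaps (then wire it into the web server or framework as 301 rules):

```python
# Hypothetical old -> new URL mapping; in practice it would be derived
# from the relationship between the old and new sitemaps.
REDIRECTS = {
    "/old/widget-prices": "/compare/widgets",
    "/old/gadget-prices": "/compare/gadgets",
}

def respond(path: str) -> tuple[int, dict]:
    """Return (status, headers) for a request path: 301 to the new URL
    if we have a mapping, otherwise 410 so Google drops the page."""
    target = REDIRECTS.get(path)
    if target:
        return 301, {"Location": target}
    return 410, {}

assert respond("/old/widget-prices") == (301, {"Location": "/compare/widgets"})
assert respond("/old/unknown") == (410, {})
```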
 
I guess you just need to wait for the crawler to do its work.
Is your sitemap correct, without the old pages? And what about robots.txt? In theory, you could move the old pages into a separate folder and tell crawlers not to touch that folder. Isn't that so?
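Worth noting, though: a robots.txt Disallow only stops crawling, it doesn't deindex by itself, and if Googlebot can't crawl the old URLs it also can't see the 410s. If you do go that route, here's a standard-library sketch for checking the rules, with a hypothetical /old/ folder:

```python
# Sketch: verify robots.txt rules with the standard library.
# The rules and paths here are hypothetical examples.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /old/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/old/page-1"))       # False
print(rp.can_fetch("Googlebot", "https://example.com/compare/widgets"))  # True
```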
 