G indexed 6k duplicate pages, affecting rankings. Start over?

Discussion in 'Black Hat SEO' started by pixelgrinder, May 7, 2015.

  1. pixelgrinder

    pixelgrinder Jr. VIP

    Apr 1, 2008
    I started another ecommerce site 4 months ago. About a week ago I noticed Google had indexed about 6k duplicate pages that the ecommerce CMS generated (category pages). I thought I had put the noindex tag on these pages, but apparently I did not. As soon as all of these pages were indexed, all of my rankings disappeared.

    I deleted the pages in the CMS so they all return 404 now, but they're still in the index even after being crawled again. I tried the removal tool in Webmaster Tools on some of the pages, but the requests get processed and the pages are never actually removed from the index.

    The dilemma is: should I start over with a new domain, or try to remove these pages and let it run its course? I have built a few really authoritative whitehat links that would be difficult to move to a new domain. I don't want to invest time and money into this domain going forward and then have these pages dragging down my rankings forever. From some preliminary searching, some people report their 404 pages still being indexed up to 6 months after removal.

    What should I do?
  2. mainceaft

    mainceaft Regular Member

    Apr 10, 2013
    First, use robots.txt to disallow Google from crawling those category/tag pages. In WMT there is also a URL Parameters tool where you can set those query strings to "No index" or something close to that.
    For 404 pages, Google has already said they have very strong memories; those pages can remain in Google's database for years.
    Last, you can use a stronger method: change the status of those pages to 410 Gone, which gives Googlebot a clear message that the page will never come back online.
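    The 410 step above can be done at the web server level without touching the CMS. A minimal sketch for Apache via .htaccess, assuming the duplicate pages all live under a /category/ path (adjust the pattern to whatever URLs your CMS actually generated):

    ```apache
    # Return 410 Gone for the removed CMS category pages so Googlebot
    # gets an explicit "permanently removed" signal instead of a 404.
    # The /category/ prefix is an assumption -- match it to the real
    # duplicate URL pattern on your site.
    RedirectMatch 410 ^/category/.*$
    ```

    One caveat: if robots.txt blocks Googlebot from crawling those URLs, it will never see the 410 response, so either let it recrawl them until they drop out or rely on the removal tool instead.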