
Need help to protect client sites from Google deindexing

Discussion in 'Black Hat SEO' started by softi, Apr 20, 2012.

  1. softi

    softi Newbie

    Joined:
    Apr 20, 2012
    Messages:
    11
    Likes Received:
    1
    I am going to try and keep this short, but I need help and advice please!

    My main business website appeared on the first page for my main keyword (related to the web industry) in my country for about 5 years. I used blog networks a lot to promote client sites, including BMR and I used BMR to promote my main business site as well. I also had about 80 sites in another big blog network as donor sites.

    In March, my main site received the Unnatural Links message and a -50 penalty for that site only.

    Mid April - my main business website, my 80 blog donor sites, my affiliate sites AND about 5 of my innocent clients' sites were deindexed (meaning, these were web design clients with little or no SEO done for them. Business sites - in other words, pure business content, all original, good designs, proper business-related sites - not related to IM or money making or anything). 3 of them did have some SEO done - mostly article marketing.

    How? The only way was through a manual review that found common WHOIS info, a common registrar and common servers (reverse IP lookups for other sites on the same server). I am convinced that they found my main business site through the BMR links, and from there did lookups on the servers and whois for the rest.
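    To illustrate how easy that kind of server footprint is to surface, here is a minimal sketch (the domain names are hypothetical, and the resolver can be swapped out for testing): given a list of domains, resolve each to an IP and flag any IP shared by more than one site - the same grouping a reverse-IP lookup produces.

```python
from collections import defaultdict
import socket

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domains by the IP they resolve to.

    Any IP that maps to more than one domain is a shared-hosting
    footprint that ties those sites together.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            groups[resolve(domain)].append(domain)
        except OSError:
            groups[None].append(domain)  # collect unresolvable domains separately
    # Keep only IPs shared by two or more resolvable domains
    return {ip: names for ip, names in groups.items() if ip and len(names) > 1}
```

    Run over a client list, any IP that comes back with two or more domains is exactly the association being worried about here.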

    Please don't blast me with how stupid it was not to hide my whois or at least to fake them - I know that now.

    My main problem now is that I have 5 clients who host with me whose sites were deindexed purely because of their association with my main site (sharing whois info on the same server).

    I have about another 40 clients on the same server who are also at risk - most are pure web design clients with no SEO done for them; only 2 are current SEO clients.

    With regards to the already deindexed clients I am going to create separate Webmaster accounts for them as if coming from the sites themselves and submit a reinclusion request to Google, saying that they don't understand why they have been deindexed. Any comments or advice?

    With regards to the sites on the same server as the main culprit site - this is a dedicated server JUST with my main site and my client sites on. What should I do to protect them? Remember, about 5 of these 40 sites were already deindexed, with most of them completely innocent of any wrongdoing.

    A couple of options come to mind but not all are feasible:
    1. Immediately apply a whois guard to the ones that can have it (some are registered through a country-level registrar that does not offer the service, but then neither do these sites appear in international whois records in any case). Google is a registered registrar - can its APIs bypass the WHOIS guard?
    2. Instead of the WHOIS guard, just change the whois records to remove my technical/admin contact details. Will this work, or is there a history associated with WHOIS records?
    3. Move the accounts to a different shared hosting server that I have with the same hosting provider. This will be easier to justify than moving them to various other server accounts (these are business clients; they don't take easily to having their hosting accounts moved, since it disrupts their email etc., and I have JUST gone through a major server migration with them). But if Google has already earmarked them in some way for being associated with a 'bad apple', won't this just move the problem with them?
    4. Move each account to a different server - the least desirable option, since this is a big business disruption. Also, considering that A LOT of my blog network sites and affiliate sites were spread across various servers, it looks to be more a case of common whois than common servers, although the server definitely played a role as well.

    Any comments? Thanks for your time and patience....
     
  2. Web Echo

    Web Echo Regular Member

    Joined:
    Apr 5, 2012
    Messages:
    328
    Likes Received:
    125
    Location:
    Online
    1. & 2. Don't do either. Instead, use your clients' real whois info. Don't worry about history.

    3. & 4. Moving to another server is good. But just to be on the safe side, you should go with different servers from different hosting providers.

    You need to create the impression that your clients are not happy with you and have moved to some other hosting.

    After that, wait and see how it works for your clients. If you still see problems, then submit a reinclusion request to Google using the client's email address (act like the client).
     
  3. pirondi

    pirondi Power Member

    Joined:
    Jan 5, 2010
    Messages:
    562
    Likes Received:
    118
    There are two points to be made:

    1: You can't eliminate the deindexing risk for sites when you use blackhat.
    2: But you can minimize the problems if Google can't harvest your sites.

    I won't discuss number 1 here - it is a complex subject. Number 2:

    1: Host them on different IPs, or use shared hosts with around 10 sites per shared host.
    2: No Google Analytics - use another tracking method, and make sure it doesn't have footprints.
    3: Whois: make it private. (Observation: we don't know if Google will eventually have, or already has, access to private whois data, so better safe than sorry - use different whois info for each site if possible; if you do that, there is obviously no need for private whois.)
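    As a quick illustration of point 2, a shared analytics account is one of the easiest footprints to detect automatically. A minimal sketch (the site names and HTML snippets are made up) that flags classic UA-style Google Analytics IDs appearing on more than one site:

```python
import re
from collections import defaultdict

# Classic Google Analytics tracking IDs look like UA-XXXXXXX-Y
UA_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")

def shared_tracking_ids(pages):
    """pages: dict mapping site name -> raw HTML.

    Returns tracking IDs that appear on two or more sites - a footprint
    that links those sites to the same owner.
    """
    seen = defaultdict(set)
    for site, html in pages.items():
        for ua_id in UA_PATTERN.findall(html):
            seen[ua_id].add(site)
    return {ua_id: sorted(sites) for ua_id, sites in seen.items() if len(sites) > 1}
```

    The same idea extends to any shared snippet: AdSense publisher IDs, shared theme paths, or identical contact details in page footers.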

    As for the remaining sites that were not hit, don't put them in the "do it later" zone - remove all the footprints they have now to prevent future problems.