
Why do some websites that rank on Google have no Google cache?

Discussion in 'Black Hat SEO' started by GodwinLin, Jul 1, 2017.

  1. GodwinLin

    GodwinLin Newbie

    Joined:
    Jun 28, 2017
    Messages:
    3
    Likes Received:
    0

    I know how to hijack the Google spider, but that method always leaves a Google cache.

    Take christminster-singers.org.uk, the site in second position, as an example.

    It has no cache and no "Similar" link, yet its Google title and description are blog content, and the ranking holds for a very long time.

    So how does this happen? Can somebody tell me how to do the same?
     
  2. snaip

    snaip Registered Member

    Joined:
    Oct 3, 2014
    Messages:
    96
    Likes Received:
    28
    Gender:
    Male
    Put this meta tag in the <head> of your page:
    Code:
    <meta name="robots" content="noarchive">
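    If you want to verify the tag is actually in place, here is a minimal Python sketch (stdlib only; the sample markup in the comments is hypothetical) that checks a page's HTML for the noarchive directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                content = a.get("content", "")
                self.directives += [d.strip().lower() for d in content.split(",")]


def has_noarchive(html: str) -> bool:
    """True if the markup contains <meta name="robots"> with a noarchive directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noarchive" in parser.directives
```

    Note that the same directive can also be delivered as an HTTP response header (`X-Robots-Tag: noarchive`), which this HTML-only check would not see.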
    
     
  3. Chandhu

    Chandhu Junior Member

    Joined:
    Sep 17, 2014
    Messages:
    103
    Likes Received:
    16
    Gender:
    Male
    Is there any way to access these unarchived pages through Google?
     
  4. GodwinLin

    GodwinLin Newbie

    Joined:
    Jun 28, 2017
    Messages:
    3
    Likes Received:
    0

    Thanks for the reply. I think doing that is how a ranked site ends up with "no cache" on Google.

    But about the hijacked content: I used the Firefox plugin <User Agent Switcher> to change my user agent to Googlebot 2.1, and I still can't see the hijacked content.

    :( Is this some new kind of hijack technique? It has confused me for months.
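    For what it's worth, the same user-agent test can be done from a script instead of a browser plugin. A minimal Python sketch (stdlib urllib only; the Googlebot UA string is the publicly documented one) that fetches a page twice and compares what a "Googlebot" and a normal browser receive:

```python
import urllib.request

# Publicly documented Googlebot desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0"


def fetch_as(url: str, user_agent: str, timeout: float = 10.0) -> str:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset, errors="replace")


def ua_cloaked(url: str) -> bool:
    """Crude check: does the page served to a Googlebot UA differ
    from the page served to a normal browser UA?"""
    return fetch_as(url, GOOGLEBOT_UA) != fetch_as(url, BROWSER_UA)
```

    Note this only catches user-agent cloaking; a cloaker keyed to crawler IP ranges will serve both of these requests the same page, which would match what you are seeing.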
     
  5. GodwinLin

    GodwinLin Newbie

    Joined:
    Jun 28, 2017
    Messages:
    3
    Likes Received:
    0
    The hijacked content was only visible for the first few days (5-7 days) after the domain was hijacked; after that there was no way to see it. :(

    It's a mirror site of a WordPress blog.
     
  6. snaip

    snaip Registered Member

    Joined:
    Oct 3, 2014
    Messages:
    96
    Likes Received:
    28
    Gender:
    Male
    I don't think there is a new method; they're just using a better cloaker.
    There are different cloaking methods: you can cloak against user agents, IP addresses, or both.
    User-agent-based cloaking is not reliable, and I'm quite sure the sites you were trying to check use IP-based cloaking.
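    To illustrate what that means in practice, here is a minimal sketch of the IP-based variant, assuming a Python server side. The 66.249.64.0/19 range is one block publicly associated with Googlebot; a real cloaker would maintain a much larger, regularly updated list:

```python
import ipaddress

# Illustrative only: one range publicly associated with Googlebot.
# Real cloakers keep large, regularly refreshed lists of crawler networks.
CRAWLER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),
]


def is_crawler_ip(client_ip: str) -> bool:
    """True if the client address falls inside a known crawler network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in CRAWLER_NETWORKS)


def choose_content(client_ip: str, crawler_page: str, visitor_page: str) -> str:
    """Serve one page to crawler IPs and a different one to everyone else."""
    return crawler_page if is_crawler_ip(client_ip) else visitor_page
```

    Because the decision is keyed to the connecting IP rather than the User-Agent header, spoofing the Googlebot UA from your own machine (as with the Firefox plugin above) never reaches the crawler branch.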

    @GodwinLin please check your PM