
Domain tech information - surviving SEO Spam

Discussion in 'Black Hat SEO' started by blazinec, Aug 12, 2016.

    blazinec (Newbie)
    Hello friends, I wanted to ask what is needed to create a website similar to:
    http://www.mysitesview.net
    http://www.trafficestimate.com
    http://www.siteprice.org
    http://www.backingseo.net
    http://www.similarsites.com/
    etc. These sites survive all the animal updates from big G. My question is: how would you build this type of project when you already have all the data available? I am interested in the specific actions needed to stay in G's index. I have tried a similar site myself, but it is almost impossible not to get deindexed just a few weeks after publishing. This is not some long-term vision for me, but it is interesting that fully automated pages still survive in the index and sometimes have very high Alexa ranks. To be specific:
    1) How would you publish the "pages" (i.e. the per-domain reports/tests) to G's index? Imagine you have years of data for over 300M domains and could show them all without months of work. (A rough sketch of what I mean is after this list.)
    2) Do you think some timeframe is needed, e.g. in the first months after domain registration only index the first 10k results, then gradually add more?
    3) Is there any magic behind these sites? I never tried to buy any links, just created the project and ran it as a test to see what big G would do with it.
    4) Do you think it is a suitable strategy to publish all the domain names you know something about, or to hand-pick only those that receive some amount of search traffic (according to SEMrush etc.) in order not to be too visible? I think a domain with 10M pages in the index is highly suspicious.
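    To make questions 1, 2 and 4 a bit more concrete, here is a rough Python sketch of the kind of rollout I was imagining: filter to domains that show at least some estimated search demand, split them into standard 50k-URL sitemap files, and only put the first batch into the sitemap index at launch, adding the rest over the following months. The project URL, the domains.csv columns and the thresholds are placeholders I made up, nothing taken from the sites above.

Code:
# Rough sketch only - the URL scheme, CSV columns and limits are my own assumptions.
import csv
from pathlib import Path

SITE = "https://example-tool.com"   # hypothetical project URL
URLS_PER_SITEMAP = 50_000           # per-file limit from the sitemaps.org spec
RELEASE_SITEMAPS = 1                # how many files to expose at launch (question 2)
MIN_SEARCHES = 100                  # only domains with some search demand (question 4)

def interesting_domains(csv_path):
    """Yield domains whose estimated monthly searches pass the threshold."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["est_monthly_searches"] or 0) >= MIN_SEARCHES:
                yield row["domain"]

def write_sitemaps(domains, out_dir="sitemaps"):
    """Write 50k-URL sitemap files and return their filenames."""
    Path(out_dir).mkdir(exist_ok=True)
    files, batch = [], []

    def flush():
        name = f"sitemap-{len(files) + 1}.xml"
        body = "\n".join(
            f"  <url><loc>{SITE}/report/{d}</loc></url>" for d in batch
        )
        Path(out_dir, name).write_text(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>\n"
        )
        files.append(name)
        batch.clear()

    for d in domains:
        batch.append(d)
        if len(batch) == URLS_PER_SITEMAP:
            flush()
    if batch:
        flush()
    return files

if __name__ == "__main__":
    all_files = write_sitemaps(interesting_domains("domains.csv"))
    # Only the first few sitemaps go into the index file at launch;
    # the rest get added over the following months (question 2).
    released = all_files[:RELEASE_SITEMAPS]
    index = "\n".join(f"  <sitemap><loc>{SITE}/{n}</loc></sitemap>" for n in released)
    Path("sitemaps", "sitemap-index.xml").write_text(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index}\n</sitemapindex>\n"
    )

    The point is that the full 300M-domain set would never appear at once; the sitemap index just grows slowly over time.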
    Thank you for your opinions!