How to hide a website?

SeoNews
Hi!

I'm trying to find a way to hide a website from human visitors only, not from search engines.

What I mean: when the site is visited by a human, they see nothing (an error, a blank page, ...).
Only the search engines should see the real version and be able to index it.

And how can I do this in the safest way, without risking a cloaking penalty from the search engines?
(I'm not trying to do anything shady or illegal, I just don't want visitors to see my site for a while.)

Is it possible?

Thanks in advance for any help!
 
In your robots.txt, in your file manager:

Code:
User-agent: Googlebot
Disallow: /index.php/
 
I'm not an expert in robots.txt, but I think this is the opposite of what I want to do, and it doesn't hide the site.
 
You could hide the content with CSS styling: white font on a white background, or something similar. It becomes harder the more complex you want the site to be.
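As a rough sketch of the idea (a single static page, with inline styles just for illustration):

Code:
<!-- The text stays in the HTML for crawlers to read,
     but is white-on-white so human visitors see a blank page -->
<body style="background: #fff;">
  <div style="color: #fff;">
    All the content you want search engines to index goes here.
  </div>
</body>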
 
Overlay your whole site with a white layer and disable right-click, so most people won't be able to open Inspect Element.
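The right-click part is a one-liner, though it only deters casual visitors (view-source and the DevTools keyboard shortcuts still work):

Code:
<script>
  // Crude deterrent: blocks the context menu only; DevTools can
  // still be opened with F12 or Ctrl+Shift+I
  document.addEventListener('contextmenu', function (e) {
    e.preventDefault();
  });
</script>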
 
You could hide the content with CSS styling: white font on a white background, or something similar. It becomes harder the more complex you want the site to be.

Thanks! Yes, I guess this is the easiest solution, but won't this be a flag for Google to ban the site? This is similar to a black hat keyword-stuffing technique, no?
 
Just wrap the site with a white CSS layer in front and keep the content behind it. Easy.
 

Use this robots.txt:

Code:
User-agent: *
Disallow: /
 
Via a robots noindex meta tag. Using one to prevent search engine bots from indexing particular pages is both effective and easy.
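For reference, the tag itself is a single line in the page's <head>:

Code:
<!-- tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">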
 
You could hide the content with CSS styling: white font on a white background, or something similar. It becomes harder the more complex you want the site to be.
You can rank for keywords you have in white text on a white background, so I doubt this does anything.
 
You can rank for keywords you have in white text on a white background, so I doubt this does anything.

I guess going with white text on a white background may have nasty consequences with Google, 50/50 chance.
 
If you use WordPress, you can use a "site on maintenance" plugin. Some of them have an option to allow search engines to crawl your site while real visitors get an under-construction page.
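Under the hood it's roughly this (a simplified sketch; real plugins verify crawlers properly rather than trusting the user-agent string, and the bot list here is just an example):

Code:
<?php
// Simplified sketch of a maintenance-mode gate: humans get a 503
// "under construction" page, known crawlers fall through to the site.
// Note: user-agent strings are trivially spoofed, and serving bots
// different content than humans is exactly what cloaking means.
$ua   = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$bots = array('Googlebot', 'Bingbot', 'DuckDuckBot', 'YandexBot');

$is_bot = false;
foreach ($bots as $bot) {
    if (stripos($ua, $bot) !== false) {
        $is_bot = true;
        break;
    }
}

if (!$is_bot) {
    http_response_code(503);         // temporarily unavailable
    header('Retry-After: 86400');    // suggest a retry in one day
    echo '<h1>Site under construction</h1>';
    exit;
}
// Crawlers continue to the normal site below.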
 
And how can I do this in the safest way, without risking a cloaking penalty from the search engines?

There's no way to do this safely; there will always be the risk of a site-wide manual action penalty. If G catches you, your site will be de-indexed.
 
I guess going with white text on a white background may have nasty consequences with Google, 50/50 chance.
Not something I would use on my money site really, but it does work. Speaking from experience.
 
If you use WordPress, you can use a "site on maintenance" plugin. Some of them have an option to allow search engines to crawl your site while real visitors get an under-construction page.

Hmm, interesting! Could you please share the plugin name? I tried a few but didn't see this option.
Thanks!
 
Actually, you're looking for cloaking, right? When a user visits those pages they'll see an error page, but when search engines look, the real page will be shown. Am I correct?
 