I need help setting up 2 domains on a single website, and preventing Google from noticing or losing rank

MrFollower

Newbie
Sep 21, 2019
Hey
Here is the problem: I'm getting all my traffic from Google search, and my country filters my domain every 6 months, so I have to change my domain in Search Console (using the change-of-address option) so that users from my country can keep accessing the website. The problem is that only access from my country is restricted; other countries, including Google's bots, can still reach the filtered domain.
We've been doing this for years, and now I've decided to try something different.
I've bought 2 domains, example.com and example.net, and connected both to the website. Both work fine, with no mixed content or other issues.
Example.com (the main domain, allowed for crawling, with index meta tags; the domain Google can see and put in results)
Example.net (the second domain, set up only for users from my country; it has a different robots.txt to prevent crawling, plus noindex meta tags)
I've set a rule on Cloudflare redirecting (302 temporary) users from my country (a geo-location rule) to the second domain, example.net, and since I've done this we have been losing rank day by day.
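The kind of Cloudflare rule described above can also be expressed as a small Worker. This is only a hedged sketch: "IR" stands in for the filtered country's ISO code and the hostnames are the thread's placeholders, not the real site.

```javascript
// Decide whether a visitor should be 302-redirected to the country-only
// mirror. "IR" is a placeholder for the filtered country's ISO code.
function pickRedirect(country, url) {
  if (country === "IR") {
    const target = new URL(url);
    target.hostname = "example.net"; // keep path and query, swap the host
    return target.toString();
  }
  return null; // everyone else stays on example.com
}

// In a Cloudflare Worker this would be wired up roughly as:
//   export default {
//     async fetch(request) {
//       const dest = pickRedirect(request.cf && request.cf.country, request.url);
//       return dest ? Response.redirect(dest, 302) : fetch(request);
//     },
//   };
```

Cloudflare exposes the visitor's country on `request.cf.country`, which is what the dashboard geo rule uses under the hood.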
It was hard for us to change the domain every 6 months, because it's an e-commerce site and every time we lose a lot of sales. Since I set up this two-domain strategy everything has been OK, except that we are losing clicks every day.
So I need your help with this:
1. Is it OK to do this at all?
2. Does Google know that I'm redirecting users to another domain? I'm fairly sure Google doesn't have any bots in my country that could detect it.
3. What do you suggest? We need this setup because we are tired of getting filtered every 6 months. With a separate domain for our country's users, if the second domain gets filtered we can swap it easily, without changing the domain in Search Console and losing all our clicks; that way we keep our rank and only replace the second domain when it gets filtered.
Thanks in advance for your help.
I've searched a lot about this and couldn't find anything, so your reply would be very much appreciated.
 
What if I allow Google to crawl the second domain via robots.txt, but tell it not to index it and set a canonical URL pointing to the main domain? Is that a good implementation?
<meta name="robots" content="noindex, follow" />
<link rel="canonical" href="https://www.example.com/page-url" />
I think Google will figure out that I'm redirecting users to another domain, and it may hurt my rank because Google will think I'm hiding something.
I need to have 2 domains and redirect some users to the second one, but somehow keep my main domain indexed and ranking on Google Search.
I need your help. I've described my problem; I really need to do this, but in the proper way.
 
You can use the mod_ip2location module (it can be found on GitHub) for Apache to manage customers from different locations.
 
Unrelated answer; that's not my problem.
I don't think you read my post at all.
I've already implemented the redirect.
My problem now is the drop in clicks, and I'm worried about Google.
My question is about SEO.
 
Sorry bro, I did read your whole post; I just misunderstood you a little.
I had a similar situation, but with 2 hosts and 2 domains serving the same content (not 2 domains pointing to one host, as you have).
I used Apache mod_ip2location with a 301 redirect (a 302 should be fine for Google too) to forward users to the second domain.
I also kept Google from indexing the second-domain.com website using robots.txt.

So only first-domain.com was in Google's index, and that solution worked fine for me with no ranking penalties.

The problem is probably that Googlebot is picking up the content through your second domain, and it's somehow being indexed incorrectly.
I would suggest you try to keep Googlebot from indexing the second domain.

You can do it like this (the dot is escaped and the pattern anchored so the rule only ever fires for the second host's robots.txt):
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_HOST} ^second-domain\.com$ [NC]
RewriteRule ^robots\.txt$ robots-that-forbids-all.txt [L]
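The thread never shows the contents of robots-that-forbids-all.txt. A minimal version that blocks every crawler, assuming the filename from the rule above, would be:

```text
# robots-that-forbids-all.txt — served as /robots.txt on second-domain.com
User-agent: *
Disallow: /
```

Note that this only blocks crawling; if other sites link to second-domain.com, its URLs can still appear in the index without a snippet, which is why the later posts discuss noindex as well.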
 
You can use a CNAME record for the 2nd domain and cloak: send only users from your country to the 2nd domain, and Google's bots to your original.
That way the end result is that domains 1 and 2 show the same content, with canonicals pointing to your 1st domain;
users from your country visit domain 2, while Google's bots can only crawl and index the first domain.

The downside is the data collected from Chrome, which feeds the Core Web Vitals shown in GSC
and was also revealed as a ranking factor in the recent leaks.
It shouldn't be too problematic since the canonicals point to the first domain, but it can still cause problems; the rest comes down to testing.
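The routing described above boils down to a single decision per request. A hedged illustration, with the hostnames and the "IR" country code as placeholder assumptions:

```javascript
// Where should this request land? Known crawlers always get the main
// domain; users from the filtered country (placeholder code "IR") get the
// mirror; everyone else gets the main domain too.
function routeHost(country, isKnownBot) {
  if (isKnownBot) return "example.com";       // bots only ever see domain 1
  if (country === "IR") return "example.net"; // filtered-country users
  return "example.com";
}
```

Treating bots as a special case is what makes this cloaking in Google's terms, so this variant carries some policy risk even though both hosts serve identical content.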
 
you can use a cname record for the 2nd domain and cloack the users only for the 2nd domain in your country & g bots to your original,
by that, your final result should be that domain 1/2 will be showing the same content with canonicals pointing to your 1st domain,
users from your country will be visiting domain 2 and g bots will only be able to crawl and index the first domain,

now, the downside comes to the actually collected data from chrome which has been used for the core web vitals showing in GSC,
as well as it has been revealed as a ranking factor from the recent leaks,
it won't be as problematic as the canonicals are pointing to the first domain, but well, that can cause you problems, the rest is up to testing things
Here is what I did, and it's up and running now:
I set up a CNAME from my second domain to the first domain, I'm redirecting users from my country to the second domain, and I created a separate robots.txt for the second domain to prevent crawling, plus noindex headers (via some custom PHP code) to keep the second domain out of the index.
Another approach is:
I could allow crawling of the second domain and set noindex tags (or maybe index, I'm not sure) but point canonical URLs at the first domain. That way Google can see all the content on the second domain, but since the canonical points to the first domain, it will keep indexing the first domain, which is my main one.
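Since both domains point at the same host, the noindex has to be emitted conditionally on the Host header rather than baked into shared templates. A hedged server-side sketch, assuming Apache 2.4 with mod_headers and the thread's placeholder hostnames:

```apache
# Second domain only: keep it out of the index but let bots follow links
<If "%{HTTP_HOST} =~ /^(www\.)?example\.net$/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

The canonical link would still go in each page's HTML, pointing every example.net URL at its example.com twin.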

Which one is better?
 
Redirecting users based on geo-location is generally fine, but Google can detect geo-specific redirects, and that can affect rankings.
Consider using a subdomain (e.g., country.example.com) for your country's users instead of a separate domain, to keep the SEO benefits while addressing the filtering. Optimize content and ensure a consistent user experience across domains.
 
I set up a CNAME from my second domain to the first domain, I'm redirecting users from my country to the second domain, and I created a separate robots.txt for the second domain to prevent crawling, plus noindex headers (via some custom PHP code) to keep the second domain out of the index.
There's no point in setting canonicals & noindex tags if you're blocking Google's bots; they simply can't crawl and check the page.
I could allow crawling of the second domain and set noindex tags (or maybe index, I'm not sure) but point canonical URLs at the first domain. That way Google can see all the content on the second domain, but since the canonical points to the first domain, it will keep indexing the first domain, which is my main one.
I would personally go with this; noindex is fine since you're not trying to rank that site, and bots can still crawl the pages.
The outcome is unpredictable, but in theory I'd say this is the best approach.
 
There are successful cases of websites being indexed just fine while living on 2 different domains depending on the country. Tilda is a good example, with the tilda.cc domain for the global market and tilda.ru for the Russian market (its origins and many of its customers are in Russia).

Of course they have a strong product, but a fact is still a fact: high ranking is quite feasible.
 
Redirecting users based on geo-location is generally fine, but Google can detect geo-specific redirects, and that can affect rankings.
Consider using a subdomain (e.g., country.example.com) for your country's users instead of a separate domain, to keep the SEO benefits while addressing the filtering.
I can't use a subdomain, because my country filters my domain at the DNS layer (my e-commerce site sells cigarettes), so the only option is a second domain with users redirected to it.
 
There are successful cases of websites being indexed just fine while living on 2 different domains depending on the country. Tilda is a good example, with the tilda.cc domain for the global market and tilda.ru for the Russian market (its origins and many of its customers are in Russia).

Of course they have a strong product, but a fact is still a fact: high ranking is quite feasible.
Yes, you're right, but my case is different.
Google has a policy for multi-region/multi-language websites: different domains get crawled and indexed for multi-region sites like Amazon and others, and sometimes they serve different content, as Tilda does, where the language differs.
But in my case both domains point to the same host, so the content is exactly the same. I need to do something about it or Google will penalize it as duplicate content. I also don't want my second domain indexed, yet I still need to redirect users to it and keep my first domain ranking without any loss.
 
There's no point in setting canonicals & noindex tags if you're blocking Google's bots; they simply can't crawl and check the page.

I would personally go with this; noindex is fine since you're not trying to rank that site, and bots can still crawl the pages.
The outcome is unpredictable, but in theory I'd say this is the best approach.
Clear this up for me one more time; sorry for asking so many times, but I want to be sure, and you did say there's no point in setting canonical & noindex tags when bots are blocked.
So here is what I need to do:
1. Allow crawling in my second domain's robots.txt.
2. Set index and follow tags on the second domain (please confirm: index/follow or noindex/nofollow?).
3. Set canonical URLs from my second domain to the first (main) domain.
4. Redirect users from my country to the second domain using a 302 redirect (or a 301?).
 
2. Set noindex & follow on the 2nd domain.
4. A 301 should be fine; a 302 should work as well.
Everything else looks good to go.

You're trying this at your own risk; I haven't tried it myself.
But if you're already losing traffic, changing domains every X months, and ranking back up easily, the risk isn't that big.
It's definitely worth trying as a long-term approach.
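Putting the agreed plan together, the per-host response headers can be sketched as below. This is a hedged outline using the thread's placeholder hostnames; the noindex could equally live in a meta tag, as shown earlier in the thread.

```javascript
// Headers for the agreed setup: the second domain is crawlable but
// noindexed, and every page declares its example.com twin as canonical.
function seoHeaders(host, path) {
  if (host === "example.net") {
    return {
      "X-Robots-Tag": "noindex, follow",
      "Link": `<https://www.example.com${path}>; rel="canonical"`,
    };
  }
  return {}; // main domain: indexed normally, no extra headers
}
```

Keeping the second domain crawlable is the piece that makes the canonical and noindex signals visible to Googlebot at all, which is why robots.txt must not block it in this plan.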
 
But in my case both domains point to the same host, so the content is exactly the same. I need to do something about it or Google will penalize it as duplicate content.
Hmm, that really sucks... Have you tried reaching out to Google support? It's not like your situation is against the rules, and Google specialists might at least give you some guarantees and useful advice.
 
Yes, I've posted in their community, but I couldn't find any way to talk to them directly.
The only option was the Google Search community, and nobody answers.
 
Classic Google... so friendly in words, yet so inaccessible in practice. Well, if you want to go in that direction, you can try contacting Google Search specialists via social media, for example LinkedIn. Write to 10 or 20 people, and you may get feedback from 1 or 2 of them.
 