Spamdexing help me

fatihodaci
Newbie · Joined: Jul 8, 2020 · Messages: 15 · Reaction score: 4
Hello, there are many sites using sneaky redirects in my country. The problem works like this: a setup powered by the Indexing API is installed on subdomains and combined with sneaky redirects, so Google's bots see one version of the site while normal users see a completely different one. A manual penalty doesn't help, because such a site can get 800k pages indexed in five days and can be relaunched countless times every day.

I have read your valuable information. We need to finish these sites in Turkey. What can I do? Disabling text selection, copying, or right-clicking is not the solution either. We come across .cz and .fr sites in particular.

Please share if you have any ideas about this. It does not appear on English or other-language sites; only Turkish sites have this situation. We have complained and couldn't find a solution.
 
Bro, are you referring to cloaked websites? And why are you trying to take them down? Are they outranking you?
 
Focus on your own site. If Google can't do anything about it, you certainly can't. Every country has this problem, some a bit more than others.
 
Bro, are you referring to cloaked websites? And why are you trying to take them down? Are they outranking you?
Yes, there is a website with an average of 200,000 subdomains, each consisting of a single page. They are routed through Cloudflare and use country-code top-level domains such as .cz, .fr, and .it. Google sees and indexes a normal page, while users are shown prohibited content.
 
Focus on your own site. If Google can't do anything about it, you certainly can't. Every country has this problem, some a bit more than others.
Yes, this problem exists in some countries, but it is especially prevalent in Turkey, and I want to do something about it. Methods such as right-click blocking, hotlink protection, and IP blocking are not effective. They are still able to steal content from the website, and it's ultimately our hard work that goes to waste.
 
Yes, there is a website with an average of 200,000 subdomains, each consisting of a single page. They are routed through Cloudflare and use country-code top-level domains such as .cz, .fr, and .it. Google sees and indexes a normal page, while users are shown prohibited content.
Then that's Google's responsibility. If they had a working system in the first place, cloaked spam sites wouldn't survive another day. I also hate those sites, but there is no solution for now.

They exist in every language, but since ranking in Turkey is very easy and the competition is very low, black-hat websites rank with their eyes closed.
 
Then that's Google's responsibility. If they had a working system in the first place, cloaked spam sites wouldn't survive another day. I also hate those sites, but there is no solution for now.

They exist in every language, but since ranking in Turkey is very easy and the competition is very low, black-hat websites rank with their eyes closed.
No, this is not a solution. I am looking for a solution. I do not want to run away or accept their behavior. Someone must have encountered this problem before and found a way to overcome it. I need their ideas. Also, based on my analysis, they are not able to do this for every keyword group. There is only one website with 200,000 subdomains; assuming there are 20,000 such websites, that would amount to 4 billion subdomains. This is a huge problem.
 
Yes, this problem exists in some countries, but it is especially prevalent in Turkey, and I want to do something about it. Methods such as right-click blocking, hotlink protection, and IP blocking are not effective. They are still able to steal content from the website, and it's ultimately our hard work that goes to waste.
If you want to prevent them from scraping your content, try https://www.cloudflare.com/products/bot-management/ or another scraping-protection service. It is almost impossible to block scraping 100%, but at least you can make it more difficult for them.
 
Can anyone tell me if there is a method I can try to protect myself from these websites and prevent them from stealing my content?
 
If you want to prevent them from scraping your content, try https://www.cloudflare.com/products/bot-management/ or another scraping-protection service. It is almost impossible to block scraping 100%, but at least you can make it more difficult for them.
I have emailed Cloudflare and also submitted a complaint form regarding the issue. I am now uploading a video to show what they are doing. If you're interested, you can take a look. Thank you for your response as well.

https://veed.io/view/f386da67-bd2d-46b6-83c6-fc56a255de38
 
No, this is not a solution. I am looking for a solution. I do not want to run away or accept their behavior. Someone must have encountered this problem before and found a way to overcome it. I need their ideas. Also, based on my analysis, they are not able to do this for every keyword group. There is only one website with 200,000 subdomains; assuming there are 20,000 such websites, that would amount to 4 billion subdomains. This is a huge problem.
The only thing you can do is prevent them from scraping your content. You can reverse-cloak your website for their user agent: detect their user agents and block or cloak them, or just use a ready-made solution like Cloudflare.

But taking them down is off the table; it's impossible unless you are Google.
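A hypothetical sketch of the reverse-cloaking idea: detect likely scraper user agents and feed them decoy HTML instead of the real page. The substrings below are illustrative examples, not a vetted list; in practice you would build it from the agents you actually see in your own access logs.

```python
# Hypothetical reverse-cloaking sketch: scrapers get decoy content.
# SCRAPER_UA_SUBSTRINGS is an assumed example list, not a vetted one.

SCRAPER_UA_SUBSTRINGS = ("python-requests", "curl", "scrapy", "go-http-client")

def is_scraper(user_agent: str) -> bool:
    """True if the User-Agent header matches a known scraper pattern."""
    ua = user_agent.lower()
    return any(s in ua for s in SCRAPER_UA_SUBSTRINGS)

def serve_page(user_agent: str) -> str:
    # Real visitors (and Googlebot) get the real article; suspected
    # scrapers get worthless decoy text, so whatever they republish
    # after spinning it is garbage.
    if is_scraper(user_agent):
        return "<html><body><p>Nothing useful here.</p></body></html>"
    return "<html><body><p>Real article content.</p></body></html>"
```

Simple user-agent matching is easy to evade with a spoofed header, which is why the ready-made services fingerprint behavior instead; this only shows the basic shape of the idea.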
 
The only thing you can do is prevent them from scraping your content. You can reverse-cloak your website for their user agent: detect their user agents and block or cloak them, or just use a ready-made solution like Cloudflare.

But taking them down is off the table; it's impossible unless you are Google.
Thanks. All my friends who are facing this problem also use Cloudflare, and the spammers are coming through Cloudflare as well. Interestingly, they are able to get indexed very quickly using the Indexing API, and we are falling behind. Thank you very much.
 
Hopefully reverse cloaking will help you. In which niche do you see these websites? And do they hotlink your images, or only use your text content?
 
Hopefully reverse cloaking will help you. In which niche do you see these websites? And do they hotlink your images, or only use your text content?
As I said, I'm talking about 4 billion subdomains. They fetch images from Bing and publish our texts after spinning them a little. They are able to rank high because of their quick indexing. I can say that brand sites are not affected by this.
 
As I said, I'm talking about 4 billion subdomains. They fetch images from Bing and publish our texts after spinning them a little. They are able to rank high because of their quick indexing. I can say that brand sites are not affected by this.
Cloudflare is the only way to stop it. Sorry, but they are the only ones who can stop it, full stop; there is no other solution.

You can report the main domain to the provider hosting it.

There is no magic trick that stops anyone from copying your content, which is why Google uses a duplicate-content penalty.
 
A subdomain is indexed like a normal domain, even though it is a sub of the main domain.

Google sees every subdomain as a real domain.

But if the main domain gets blocked, so will all the subdomains.

You can use the WHOIS info or IP info of a subdomain and report it to Google.

Even if the domain registration is private, you can report the problem to the provider and to Google, since the domain is pushing spammy crap.
 
Well, the main problem is: if you want Google and users to see your content, everyone can steal it. You can make it somewhat harder, but against solid proxies and legitimate-looking user agents, you don't stand a chance.
You can try to disallow every user outside of Turkey, but Turkish proxies are cheap.

The best way to handle black-hat techniques is not to try to stop them, but to copy them and bank yourself until the method no longer works.

By the way, I don't think they use the Indexing API but sitemaps. The Indexing API has a limited request quota, so it is not really an option for large-scale spam projects.
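The quota point is concrete: Google's Indexing API defaults to roughly 200 URL submissions per day, while a single sitemap file can list up to 50,000 URLs (and a sitemap index can reference up to 50,000 sitemap files). A minimal sketch of generating one, with made-up subdomain URLs for illustration:

```python
# Minimal sitemap-generation sketch, to show why sitemaps scale where
# the Indexing API's daily quota does not. The URLs are hypothetical.

def build_sitemap(urls):
    """Return minimal sitemap XML (the spec allows up to 50,000 URLs per file)."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

xml = build_sitemap([f"https://sub{i}.example.com/" for i in range(3)])
print(xml)
```

Mass-producing files like this and pinging Google with them covers hundreds of thousands of URLs in one shot, which is what makes sitemap abuse the more plausible mechanism here.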
 
Cloudflare is the only way to stop it. Sorry, but they are the only ones who can stop it, full stop; there is no other solution.

You can report the main domain to the provider hosting it.

There is no magic trick that stops anyone from copying your content, which is why Google uses a duplicate-content penalty.

This is spam at a higher level than you might think. We don't know where the domains are purchased from; they are bought in countries like Russia and Northern Europe. Ultimately, extensions like .cz and .pl can be purchased from anywhere. There are 200 thousand subdomains on just one site. Which one will Google block access to? These sites can reach 500k hits in just 4 days.


 
Well, the main problem is: if you want Google and users to see your content, everyone can steal it. You can make it somewhat harder, but against solid proxies and legitimate-looking user agents, you don't stand a chance.
You can try to disallow every user outside of Turkey, but Turkish proxies are cheap.

The best way to handle black-hat techniques is not to try to stop them, but to copy them and bank yourself until the method no longer works.

By the way, I don't think they use the Indexing API but sitemaps. The Indexing API has a limited request quota, so it is not really an option for large-scale spam projects.
Yes, it's highly likely that they are using sitemaps for this. I agree. Do you have any other ideas about it? Thank you, my friend.
 
It is not just about scraping and indexing; it's about cookies too, because otherwise they couldn't rank so fast. And that's besides the cloaking, which is strong as well. The problem persists at the same level in all countries, not just in Turkey.
 