Ecommerce site is ranking but blocks Google's crawler

potluck_hoe

Newbie
Mar 21, 2022
8
5
I have been doing competitor analysis and noticed that one of our competitors is ranking very well despite having a low Domain Rating (DR).

I ran their site through Screaming Frog, but the crawl failed: it returned 4xx client error response codes. I then used an extension to simulate different crawlers, including Googlebot, and found that the site is not accessible to any of them.

Does anyone know about this type of strategy? If you search "shrooms online (Canada)," the second-ranking e-commerce site is the one I'm talking about.

Thank you.
 
I think your competitor is using a "cloaking" strategy, i.e., the website detects the crawler's user agent and either serves a different version of the page or blocks it entirely. That would explain the 4xx errors you're seeing with Screaming Frog and other crawler simulations.
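To illustrate the mechanism, a basic user-agent cloak can be as simple as the sketch below. This is a minimal, hypothetical example (the pattern list and function names are my own assumptions, not anything confirmed about that site), and real setups usually go further, e.g. verifying crawler IP ranges via reverse DNS, since the User-Agent header is trivial to spoof:

```python
import re

# Hypothetical list of crawler signatures a cloaking setup might match on.
CRAWLER_PATTERNS = re.compile(
    r"googlebot|bingbot|screaming frog|ahrefsbot|semrushbot",
    re.IGNORECASE,
)

def handle_request(user_agent: str) -> int:
    """Return the HTTP status a cloaking server might send for this User-Agent."""
    if CRAWLER_PATTERNS.search(user_agent or ""):
        return 403  # crawler detected: block (the 4xx you saw in Screaming Frog)
    return 200      # normal visitor: serve the real page
```

You can test for this kind of cloak yourself by fetching the same URL twice, once with a browser User-Agent and once with a Googlebot one, and comparing the responses.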
 
Aside from cloaking, do you think they use other server-side measures?
 