Hi,
I just caught two IPs calling random pages on my cloaked domains. (Always a random page on a domain, then another domain.)
Both IPs came from the Google IP range (74.125.0.0 - 74.125.255.255) but have no reverse-DNS entry and are therefore, afaik, not crawlers. (Specifically, both IPs came from this range: 74.125.75.*)
Their referrers always followed the same schema: "http://www.google.com/search?hl=en&q=example.com", where example.com matched the requested domain, and the UserAgent on both IPs was: "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.7) Gecko/20060909 Firefox/1.5.0.7"
I banned both IPs, because I think those are manual reviewers checking out my sites.
My question now is: has anyone experienced this, and if so, what did you do? I'm thinking about denying the whole Google IP range, except for the crawlers. In my opinion, a timeout for human reviewers is better than a redirect to an affiliate offer.
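One way to do the "block the range, except the crawlers" idea would be forward-confirmed reverse DNS: real Googlebot IPs resolve to a googlebot.com/google.com hostname that resolves back to the same IP, while reviewer IPs (like the ones above) have no PTR record at all. A minimal sketch, assuming the 74.125.0.0/16 range from above; the function and hostname suffixes are illustrative, not a definitive list:

```python
import socket
from ipaddress import ip_address, ip_network

# The Google range observed above (74.125.0.0 - 74.125.255.255).
GOOGLE_RANGE = ip_network("74.125.0.0/16")

def is_verified_crawler(ip: str) -> bool:
    """Forward-confirmed reverse DNS: PTR -> google hostname -> same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup (PTR)
    except OSError:
        return False  # no reverse-DNS entry -> not a crawler
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward = socket.gethostbyname(host)  # forward-confirm the hostname
    except OSError:
        return False
    return forward == ip

def should_block(ip: str) -> bool:
    # Block anything inside the Google range that is NOT a verified crawler;
    # IPs outside the range are left alone.
    return ip_address(ip) in GOOGLE_RANGE and not is_verified_crawler(ip)
```

This avoids hardcoding crawler IPs, so it keeps working if Google shuffles addresses within the range; the DNS lookups would want caching in production.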
Another idea would be to deny all requests whose referrer matches the given schema. This would be effective, but not future-proof.
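The referrer check itself is simple to sketch. A minimal version, assuming the exact schema quoted above (a Google search whose only query term is the requested domain); the regex and function name are my own illustration:

```python
import re

# Matches the reviewer referrer schema described above:
# http://www.google.com/search?hl=<lang>&q=<domain>
REVIEWER_REF = re.compile(
    r"^https?://www\.google\.com/search\?hl=\w+&q=(?P<domain>[\w.-]+)$"
)

def looks_like_reviewer(referrer: str, host: str) -> bool:
    # Flag only when the searched term is exactly the requested domain.
    m = REVIEWER_REF.match(referrer)
    return bool(m) and m.group("domain") == host
```

As noted, this is brittle: any change to Google's search URL format (extra parameters, different parameter order) slips past the pattern, which is why it's effective today but not future-proof.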
Ideally I would show them a whitehat site, but that's not going to happen until I find a way to create whitehat sites on the fly.
What would you do?