
[Guru Secret]Get .edu/.gov/.anything do follow backlinks | Referer spam

Discussion in 'Link Building' started by booster2ooo, Feb 24, 2010.

  1. booster2ooo

    booster2ooo Junior Member

    Joined:
    Jan 17, 2009
    Messages:
    104
    Likes Received:
    30
    [ToAdmins/Mods] I just realized I posted in the WhiteHat section, please move this thread to BLACKHAT, thank you very much[/ToAdmins/Mods]

    Hello BHW !

    I recently read about a technique for getting .edu/.gov do follow backlinks. This method is known to viagra spammers and allows you to get **************** from .edu, .gov or any other TLD domain.

    The concept
    Nowadays, a lot of websites use statistics scripts. These scripts output information like the number of visitors, their OS, their browser and their REFERER. There are a lot of scripts that act like this; I'll take the example of Webalizer.
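    The whole idea can be sketched as a single curl call. Both URLs below are placeholders, and the assumption is that the stats page logs the Referer header and later publishes it:

```shell
# One simulated visit: request the stats page with a forged Referer header.
# Both URLs are placeholders; --max-time keeps a dead host from hanging things.
curl --silent --output /dev/null --max-time 5 \
     --user-agent "Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1" \
     --referer "http://yoursite.example/" \
     "http://stats.example.edu/webalizer/usage_201002.html" || true
echo "visit simulated"
```

    Repeat this enough times and your site climbs into the "Top Referrers" table that the stats script publishes.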



    How to find websites using Webalizer?

    As always, Google is going to be our friend in this process. Using the appropriate dorks, you'll find tons of websites using this script.

    The simple dork:
    With this dork, you'll get a list of websites using Webalizer. As you can see, we specify a date in the stats file name (inurl). You can change this date to get other results (e.g. inurl:usage_201002.html for February 2010).
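    The dork itself seems to have dropped out of this post; reconstructed from the search string used later in the thread (post #8 and the shell script), without the site: filter it would be:

```text
"Top * of * Total Referrers" inurl:usage_201002.html
```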

    The .edu dork
    As you can see, it's almost the same search request as the previous one, but with one more argument -> "site:edu". With this parameter, you'll only find .edu websites using Webalizer.
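    Reconstructed from the exact search quoted in post #8 of this thread:

```text
"Top * of * Total Referrers" site:edu inurl:usage_201002.html
```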

    The .gov dork:
    Same as the previous one, but using site:gov in order to find .gov websites.
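    Following the same pattern, the .gov variant would presumably be:

```text
"Top * of * Total Referrers" site:gov inurl:usage_201002.html
```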

    Other language dork
    You can use another key phrase to get backlinks from foreign-language websites. In this example, it's for French websites.

    Using those dorks and a Google results extractor, build a list of the URLs you want to spam.
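    Since the file name encodes the year and month, you can sweep a whole year of stats pages. A small sketch (the quoted phrase is the one used in the shell script later in this post):

```shell
# Print the .edu dork for each month of 2009; Webalizer names its output
# pages usage_YYYYMM.html. Feed these lines to your Google results extractor.
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
  printf '"Top * of * Total Referrers" site:edu inurl:usage_2009%s.html\n' "$m"
done
```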


    How to spam the referer?
    Here is the best part: the spam scripts. Below are two scripts that simulate visits to the website URLs saved in the previous step. I won't explain how to use them, it's quite obvious (they also allow you to use proxies).

    The PHP version:
    Code:
    <?php
     
    @set_time_limit(0);
    error_reporting(E_ALL | E_STRICT);
    ini_set('display_errors', true);
     
    function CurlSpam($proxy, $proxyprotocol, $referer, $spamsite) 
    {
       $useragent = 'Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1';
       $header = array(
    				"Accept: text/xml,application/xml,application/xhtml+xml, text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5",
    				"Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3",
    				"Accept-Charset: ISO-8859-1;q=0.7,*;q=0.7",
    				"Keep-Alive: 300"
                            );
     
        $ch = curl_init();
     
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
        curl_setopt($ch, CURLOPT_HEADER, true);
        curl_setopt($ch, CURLOPT_VERBOSE, true);	
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
        curl_setopt($ch, CURLOPT_TIMEOUT, 15);
        curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookieSpam');
        curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookieSpam');
        curl_setopt($ch, CURLOPT_URL, $spamsite);	
        curl_setopt($ch, CURLOPT_REFERER, $referer);
        curl_setopt($ch, CURLOPT_USERAGENT, $useragent );
        curl_setopt($ch, CURLOPT_HTTPHEADER, $header );
     
        if ( $proxy != '')
        {
    		curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
    		curl_setopt($ch, CURLOPT_PROXY, $proxy);
    		if ( $proxyprotocol == 'socks4') curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_SOCKS4);
    		else if($proxyprotocol == "socks5") curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_SOCKS5);
    		     else curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
        }	
     
        $response = curl_exec($ch);
     
        $error = curl_error($ch);
        if ( $error != "" )
        {
                $result = $error;
                return $result;
        }	
        $result = curl_getinfo($ch, CURLINFO_HTTP_CODE);
     
        curl_close($ch);
        unset($ch);
     
        return $result;
    }
     
     
    if ( (isset($_POST['NBrequetes'])) && (is_numeric($_POST['NBrequetes'])) && ($_POST['NBrequetes'] != '') ) 
    	$requetes = strip_tags($_POST['NBrequetes']);
    else 
    	$requetes = 5;
     
    $urlspam = array();
    foreach( file('URLspam.csv') as $val )
         array_push( $urlspam, trim( $val ) );
    $nburlspam = count($urlspam);
     
    if (ob_get_level() == 0) ob_start();
     
    if ( (isset($_POST['referer'])) && ($_POST['referer'] != '') ) 
    {   
      $referer = trim(strip_tags($_POST['referer']));
      $boucle = 0;
      while($boucle < $requetes) 
      {
    	for ($i = 0; $i < $nburlspam; $i++)
    	{
    		echo "Referrer spam pass " . ($boucle+1) . ": referrer = $referer for site $urlspam[$i] in progress...";	
    		sleep(5);
    		$res = CurlSpam('', '', $referer, $urlspam[$i]);
    		if ( $res != '200') 
    		{
    			echo "<br>Error: $res<br>";
    			exit();
    		}
    		ob_flush();
    		flush(); 
    	}
    	$boucle++;
      }
    }
     
    ob_end_flush();
     
    ?>
     
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>Spam Referer</title>
    </head>
    <body>
    <h1>Spam Referer</h1>
    <div>
    <form method="POST" action="<?php echo strip_tags($_SERVER['REQUEST_URI']) ;?>">
    <p>URL of the site to promote:</p>
    <input name="referer" type="text" size="100" value="<?php if (isset($_POST['referer']) ) {echo strip_tags($_POST['referer']);} ?>">
    <p>Number of requests to perform (5 by default)&nbsp;:&nbsp;
    <input name="NBrequetes" type="text" size="3" maxlength="3" value="<?php if (isset($_POST['NBrequetes'])) {echo strip_tags($_POST['NBrequetes']);} else {echo '5';} ?>"></p>
    <p><input type="submit" value="Go" name="go">
    <input type='button' value='Cancel' onclick='location.href="<?php echo strip_tags($_SERVER['REQUEST_URI']) ;?>"'></p>
    </form>
    </div>
    </body>
    </html>
    source: hXXp://www.seoblackout.com/2009/08/24/script-php-spam-referer/

    The SHELL version
    Code:
    #!/bin/bash
    # Variables
    nbResultat="20"
    header="User-Agent: Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1
    Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
    Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3
    Accept-Charset: ISO-8859-1;q=0.7,*;q=0.7
    Keep-Alive: 300
    Connection: keep-alive" # Headers copied from Firefox so we look like a browser (i.e. a human) instead of a bot
    url="http://www.google.fr/search?hl=fr&q=%22Top+*+of+*+Total+Referrers%22+site%3Aedu+inurl%3Ausage_200905.html&btnG=Recherche+Google&meta=&aq=f&oq=" # .edu sites
    monSite="http://www.google.fr" # Put the URL of your own site here; it will receive the backlinks
    tor="127.0.0.1:9050" # Default address of the Tor proxy (see http://doc.ubuntu-fr.org/tor for installation on Ubuntu/Debian)
     
    # Loop counters and miscellaneous
    j=0
     
    curl --cookie-jar cookie.txt --location --silent --header "$header" "http://www.google.fr" 1>/dev/null # Grab the Google cookie
    sleep 10
     
    # Collect the backlink targets
    while [ ! $j = $nbResultat ]
    do
    	myURL=$url"&sa=N&start=$j"
     
    	curl --cookie cookie.txt --silent --location --header "$header" --referer "http://www.google.fr/" "$myURL" -o tmp.html # Fetch the results page
    	lynx -dump -listonly tmp.html > tmp
    	sed 's/ http/\^http/g' tmp | tr -s "^" "\n" | grep http | grep -v google.fr | grep -v google.com | grep -v localhost | grep -v "search?q=cache" | grep -v "oi=translate" | grep -v youtube.com >> result # Remove Google cache and translate results
    	cat tmp.html # Debug: dump the raw results page
    	j=$(( $j + 10 )) # Increment j to move from results page to results page
    	myURL="" # Reset the variable
    	rm tmp.html tmp
    	sleep 10 # Pause the loop to avoid the Google captcha
    done
     
     
    # Now the file of backlink targets is ready, so the attack begins :-)
    for backlink in $(cat result); do 
    	curl --socks5 "$tor" --silent --header "$header" --referer "$monSite" "$backlink" # Hit the page with our forged referer
    	echo "Referrer spam for page $backlink in progress..." # Progress indicator
    done
     
    rm result cookie.txt
    source: hXXp://www.olivier-tassel.fr/outils/script-spam-referrer


    Final word / warning
    Note that this technique may have the opposite effect: you can get penalized for spam, so don't abuse it.


    Please, if you like it, thanks + rep ;)

    Njoy
     
    • Thanks Thanks x 7
    Last edited: Feb 24, 2010
  2. lhurey

    lhurey Newbie

    Joined:
    Sep 30, 2009
    Messages:
    46
    Likes Received:
    219
    Do I have to replace $proxy, $proxyprotocol, $referer and $spamsite, then upload the script to my server?

    sorry noob here.
     
  3. booster2ooo

    booster2ooo Junior Member

    Joined:
    Jan 17, 2009
    Messages:
    104
    Likes Received:
    30
    The vars you are referring to are the arguments of the function. You need to specify them when you USE the function:
    Code:
    function CurlSpam($proxy, $proxyprotocol, $referer, $spamsite)
     
  4. booster2ooo

    booster2ooo Junior Member

    Joined:
    Jan 17, 2009
    Messages:
    104
    Likes Received:
    30
    Hum, I meant:
    Code:
    $res = CurlSpam('', '', $referer, $urlspam[$i]);
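    For example, to route the requests through a local SOCKS5 proxy (the address here is just a placeholder; see the proxy handling inside CurlSpam), the call would become:

```php
$res = CurlSpam('127.0.0.1:9050', 'socks5', $referer, $urlspam[$i]);
```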
     
  5. james54

    james54 Newbie

    Joined:
    Jan 17, 2010
    Messages:
    15
    Likes Received:
    0
    Does this actually work?

    I mean, will these links be useful?

    Just from browsing through a few domains, I have also seen lots of porn sites already doing this ... :) Amazing - Thanks for the share!
     
  6. booster2ooo

    booster2ooo Junior Member

    Joined:
    Jan 17, 2009
    Messages:
    104
    Likes Received:
    30
    There is only one way to know: give it a try. Those links are **************** from .edu/.gov domains, so I assume they might be useful. If porn/viagra websites use this technique, it should work. BUT keep in mind that big G doesn't like spammers.
     
  7. amb5059

    amb5059 Newbie

    Joined:
    May 29, 2009
    Messages:
    40
    Likes Received:
    42
    This doesn't work anymore... it seems the links in the "referrer" section are no longer links. It's just text.
     
  8. booster2ooo

    booster2ooo Junior Member

    Joined:
    Jan 17, 2009
    Messages:
    104
    Likes Received:
    30
    Some websites just list referers as plain text, but most of them still use links. e.g.: hXXp://www-k12.atmos.washington.edu/k12/webalizer/usage_201002.html#TOPREFS or hXXp://gisday.sr.unh.edu/reports/usage_201002.html#TOPREFS
    (#3 and #4 for "Top * of * Total Referrers" site:edu inurl:usage_201002.html )
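    Whether a given stats page still helps comes down to whether the referrer entries are rendered as anchors. A minimal sketch of that check, using made-up sample HTML in place of a real downloaded usage_YYYYMM.html page:

```shell
# Does a saved stats page render referrers as clickable links (older
# Webalizer) or as plain text (newer versions)? The sample HTML below is
# invented; in practice you'd curl the real page to a file first.
cat > sample_usage.html <<'EOF'
<h2>Top 30 of 1234 Total Referrers</h2>
<td><a href="http://somesite.example/">http://somesite.example/</a></td>
EOF
if grep -q '<a href=' sample_usage.html; then
  echo "referrers are links"
else
  echo "referrers are plain text"
fi
rm sample_usage.html
```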
     
  9. richboy

    richboy Junior Member

    Joined:
    Mar 5, 2009
    Messages:
    114
    Likes Received:
    35
    Location:
    Cashville
    I believe this method has been around a looong time: the use of Webalizer, AWStats, etc. That's the main concept behind PRStorm, which is on here free somewhere.
     
  10. superspiderman

    superspiderman BANNED BANNED

    Joined:
    Nov 12, 2009
    Messages:
    311
    Likes Received:
    62
    Yeah, a quick check of even the most obscure .gov domains suggests that this technique has already been used heavily by porn sites.
     
  11. india

    india Junior Member

    Joined:
    Aug 31, 2009
    Messages:
    104
    Likes Received:
    8
    Occupation:
    virtual assistant
    Location:
    India
    Home Page:
    Thanks for the great tips, but wouldn't ScrapeBox be better?
     
  12. dvishnu

    dvishnu Junior Member

    Joined:
    Dec 8, 2008
    Messages:
    138
    Likes Received:
    145
    Occupation:
    CEO of Bright Bridge
    Location:
    I live in Internet :) i.e World
    Huh.

    You're sharing the secret after Webalizer stopped giving hyperlinks. I was using it last year.

    In v2.20 they no longer output a hyperlink; the referrer is placed as plain text.

    Search any query he gave, click the first result, and look at the top referrers: they won't be clickable URLs.

    Anyhow, thanks.

    :(
     
  13. Cloaks

    Cloaks Regular Member

    Joined:
    Mar 20, 2010
    Messages:
    298
    Likes Received:
    90
    Thanks a lot. I'm going to program a bot in Java that generates URLs to spam and then spams them. PM me if you're interested.

    EDIT: Ahhh, dvishnu is right. However, now we just have to find out exactly which versions still show clickable URLs and target those. There are also other stats systems out there; we'll just have to find them!

    By the way: does it pass any backlink juice if my site is on their page as plain text, not in URL form? Any at all?
     
    Last edited: Mar 28, 2010