
Getting Subdomains On The Fly Indexed Faster - A PHP Tutorial

Discussion in 'Black Hat SEO' started by advancedfuture, Dec 29, 2009.

    So I've been setting up a campaign using subdomains on the fly, and I wanted to get them indexed a LOT faster. Having several hundred subdomains makes creating sitemaps a pain in the ass to do manually, not to mention adding each subdomain manually in Webmaster Tools is also a pain.

    This is how I set up my server. All pages are dynamically generated content pulled from the database, so the directory structure is pretty simple.
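
    Quick note on the server side: subdomains on the fly need a wildcard DNS record (*.mydomain.com) pointing at your box, plus a catch-all vhost so Apache answers for all of them. Something like this (docroot and port are just examples, adjust for your setup):

    	<VirtualHost *:80>
    		ServerName mydomain.com
    		ServerAlias *.mydomain.com
    		DocumentRoot /var/www/mydomain
    	</VirtualHost>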

    Now, the first thing I did was make it so my .xml and .txt files are processed as PHP by adding these two lines to my .htaccess:

    AddType application/x-httpd-php .txt
    AddType application/x-httpd-php .xml
    This sets us up so we can process these files as PHP. The reason we are doing this is that the Sitemap: directive in robots.txt requires a fully qualified domain name, so each subdomain's robots.txt needs to print its own sitemap URL.

    Now we can make our robots.txt:

    	<?php
    	// Serve as plain text, since Apache now runs .txt files through PHP
    	header('Content-Type: text/plain');
    	echo 'User-agent: *'."\n";
    	echo 'Sitemap: http://'.$_SERVER['SERVER_NAME'].'/sitemap.xml'."\n";
    	echo 'Disallow: /join.php'."\n";
    	echo 'Disallow: /confirm.php'."\n";
    	echo 'Disallow: /admin/'."\n";
    	echo 'Disallow: /payments/'."\n";
    	echo 'Disallow: /template/'."\n";
    	?>
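
    So for example, a request for http://humboldt.ca.mydomain.com/robots.txt comes back as:

    	User-agent: *
    	Sitemap: http://humboldt.ca.mydomain.com/sitemap.xml
    	Disallow: /join.php
    	Disallow: /confirm.php
    	Disallow: /admin/
    	Disallow: /payments/
    	Disallow: /template/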
    What this does is tell the crawler where to find the sitemap for whichever subdomain it is on. Now for the icing on the cake: we need to generate an XML sitemap for each subdomain we are trying to get pages indexed on. Since .xml files now run through PHP, we create our sitemap file.

    Below is an example of my sitemap.xml file. In my case it grabs the county and state from the subdomain, looks up the corresponding data in my database, and spits out all the URLs.

    	<?php
    	include 'dbconnect.php';
    	//Get what subdomain we are on: strip the county and state variables
    	//from the URL, and if it is www, set them to Humboldt County, CA
    	$domain = explode('.',str_replace('.mydomain.com','',$_SERVER['SERVER_NAME']));
    	$county = $domain[0];
    	$state = isset($domain[1]) ? $domain[1] : '';
    	//If we are on the root of the domain, set the variables to Humboldt, CA
    	if ($county == "www") {
    		$county = "humboldt";
    		$state = "ca";
    	}
    	//Include major functions to control meta text, meta description, page titles, page content
    	include 'functions.inc.php';
    	//This variable is used for displaying the county without the dashes
    	$county_slash = ucwords(strtolower(str_replace("-"," ",$county)));
    	//Generate the XML for whichever subdomain requested this file
    	header("Content-Type: text/xml");
    	echo '<?xml version="1.0" encoding="UTF-8"?>';
    	echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
    	$query = "SELECT DISTINCT city FROM zip_code WHERE county='$county_slash' AND state_prefix='$state' ORDER BY city ASC";
    	$results = mysql_query($query);
    	while ($row = mysql_fetch_array($results)) {
    		$city = str_replace(" ","-",$row['city']);
    		$query2 = "SELECT productname FROM sidenav_products ORDER BY productname ASC";
    		$results2 = mysql_query($query2);
    		while ($row2 = mysql_fetch_array($results2)) {
    			$product = str_replace(" ","-",$row2['productname']);
    			echo '<url>';
    			echo '<loc>http://'.$county.'.'.$state.'.mydomain.com/products/'.$city.'/'.$product.'</loc>';
    			echo '</url>';
    		}
    	}
    	echo '</urlset>';
    	?>
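
    For the humboldt.ca subdomain the output ends up looking like this (the city and product values here are made up, they come from whatever is in your database):

    	<?xml version="1.0" encoding="UTF-8"?>
    	<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    	<url>
    	<loc>http://humboldt.ca.mydomain.com/products/Eureka/Widgets</loc>
    	</url>
    	...
    	</urlset>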
    Anyway, it's a general idea for everyone playing around with subdomain spamming. You can use this to generate unique sitemaps and robots.txt files so all your subdomain pages get indexed much faster by the Big 3. This eliminates the need to set up individual sitemaps and robots.txt files for each subdomain, and also eliminates the need to submit your sitemap manually for each one. (This only works if your site structure is the same for all subdomains.) Note: you may have to custom tailor these things for your individual application, but it works really, really well for me.
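
    One more trick so you don't have to sit around waiting for the engines to re-read robots.txt: ping them with each sitemap URL after you generate it. Rough sketch below; the ping URLs are the ones Google and Bing publish (double-check them yourself), and the $subdomains array is just a stand-in for whatever you'd pull from your own database:

    	<?php
    	// Stand-in list - in practice, build this from your database
    	$subdomains = array('humboldt.ca', 'mendocino.ca', 'del-norte.ca');
    	foreach ($subdomains as $sub) {
    		$sitemap = urlencode('http://'.$sub.'.mydomain.com/sitemap.xml');
    		// Sitemap ping endpoints as published by Google and Bing
    		file_get_contents('http://www.google.com/ping?sitemap='.$sitemap);
    		file_get_contents('http://www.bing.com/ping?sitemap='.$sitemap);
    	}
    	?>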
    Last edited: Dec 29, 2009