
7 Steps to Web Server Health

Discussion in 'White Hat SEO' started by barsha, Jun 25, 2008.

  1. barsha

    barsha Registered Member

    Apr 9, 2008
    7 Steps to Web Server Health

    Your web server is probably the one area of your marketing campaign that gets ignored the most. Marketers often don't consider the server a vital part of the process. But it is. According to my testing, you must have the right setup to receive maximum value from your SEO and SEM efforts.

    Let's begin.

    1 - Update Your Server Software

    The upgrade can usually occur without causing too much disruption and will generally take about two hours to complete with at least one server restart. Do yourself a favor and keep your server updated to ensure it is running at its optimum level.

    If you are on a shared hosting platform, get off of it. Get your own dedicated server now, not later. Your host may say, "You don't want your own server; you'll have to manage it yourself."

    Is that really true, or is it because they can jam about 20,000 domains on one server? If they charge $10.00 a month for hosting, that is $200,000.00 a month in revenue. However, a dedicated server runs about $200.00 a month for them. See the difference? Oh, and you don't have to manage it yourself either; just get a killer admin. I recommend Easy Server Management. It will run you $99 a month. That's it.

    Recommended Hosts:

    LiquidWeb.com; Pair; Pugmarks.net

    2 - Verify Your IP Addresses are Clean

    How? To do a full and complete check on any IP address, you want to check the main blacklist sources.

    You never want to do the following:

    You: 'Are all of these IP addresses you just assigned me clean?'

    Host: 'Yes, of course. We check and cross check all the IP addresses before assigning them.'

    You: 'Great, thanks!'

    If you do the above, you will be committing a cardinal webmaster sin. Never take their word for it. Trust, but verify. This is YOUR business. I get assigned "dirty" IPs from The Planet on a regular basis, which is why they are NOT on the recommended list above.

    To verify the IP addresses assigned to you are clean, use the following tools:

    Black List Alert - Enter your domain or IP address along with the CAPTCHA and see the results from many of the top blacklists.

    Open RBL - Checks against the most popular blacklists. Once the check is complete, Open RBL will give you a quick overview of each blacklist and the status of the IP in question for each.

    Because Open RBL doesn't check against every blacklist, it is worth running a few standalone blacklist checkers as well. Be aware that standalone checkers tend to be 'trigger happy' when blacklisting IPs.
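
    If you want to script the check yourself, a DNSBL can be queried directly over DNS: reverse the IP's octets, prepend them to the blacklist's zone, and look the name up. A listed IP resolves to a 127.0.0.x address; an unlisted one does not resolve at all. Here is a minimal PHP sketch (the isBlacklisted helper is my own illustration, not part of any library; zen.spamhaus.org is a real public blacklist, though it may not answer queries routed through some large public DNS resolvers):

    <?php
    // Reverse the octets, append the blacklist zone, then try to resolve it.
    // Listed IPs resolve to 127.0.0.x; unlisted IPs do not resolve at all.
    function isBlacklisted($ip, $dnsbl = 'zen.spamhaus.org') {
        $query = implode('.', array_reverse(explode('.', $ip))) . '.' . $dnsbl;
        // gethostbyname() returns the query string unchanged on failure
        return gethostbyname($query) !== $query;
    }

    var_dump(isBlacklisted('127.0.0.2')); // Spamhaus test address, always listed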

    3 - Secure Your Server

    Every server has vulnerabilities; some are just easier to exploit than others. In 2007, I survived a brute-force attack against a server I had run for five-plus years. It was the first successful entry, and I had the host and the Feds calling me and making threats. Ouch!

    Cross-Site Scripting (XSS) was used.

    There were four reasons why this happened:

    I was running a severely out-of-date control panel. An update would have required a complete reformat of the drive, and I didn't want to invest the time to re-set up all the accounts and databases. I was lazy and I paid for it.
    I got lazy and assigned the username "admin" to one of the accounts.
    I used a password that was found in the dictionary, which is dumb!
    I was not monitoring the server by checking the error log files. If I had, I would have noticed the large file size.
    Here's how to protect your server:

    Never use the following for usernames: admin, user, host.
    For passwords, never use terms found in the dictionary, names, birthdays, events, predictable number sequences, etc. If you're stuck generating random passwords, use PC Tools Secure Password Generator (or roll your own; see the sketch after this list).
    Make sure you are using the latest version of your control panel. If you don't know, ask your host. If it is old, upgrade.
    Even the latest security patches can't fully protect you. Make sure you change your passwords on a monthly basis.
    Keep all your scripts up-to-date.
    If you use Joomla, be aware it is one of the most frequently exploited platforms in the CMS arena. They regularly post security updates, but you have to install them in order to protect your server. This doesn't mean you shouldn't use Joomla; you just need to verify (at least weekly) that you have the latest security updates installed.
    Get a reliable server admin to install security updates and do the "little things" to make sure your server is protected.
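
    If you would rather generate passwords in code, here is a minimal PHP sketch; the generatePassword helper is hypothetical, not from any library:

    <?php
    // Build a random password from a mixed character set
    // (easily-confused characters such as l, 1, O and 0 are omitted).
    function generatePassword($length = 16) {
        $chars = 'abcdefghijkmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23456789!@#$%^&*';
        $password = '';
        for ($i = 0; $i < $length; $i++) {
            // random_int() (PHP 7+) is cryptographically secure
            $password .= $chars[random_int(0, strlen($chars) - 1)];
        }
        return $password;
    }

    echo generatePassword() . "\n";
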
    4 - .htaccess File (Apache Server)

    Using the .htaccess file to its full capacity is a good way to decrease the load on your server.

    Non-www Protection: Every webmaster should have non-www protection on their site. It ensures you are not hit by Google's internal duplicate content penalty, which causes your site to be discounted.

    How it Happens: Let's say that Site B links to Site A without the "www" (http://sitea.com/). When Google's spider hits the link, it will index the non-www version of the page. The result: Google will have two copies of the page in its index and will view them as coming from two different sites.

    Note: If the site's internal linking structure uses relative links instead of absolute links, then the issue becomes site-wide.

    Both versions of the page/site will be discounted and may be de-indexed. This "Slow Site Death" can be painful to watch, especially if it is your site.

    The Fix: This cannot be fixed by simply going into your Google Webmaster account (Dashboard > Tools > Set preferred domain).

    A lot of webmasters think this fixes the problem, but it only applies to how the domain appears visually in the SERPs. It has nothing to do with how Google indexes your site or how your site appears in a browser.

    Here is the real fix:

    1) Find the .htaccess file on your server. It usually lives in the "root" (where your home page is). If you can't see it, it is probably hidden: in your FTP client, right-click in the window showing the files on your server, choose "Filter", find the area designated as "Remote Filter" and add "-a" (which tells the server to list hidden files).

    2) Save a copy of the file locally.

    3) Add the code below. Be aware that Notepad often corrupts the file, and its default save format is a ".txt" file, which will not work. I prefer to edit the file directly on the server.

    RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP_HOST} !^www\.stompernet\.com$ [NC]
    RewriteRule ^(.*)$ http://www.stompernet.com/$1 [L,R=301]

    If you have subdomains, use the following code instead:

    RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP_HOST} ^stompernet\.com$ [NC]
    RewriteRule ^(.*)$ http://www.stompernet.com/$1 [L,R=301]

    What if you want the exact opposite? You want the www URLs to redirect to the non-www version? Use the code below:

    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^www\.stompernet\.com$ [NC]
    RewriteRule ^(.*)$ http://stompernet.com/$1 [R=301,L]

    4) Swap out my domain for yours.

    Save and upload the file.

    Open a browser and type "yourdomain.com" without the "www" and it should redirect you to "http://www.yourdomain.com". If you still have problems, call your host and make sure the Apache rewrite module (mod_rewrite) is turned on.
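
    You can also verify the redirect from a script. A minimal PHP sketch (stompernet.com stands in for your own domain):

    <?php
    // Fetch the response headers for the non-www URL; a working rule shows
    // a 301 status followed by a Location header pointing at the www version.
    print_r(get_headers('http://stompernet.com/'));
    // Look for something like:
    //   [0] => HTTP/1.1 301 Moved Permanently
    //   ... Location: http://www.stompernet.com/ ...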

    For Windows servers: Installing non-www Domain Redirect on an IIS Server

    Prevent "Hotlinking": Hotlinking is the hijacking of an image, JavaScript, CSS, etc. The abusing webmaster serves your content/images without having to use their own bandwidth; every request hits your server instead.

    To prevent hotlinking, use this code in your .htaccess file:

    RewriteEngine on
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://stompernet.com.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http://www.stompernet.com.*$ [NC]
    RewriteRule \.(gif|jpg|png|js|css)$ - [F]

    Those who attempt to hotlink will be blocked or served different content. Be careful if you decide to do what Jake Baillie did and swap out hotlinked images for porn. You may have angry webmasters calling you, or worse, their attorneys.

    Blocking Bad Bots: The definition of a "bad bot" is a spider or program that takes more than it gives. One example is Ask.com, which takes a ton of bandwidth, but you get very few referrals from their search engine. In all cases, bad bots raise your bandwidth and resource usage.

    At WebmasterWorld 2006, Mike Grehan stated that because bad bots never obey the robots.txt file, you should use the .htaccess file to deny them.

    The tricky part is identifying the bad bot. The following code will deny the listed bots access to your server:

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger
    RewriteRule ^.* - [F,L]

    5 - Reduce Size of Images

    One of the best ways to improve load time and decrease bandwidth is to reduce the size of your graphics. The best tool out there is OptiView's Site Scan Service. I use the Professional version ($80.00 a year). They work on a "token" system, so you are limited in the number of images you can optimize online. However, if you optimize your graphics from your hard drive, you can optimize an unlimited number.

    What's killer about OptiView's Site Scan Service is that they give you 3-4 different versions of the graphic with the percentage of reduction, so you know how much you will save in load time. I suggest you use the first or second version, as the further down you go, the worse the resolution gets.

    Let's do an example, using my site.

    In my test, I had a total of 41 images on the home page totaling about 171K. That's a pretty heavy load. So I reviewed all the graphics OptiView pegged as "bloated" and optimized 23 of the 41. Here's the difference:

    It dropped to 124K, a 47K (27%) difference. Along with the Expires Header (discussed later), this will make a huge difference to your server.

    6 - Install and Use YSlow

    The main problem with most websites is that the response time between the request being sent (by the browser) and the data being received (from the server) can be high. So high, at times, that the page is either non-responsive or loads incorrectly. Using YSlow, you can improve response time by 20-50%.

    Note: This will not only help you improve your own websites; if you consult or have clients, it will let you get to the root of many issues quickly and, in so doing, impress your client or prospect immediately.

    What is YSlow? It is a new developer tool from Yahoo! that works with Firebug, an extension for Firefox. It will NOT work with IE.

    1. Install Firebug.

    2. Install YSlow.

    3. Look at the bottom right of your browser window. You should now see the YSlow indicator in the status bar.

    The green checkmark means the site is "okay". If there were a problem, a "red x" would appear and it would indicate how many errors there were.

    The letter "F" refers to the letter grade the site receives overall. It is based on the grade system, so ?F? means ?failure?. Each of the 13 rules below is weighted according to importance. YSlow then analyzes the page, deducts points for each infraction, applies a grade to each rule and gives an overall grade and score for the page. And finally, it posts the total size of the site.

    NOTE: Before making ANY changes to your actual site, take a backup of your site, and then make a test copy of your home page and work on that first. That way, if you screw up, it won't affect your home page.

    According to my testing, there isn't value in all 13 of the recommendations. I will go over each one, what it means, and what you should do.

    1) Making Fewer HTTP Requests: With all the CSS, JavaScript, iFrames, Flash, third-party scripts, etc., HTTP requests can get pretty heavy. Look to combine JavaScript files, and take a hard look at whether the CSS backgrounds are really needed in your design.

    Below are some common "web design myths" that are still believed:

    Myth: "Slice" up a large image into smaller images.
    Instead: Optimize the large image and use an Image Map for any hyperlinks. Instead of many server requests for the image, there will be one. Over 80% of users are on broadband, so the issue here is not load time, it is the number of requests.

    Myth: Use many smaller scripts instead of one large one.
    Instead: Having multiple server requests is a bad idea. Combine the smaller scripts into a single script to make just one request.

    Recommendation: Get your HTTP requests reduced. Work on this section until you have an "A".
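
    For reference, an image map replaces several sliced images with a single image plus clickable regions. A minimal sketch (header.jpg, the dimensions, and the URLs are placeholders):

    <!-- One optimized image, one HTTP request, three clickable regions -->
    <img src="header.jpg" width="600" height="100" usemap="#nav" alt="Navigation">
    <map name="nav">
      <area shape="rect" coords="0,0,200,100" href="/products/" alt="Products">
      <area shape="rect" coords="200,0,400,100" href="/services/" alt="Services">
      <area shape="rect" coords="400,0,600,100" href="/contact/" alt="Contact">
    </map>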

    2) Using a Content Delivery Network (CDN): According to Yahoo!, deploying content across multiple, geographically dispersed servers can make your pages load faster from the user's perspective. Personally, I don't see the need for a CDN; they are only needed for extremely large websites. To me, it makes more sense to have a dedicated server close to an OC3 so there are as few "hops" as possible when your server is accessed.

    Recommendation: Ignore this section.

    3) Adding an Expires Header: The Expires header lets the browser know how long it can hold your site's files in its cache to serve to repeat visitors instead of downloading everything again. I can hear you now: "Fine Jerry, but browsers do that already, so why do I need this?" Because it prevents browsers from flushing the cache on their own.

    I got an "F" in this area for the Web Marketing Now home page, mainly because I wasn't using the header. Testing produced the code below:

    <FilesMatch "\.(ico|jpg|jpeg|png|gif|js|css)$">
    Header set Expires "Wed, 1 Oct 2008 22:00:00 GMT"
    </FilesMatch>

    A few things you want to make sure you know about the above:

    The extensions listed in the "FilesMatch" line are the ones that had issues in the report. If you have other extensions besides those, add them.
    You can set the date to any date you wish. YSlow just needs it to be at least 48 hours and 1 second in the future.
    If you get a Server Error 500 or your site becomes unresponsive from adding the code, remove it and inform your admin or your host that your server needs an Apache update.
    Okay, so after adding this code, what happened to my grade? I earned an "A" in this section and jumped from an F (59) to a B (80) overall.

    4) Gzip Components: Yahoo! states that using gzip will reduce load time by 70%. If you don't know already, gzip is a software application for file compression. According to my testing, the time saved in load time was measured in milliseconds. Not seconds, milliseconds. I found no significant savings in load time.

    If you want to use GZip on your site, use the following:

    Apache 1.3

    mod_gzip_on Yes (this directive enables mod_gzip on your domain)

    In my testing, the real savings in load time came from compressing JavaScript and CSS. Here is the code:

    mod_gzip_item_include file \.js$
    mod_gzip_item_include mime ^application/x-javascript$
    mod_gzip_item_include file \.css$
    mod_gzip_item_include mime ^text/css$

    For more detailed information, here is a good resource I found useful as a reference for my testing.

    Apache 2.x

    Adding compression in Apache 2.x is a lot easier; the mod_deflate module takes just one line:

    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript

    Here is the direct reference located at the Apache site.

    This is optional as there was only a little load time saved.

    5) Put CSS at the Top: Testing confirms that moving CSS to the document HEAD lets pages load faster because it allows the page to render progressively. Putting the CSS at the bottom of the page blocks the browser's ability to progressively load the page. In fact, IE will show a blank page until the entire document loads. Firefox will force a screen redraw upon load completion, which often appears as an annoying flash of the screen.

    I do recommend that you move CSS to the Head section of the page.

    6) Put JS at the Bottom: This is for the same purpose as #5. You want your JavaScript to appear at the bottom of the page to assist in the progressive rendering of the page in the browser.

    Also, scripts block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. However, while a script downloads, the browser won't start any other downloads, regardless of whether they reside on different hostnames.

    In some cases it's not easy to move scripts. If, for example, a script uses document.write to insert part of the page's content, it can't be moved lower. My testing showed there wasn't a strong need to do this. However, if you are loading a lot of assets, this can "load balance" the page. This is optional.
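
    To make rules 5 and 6 concrete, here is a minimal page skeleton (styles.css and menu.js are placeholder filenames):

    <html>
    <head>
      <title>Example</title>
      <!-- CSS in the HEAD so the browser can render progressively -->
      <link rel="stylesheet" type="text/css" href="styles.css">
    </head>
    <body>
      <p>Page content here...</p>
      <!-- Scripts last, so they don't block rendering or parallel downloads -->
      <script type="text/javascript" src="menu.js"></script>
    </body>
    </html>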

    7) Avoid CSS Expressions: CSS expressions are a powerful, and dangerous, way to set CSS properties dynamically. As an example, the background color could be set to alternate every hour using an expression:

    background-color: expression( (new Date()).getHours()%2 ? "#B8D4FF" : "#F08A00" );

    This is optional, as CSS expressions aren't often used.

    8) Making JavaScript and CSS External: If you have the same JavaScript for a menu and the same CSS on each page, not only is that taking up space on your server, your transfer rate will be higher than it needs to be, and you are cluttering up Google's index. Move JavaScript and CSS to external files.

    While external CSS and JS files cause more HTTP requests, they are cacheable, allow for smaller HTML files overall, and let the search engine bots index your pages faster. For almost every site I have visited, YSlow posts an "n/a" under this section.

    I highly recommend making CSS and JavaScript external files and then excluding those folders in your robots.txt file so Google doesn't index them. A sample robots.txt is below.
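
    A minimal sketch, assuming your external files live in /js/ and /css/ directories (adjust the paths to match your own layout):

    User-agent: *
    Disallow: /js/
    Disallow: /css/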

    9) Reduce DNS Lookups: The Domain Name System (DNS) maps hostnames to IP addresses, just as phonebooks map people's names to their phone numbers. It typically takes 20-120 milliseconds for the DNS to deliver the IP address. The browser can't download anything until it is completed.

    I highly recommend that you work to get an "A" in this section and limit DNS lookups, not only to lower your server load, but also to increase usability and visibility through your PPC traffic. You will be glad you did.

    10) Minify JavaScript: To minify, remove unnecessary characters from the code to reduce the size of the script and improve load times. You can remove all comments, unneeded whitespace characters (space, newline, and tab), etc. It is safe and fairly straightforward. Two popular tools for minifying JavaScript code are JSMin and the YUI Compressor.
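
    As an illustration, here is a small (hypothetical) function before and after minification:

    // Before (readable, with comments and whitespace):
    function toggleMenu(id) {
        // Show or hide the element with the given id
        var el = document.getElementById(id);
        el.style.display = (el.style.display == 'none') ? '' : 'none';
    }

    // After minification (comments and extra whitespace stripped):
    function toggleMenu(id){var el=document.getElementById(id);el.style.display=(el.style.display=='none')?'':'none';}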

    If you gzip your scripts, minifying them will still reduce the size by around 5%.

    A side benefit: if someone hijacks your minified script, it will be harder to read and edit for anyone who isn't an expert in JavaScript.

    In my testing, while it was easy to do, it didn't give any real measurable advantage. This is an optional step.

    11) Avoid Redirects: YSlow states that redirects slow down the user experience. Please. Don't buy into statements like that. I test redirects all day long to ensure there are no delays. Just remember one rule: if you do a redirect, ensure the back button works correctly. (Reference on Redirects from W3C.)

    In my testing, it is better to ignore the grade here and focus instead on warnings given by Google in their Webmaster Tools area.

    12) Removing Duplicate Scripts: Duplicate scripts mean duplicate HTTP requests if the script is external and not cacheable. If it is cacheable, extra HTTP requests occur when the user reloads the page.

    One way to avoid accidentally including the same script twice is to implement a script management module in your templating system. In an HTML page, a script is included with the script tag:

    script type="text/javascript" src="menu_1.0.17.js">

    An alternative in PHP would be to create a function called insertScript:
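
    The original code sample did not survive this repost, so here is a minimal sketch of what such a function could look like (my illustration, not Yahoo!'s exact code):

    <?php
    // Remember which scripts have been emitted so the same file is never
    // included twice, no matter how many template fragments request it.
    $insertedScripts = array();

    function insertScript($file) {
        global $insertedScripts;
        if (isset($insertedScripts[$file])) {
            return; // already on the page; skip the duplicate request
        }
        $insertedScripts[$file] = true;
        echo '<script type="text/javascript" src="' . htmlspecialchars($file) . '"></script>' . "\n";
    }

    insertScript("menu_1.0.17.js");
    insertScript("menu_1.0.17.js"); // second call outputs nothing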

    I highly recommend ensuring your pages are free of duplicate scripts.

    13) Configure ETags: Entity tags (ETags) are a mechanism that web servers and browsers use to determine whether the component in the browser's cache matches the one on the server. ETags are more flexible than the last-modified date. Here is an example:

    HTTP/1.1 200 OK
    Last-Modified: Tue, 12 Dec 2007 03:03:59 GMT
    ETag: "10c24bc-4ab-457e1c1f"
    Content-Length: 12195

    The problem with ETags is that they are typically constructed using attributes that make them unique to a specific server hosting a site. That means when the browser gets the original component from one server and later tries to validate that component against a different server, the ETags won't match. By default, both Apache and IIS embed data in the ETag that dramatically reduces the odds of the validity test succeeding across multiple servers.

    According to my testing, it takes A LOT of work to configure and reconfigure ETags properly. Removing them is easier and will also speed up your server. Here is the code:

    Header unset ETag
    FileETag None

    That's the rundown on YSlow. Nearly 100 hours of research distilled into a few small recommendations. You can probably go through all of this in about an hour, start to finish.

    Note: If you have issues with third-party scripts or images in YSlow, the fix can be as simple as updating your vendor code. When in doubt, contact the vendor for updated code that complies with YSlow.

    7 - Maintain

    As with a car, a server that is not maintained may eventually result in a business breakdown. Maintaining an optimal server, one that serves valid information to the search engines for indexing and loads your site quickly and efficiently for your users, is vital. The payoff is increased usability of your site and possibly a boost in conversions through organic and paid search.

    Check List

    Update Server
    Verify IP Address is Clean
    Secure Your Server
    Create and Maintain .htaccess File
    Non-www Protection
    Prevent Hotlinking
    Block Bad Bots
    Reduce Size of Images
    Install and Use YSlow
    Make Fewer HTTP Requests - Recommended
    Use a Content Delivery Network - Optional
    Add an Expires Header - Recommended
    Gzip Components - Optional
    Put CSS at the Top - Recommended
    Move Scripts to the Bottom - Optional
    Avoid CSS Expressions - Optional
    Make JavaScript and CSS External - Recommended
    Reduce DNS Lookups - Recommended
    Minify JavaScript - Optional
    Avoid Redirects - Don't Break the Back Button...
    Remove Duplicate Scripts - Recommended
    Configure/Remove ETags - Recommended

    Best Regards,

    Jerry West
  2. typo619

    typo619 Newbie

    May 16, 2008
    Jerry, great post. Thanks for schooling me on the ways of the server.
  3. marketraise

    marketraise Newbie

    May 7, 2008
    Thank you very much for this informative post.
  4. hotdogs

    hotdogs Newbie

    Jun 30, 2008
    Thanks! This has already helped me out greatly and cut my server load by nearly 50%! :)
  5. Cleaner007

    Cleaner007 Newbie

    Jul 10, 2008
    If you don't know how, use Clone Remover (http://www.moleskinsoft.com/). It's the best option. I've been using it for 5 months.
  6. anilkumarpandey

    anilkumarpandey Newbie

    Apr 25, 2010
    Thanks Jerry. Very informative on how to minimize page load time.
  7. mazgalici

    mazgalici Supreme Member

    Jan 2, 2009
    Thanks, I think you must have spent at least an hour writing this. I really appreciate it!
  8. Blog Rider

    Blog Rider Registered Member

    Jun 14, 2011
    Thanks for sharing your experience with this. The Feds really called you? Wow. I just started with a VPS, so this information is very useful.