crawling killing my site?

Discussion in 'Black Hat SEO' started by MusicDisk, Aug 8, 2012.

  1. MusicDisk

    MusicDisk Junior Member

    Joined:
    Jun 29, 2012
    Messages:
    118
    Likes Received:
    2
Hello, I have a big website with around 50 million pages, and Google, Bing, and other search bots are killing my site.. it's very slow now, pages take up to 1 min to load. Anyone have an idea what I can do? I'm using Apache, and my site is coded in PHP/MySQL.
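One quick mitigation for heavy bot traffic is to ask the crawlers to slow down via robots.txt. Bingbot and most other well-behaved bots honor `Crawl-delay` (Googlebot does not — its rate is set in Google Webmaster Tools instead). A minimal sketch; the delay value is an assumption to tune:

```
# robots.txt — ask bots to wait between requests
# (Crawl-delay is honored by Bingbot, Yandex, and others, NOT by Googlebot;
#  set Googlebot's rate in Webmaster Tools instead)
User-agent: *
Crawl-delay: 10
```

This doesn't fix the underlying slow pages, but it can take immediate pressure off the server while you track down the real bottleneck.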
     
  2. MusicDisk

    MusicDisk Junior Member

    Joined:
    Jun 29, 2012
    Messages:
    118
    Likes Received:
    2
    7898 mysql 20 0 1143m 40m 5932 S 174.6 1.1 3:46.24 mysqld


MySQL is slowing down the site :[ Can anyone help me? Maybe it's a bad query. Willing to pay $100.. PM me your Skype.
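To find a bad query, the usual first step is enabling MySQL's slow query log. A minimal my.cnf sketch — the threshold and log path are assumptions, adjust to the actual server:

```
# my.cnf — log every query that takes longer than 2 seconds
[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/mysql-slow.log
long_query_time     = 2
log_queries_not_using_indexes = 1
```

After restarting MySQL and letting it run under load for a while, `mysqldumpslow /var/log/mysql/mysql-slow.log` summarizes the worst offenders.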
     
  3. SmartMan

    SmartMan BANNED

    Joined:
    Jul 25, 2012
    Messages:
    673
    Likes Received:
    1,244
    I believe your website is hosted on a dedicated server. Switch to nginx. You can also try Varnish Cache if you want to stay with Apache. IMO, you'll get better replies if you post this on WHT.
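One common way to apply this advice without replacing Apache entirely is putting nginx in front of it as a caching reverse proxy, so repeated bot hits for the same page are served from the cache. A sketch, assuming Apache has been moved to port 8080 (the port, paths, and cache sizes here are all assumptions):

```
# nginx.conf fragment — nginx caches pages and proxies misses to Apache
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pages:50m inactive=10m;

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass        http://127.0.0.1:8080;   # Apache, moved off port 80
        proxy_set_header  Host $host;
        proxy_set_header  X-Real-IP $remote_addr;
        proxy_cache       pages;
        proxy_cache_valid 200 10m;   # serve a cached copy for 10 minutes
    }
}
```

With 50 million mostly static pages, even a short cache TTL means most bot requests never reach PHP/MySQL at all.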
     
  4. MusicDisk

    MusicDisk Junior Member

    Joined:
    Jun 29, 2012
    Messages:
    118
    Likes Received:
    2
    Yes, but MySQL is using so much CPU.. that's the reason..
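When mysqld is pegging the CPU, you can usually catch the culprit live from the MySQL console and then check whether it uses an index. A diagnostic sketch — the table and column names below are made up for illustration, not from the thread:

```
-- While the CPU is pegged, see what MySQL is actually running:
SHOW FULL PROCESSLIST;

-- Then check whether the suspect query can use an index
-- (hypothetical table/column; substitute the real query):
EXPLAIN SELECT * FROM pages WHERE slug = 'some-page';
```

If `EXPLAIN` shows `type: ALL` (a full table scan) on a large table, adding an index on the filtered column is typically the fix.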
     
  5. Zapdos

    Zapdos Power Member

    Joined:
    Oct 22, 2011
    Messages:
    597
    Likes Received:
    708
    Location:
    Eastern North Carolina
    If you're not using a caching system, serving the same content over and over (even though it didn't change) will very likely take a large toll on CPU. Other tasks that take a lot of CPU are hashing and sometimes joins.
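On the MySQL side of this era (5.x), the built-in query cache is the simplest way to serve identical SELECTs — e.g. the same page requested by several bots in a row — straight from memory. A my.cnf sketch; the sizes are assumptions to tune against available RAM:

```
# my.cnf — cache result sets of repeated identical SELECT queries
[mysqld]
query_cache_type  = 1     # ON
query_cache_size  = 64M   # total cache memory
query_cache_limit = 2M    # don't cache results larger than this
```

Note the query cache is invalidated whenever the underlying table changes, so it helps most on read-heavy tables that are rarely written.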
     
  6. wEb pOsTS

    wEb pOsTS Registered Member

    Joined:
    Jun 16, 2012
    Messages:
    82
    Likes Received:
    79
    Location:
    Turkey
    Try putting the following code in your .htaccess.
    It makes browsers save all the photos and files you have and serve them from their cache automatically
    next time, without downloading them again.
    Code:
    # Turn on Expires and set default to 0
    ExpiresActive On
    ExpiresDefault A0
    # Set up caching on media files for 1 year (forever?) 
    <FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$"> 
    ExpiresDefault A29030400 
    Header append Cache-Control "public" 
    </FilesMatch> 
    # Set up caching on media files for 1 week 
    <FilesMatch "\.(gif|jpg|jpeg|png|swf)$"> 
    ExpiresDefault A604800 
    Header append Cache-Control "public" 
    </FilesMatch> 
    # Set up 2 Hour caching on commonly updated files 
    <FilesMatch "\.(xml|txt|html|js|css)$"> 
    ExpiresDefault A7200 
    Header append Cache-Control "proxy-revalidate" 
    </FilesMatch> 
    # Force no caching for dynamic files 
    <FilesMatch "\.(php|cgi|pl|htm)$"> 
    ExpiresActive Off 
    Header set Cache-Control "private, no-cache, no-store, proxy-revalidate, no-transform" 
    Header set Pragma "no-cache" 
    </FilesMatch>
    Put it at the end of your .htaccess.
     
  7. Zapdos

    Zapdos Power Member

    Joined:
    Oct 22, 2011
    Messages:
    597
    Likes Received:
    708
    Location:
    Eastern North Carolina
    So instead of killing his site with MySQL, that will do it with Apache.
    Every time a request is made, Apache checks the current directory and every parent directory for .htaccess files and processes each of them. If you have lots of visitors, .htaccess can destroy you. If you want to try the above, put it in the Apache conf files instead. That won't have as large an impact as .htaccess does.