
How to optimize a large database for fast queries?

Discussion in 'Other Languages' started by Zak_A, Sep 4, 2012.

  1. Zak_A

    Zak_A Jr. VIP Premium Member

    Joined:
    Mar 16, 2008
    Messages:
    808
    Likes Received:
    873
    Gender:
    Male
    Occupation:
    WP designer & developer
    Location:
    Western Europe
    I'm currently working on a project that will involve a huge MySQL database.
    By huge, I mean a database with around 100k entries, with an expected total size of about 1.2 GB.
    Each entry has around 30 columns, almost all text (most of them paragraph-sized).

    This database will be the core of an API that will essentially serve results to external PHP scripts, one entry at a time, based on an ID (which will be the primary key of the table).
    These queries need to be lightning fast, and I'm afraid they could take a few seconds with such a huge DB, which won't be acceptable.

    The API will also need to deliver some results in batches (like 10-50 rows) based on other columns, but for those queries it will be OK if it takes a bit longer.

    I know I could just try and see, but I'd rather ask before even building this DB to save some time :)

    Anyway, here are my main questions before I even build the whole DB:
    - How should I build and optimize this DB for fast queries?
    - MySQL seems obvious to me as I know it quite well and this project is PHP-only, but is it a good fit?
    - Is there a recommended structure that will fit best?
    - Any other recommendations?

    Thanks guys :)
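    The setup described above can be sketched in a few lines. This uses SQLite for illustration (the MySQL DDL is nearly identical), and the table and column names are made up, not from the actual project:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Hypothetical schema: one row per entry; "body" stands in for
    # the ~30 paragraph-sized text columns the OP describes.
    cur.execute("""
        CREATE TABLE entries (
            id       INTEGER PRIMARY KEY,  -- the ID the API looks up by
            category TEXT,
            body     TEXT
        )
    """)
    cur.executemany(
        "INSERT INTO entries VALUES (?, ?, ?)",
        [(i, "news" if i % 2 else "blog", "paragraph text...") for i in range(1, 101)],
    )

    # The API's main query: one entry by primary key.  This is a single
    # index lookup, so it stays fast regardless of how big the table gets.
    row = cur.execute("SELECT body FROM entries WHERE id = ?", (42,)).fetchone()
    ```

    As long as the lookups go through the primary key, 100k rows is nowhere near a problem.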
     
  2. jazzc

    jazzc Moderator Staff Member Jr. VIP

    Joined:
    Jan 27, 2009
    Messages:
    2,468
    Likes Received:
    10,148
    100k rows is not huge.

    Speed is a factor of database design. Your use cases must be known beforehand: what queries will be performed is the first thing you must keep in mind. With that, you design your database and indexes. Proper use of indexes is key (pun intended :D) to a fast db.
    After you have a good db design, you should not do silly things in code, like querying 132,421 times in a loop instead of doing a few JOINs.

    Db optimization is not something that happens afterwards as a patch: you either do it from the beginning or it's downhill from there.

    You should google database design and optimizing indexes for speed, and read, read, read :)
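    The index advice above is easy to demonstrate. A minimal sketch (SQLite for illustration; `CREATE INDEX` is the same in MySQL, and the index/column names here are hypothetical):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, category TEXT, body TEXT)")
    cur.executemany(
        "INSERT INTO entries VALUES (?, ?, ?)",
        [(i, f"cat{i % 50}", "text") for i in range(100_000)],
    )

    # Without an index, filtering on "category" scans all 100k rows.
    # A secondary index on the column the batch queries filter on
    # turns that into an index search.
    cur.execute("CREATE INDEX idx_entries_category ON entries(category)")

    batch = cur.execute(
        "SELECT id, body FROM entries WHERE category = ? LIMIT 50", ("cat7",)
    ).fetchall()

    # The query planner can confirm the index is actually used
    # (MySQL's equivalent is EXPLAIN).
    plan = cur.execute(
        "EXPLAIN QUERY PLAN SELECT id FROM entries WHERE category = ?", ("cat7",)
    ).fetchall()
    ```

    The same "design for your queries" rule decides which columns get indexed: only the ones the API actually filters on.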
     
  3. Zak_A

    Zak_A Jr. VIP Premium Member
    Thanks for your input jazzc :)

    I'm actually brainstorming the best way to design this DB before building it, and from a few Google searches I've realized that my project is indeed not that huge of a DB, so maybe I was worrying a bit too much (especially now that I know that FB uses MySQL too).

    I'm now looking more into index optimization, and hopefully I'm on the right track to build something that will work seamlessly :)
     
  5. yeasin33

    yeasin33 Junior Member

    Joined:
    Mar 1, 2012
    Messages:
    114
    Likes Received:
    16
    Occupation:
    Student, Web Developer
    Location:
    1600, Amphitheatre Parkway Mountain View, CA 94043
    You know, when you search using an int value, that's the fastest possible search. If you need to search using a VARCHAR, you should create a relation table for faster results: first resolve the string to its int ID through the relation table, then search the data table using that int. Size does matter.
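    The relation-table idea above can be sketched like this (SQLite for illustration, and the `tags`/`entries` names are invented for the example, not from the project):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Small relation table: maps the VARCHAR value to a compact integer ID.
    # UNIQUE gives the name column an index, so step 1 is a fast lookup.
    cur.execute("CREATE TABLE tags (tag_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
    # Big data table stores only the integer, not the string.
    cur.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, tag_id INTEGER, body TEXT)")

    cur.execute("INSERT INTO tags (name) VALUES ('tutorial')")
    # Step 1: resolve the string to its int ID on the small table.
    tag_id = cur.execute(
        "SELECT tag_id FROM tags WHERE name = ?", ("tutorial",)
    ).fetchone()[0]

    cur.executemany(
        "INSERT INTO entries (tag_id, body) VALUES (?, ?)",
        [(tag_id, f"post {i}") for i in range(3)],
    )

    # Step 2: search the big table on the integer column only.
    rows = cur.execute("SELECT body FROM entries WHERE tag_id = ?", (tag_id,)).fetchall()
    ```

    In practice you would also index `entries.tag_id` (or make it a foreign key), so both steps are index searches.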