Which is faster / more efficient - lots of little MySQL queries or one big PHP array?

离开以前 2021-02-06 01:55

I have a PHP/MySQL based web application that has internationalization support by way of a MySQL table called language_strings with the string_id,

6 Answers
  • 2021-02-06 02:04

    My answer is to do something in between. Retrieve all strings for a lang_id that are shorter than a certain length (say, 100 characters). Shorter text strings are more likely to be used in multiple places than longer ones. Cache the entries in a static associative array in get_lang_string(). If an item isn't found, then retrieve it through a query.
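
    A minimal sketch of that hybrid lookup, assuming a PDO connection in $pdo and the string_id/lang_id/lang_string columns mentioned in the question and elsewhere in this thread (the 100-character cut-off is the threshold suggested above):

    // Sketch only: preload short strings once, fall back to single queries.
    // Assumes one $lang_id per request, a PDO handle in $pdo, and a
    // language_strings(string_id, lang_id, lang_string) table.
    function get_lang_string($string_id, $lang_id)
    {
        global $pdo;
        static $cache = null;

        if ($cache === null) {
            $cache = array();
            $stmt = $pdo->prepare(
                'SELECT string_id, lang_string FROM language_strings
                 WHERE lang_id = ? AND CHAR_LENGTH(lang_string) < 100'
            );
            $stmt->execute(array($lang_id));
            while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
                $cache[$row['string_id']] = $row['lang_string'];
            }
        }

        if (!isset($cache[$string_id])) {
            // Long or missing strings are fetched individually and memoized.
            $stmt = $pdo->prepare(
                'SELECT lang_string FROM language_strings
                 WHERE lang_id = ? AND string_id = ?'
            );
            $stmt->execute(array($lang_id, $string_id));
            $cache[$string_id] = $stmt->fetchColumn();
        }

        return $cache[$string_id];
    }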

  • 2021-02-06 02:16

    OK, I did some benchmarking and was surprised to find that putting things into an array rather than using individual queries was, on average, 10-15% SLOWER.

    I think the reason for this was that, even after filtering out the "uncommon" elements, there were inevitably going to be unused elements loaded as a matter of course.

    With the individual queries I only ever fetch what I need, and since the queries are so simple, I think I am best off sticking with that method.

    This works for me; of course, in other situations where the individual queries are more complex, I think storing common data in an array would turn out to be more efficient.

  • 2021-02-06 02:16

    I agree with what everybody says here: it's all about the numbers.

    Some additional tips:

    1. Try to create a single in-memory array that holds the minimum you require. This means removing most of the obvious redundancies.

    2. There are standard approaches to these issues in performance-critical environments, like using memcached with MySQL. It's a bit of overkill, but it basically lets you allocate some external memory and cache your queries there. Since you choose how much memory to allocate, you can plan it according to how much memory your system has (see the sketch after this list).

    3. Just play with the numbers. Try using separate queries (which is the simplest approach) and stress your PHP script (for example, by calling it hundreds of times from the command line). Measure how much time this takes and see how big the performance loss actually is. Speaking from personal experience, I usually cache everything in memory and then one day, when the data gets too big, I run out of memory. Then I split everything into separate queries to save memory, and find that the performance impact wasn't that bad in the first place :)
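
    A minimal sketch of the memcached idea in point 2, assuming the PECL Memcached extension, a daemon on localhost:11211, and the table/columns from the question (the key name and the 10-minute TTL are arbitrary):

    // Sketch only: keep the whole language array in memcached between requests.
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    $key = 'lang_strings_' . $lang_id; // $lang_id assumed from context
    $strings = $mc->get($key);

    if ($strings === false) {
        // Cache miss: load once from MySQL, then keep it for 10 minutes.
        $stmt = $pdo->prepare(
            'SELECT string_id, lang_string FROM language_strings WHERE lang_id = ?'
        );
        $stmt->execute(array($lang_id));
        $strings = array();
        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $strings[$row['string_id']] = $row['lang_string'];
        }
        $mc->set($key, $strings, 600);
    }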

  • 2021-02-06 02:18

    There isn't a one-size-fits-all answer; you really have to look at it on a case-by-case basis. Having said that, the majority of the time it will be quicker to get all the data in one query, put it into an array or object, and refer to it from there.

    The caveat is whether you can pull all the data you need in one query as quickly as by running the five individual ones. That is where the performance of the query itself comes into play.

    Sometimes a query that contains a subquery or two will actually be less time-efficient than running a few queries individually.

    My suggestion is to test it out. Put together a query that gets all the data you need and see how long it takes to execute. Time each of the other five queries and see how long they take combined. If the totals are almost identical, stick the output into an array; that will be more efficient, since it avoids repeated round trips to the database.
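
    A minimal way to run that comparison, assuming a PDO handle in $pdo; the queries and the five example string_ids are placeholders for your own:

    // Sketch only: time one combined query against five individual ones.
    $start = microtime(true);
    $all = $pdo->query(
        'SELECT string_id, lang_string FROM language_strings WHERE lang_id = 1'
    )->fetchAll(PDO::FETCH_KEY_PAIR);
    $combined = microtime(true) - $start;

    $start = microtime(true);
    $stmt = $pdo->prepare(
        'SELECT lang_string FROM language_strings WHERE lang_id = 1 AND string_id = ?'
    );
    foreach (array('title', 'welcome', 'login', 'logout', 'footer') as $id) {
        $stmt->execute(array($id));
        $stmt->fetchColumn();
    }
    $individual = microtime(true) - $start;

    printf("combined: %.6fs, individual: %.6fs\n", $combined, $individual);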

    If, however, your combined query takes longer to return data (it might cause a full table scan instead of using indexes, for example), then stick to the individual ones.

    Lastly, if you are going to use the same data over and over, an array or object will win hands down every single time, as accessing it will be much faster than getting it from the database.

  • 2021-02-06 02:26

    I'm with Fluffeh on this: look into the other options at your disposal (joins, subqueries, making sure your indexes reflect the relationships in the data, but without over-indexing, and test). Most likely you'll end up with an array at some point, so here's a little performance tip. Contrary to what you might expect, something like

    $all = $stmt->fetchAll(PDO::FETCH_ASSOC);
    

    is less memory efficient compared to:

    $all = array(); // or $all = []; in PHP 5.4+
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $all[] = $row['lang_string'];
    }
    

    What's more, you can check for redundant data while fetching it.
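
    For example, a duplicate check folded into the fetch loop might look like this (a sketch; treating repeated lang_string values as the redundancy is just one interpretation):

    // Sketch only: skip rows whose lang_string was already seen.
    $all = array();
    $seen = array();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        if (isset($seen[$row['lang_string']])) {
            continue; // redundant value, don't store it twice
        }
        $seen[$row['lang_string']] = true;
        $all[] = $row['lang_string'];
    }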

  • 2021-02-06 02:26

    I am currently at the point in my site/application where I have had to put the brakes on and think very carefully about speed. I think the speed tests mentioned here should treat the volume of traffic on your server as an important variable that will affect the results.

    If you are putting data into JavaScript data structures and processing it on the client machine, the processing time should be more consistent. If you are requesting lots of data through MySQL via PHP (for example), that puts demand on one machine/server rather than spreading the load. As your traffic grows, you have to share server resources with many users, and I think that is where getting JavaScript to do more will lighten the load on the server. You can also store data on the client machine via localStorage.setItem() / localStorage.getItem() (most browsers allow about 5 MB of space per domain). If you have data in the database that does not change often, you can store it on the client and then just check at 'start-up' whether it is still valid/up to date.

    This is my first comment posted after having and using the account for a year, so I might need to fine-tune my rambling; I'm just voicing what I'm thinking through at present.
