Preferred method to store PHP arrays (json_encode vs serialize)

孤独总比滥情好 2020-11-22 05:55

I need to store a multi-dimensional associative array of data in a flat file for caching purposes. I might occasionally come across the need to convert it to JSON for use in
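A minimal sketch of the flat-file caching the question describes, using PHP's native serialize for the round-trip (the cache path and sample data are placeholders, not from the question):

```php
<?php
// Minimal flat-file cache sketch; $cacheFile is a hypothetical path.
$cacheFile = sys_get_temp_dir() . '/cache.dat';

// A multi-dimensional associative array like the one in the question.
$data = [
    'user'     => ['id' => 42, 'roles' => ['admin', 'editor']],
    'settings' => ['theme' => 'dark'],
];

// Write: serialize preserves PHP types (nested arrays, key types) exactly.
file_put_contents($cacheFile, serialize($data));

// Read it back on a later request.
$restored = unserialize(file_get_contents($cacheFile));
// $restored is structurally identical to $data.
?>
```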

20 Answers
  • 2020-11-22 06:49

    Really nice topic, and after reading the few answers, I want to share my experiments on the subject.

    I have a use case where a "huge" table needs to be queried almost every time I talk to the database (don't ask why, just a fact). The database caching system isn't appropriate because it won't cache the different requests, so I thought about PHP caching systems.

    I tried APCu, but it didn't fit the needs: in-memory storage isn't reliable enough in this case. The next step was to cache into a file with serialization.

    The table has 14355 entries with 18 columns; here are my tests and stats for reading the serialized cache:

    JSON:

    As you all said, the major inconvenience with json_encode/json_decode is that json_decode turns everything into stdClass instances (or objects) by default. If you need to loop over it, you'll probably transform it to an array, and yes, that increases the transformation time

    average time: 780.2 ms; memory use: 41.5MB; cache file size: 3.8MB
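    The stdClass point above is only json_decode's default behavior; passing true as the second argument yields associative arrays directly. A small illustration (not part of the benchmark):

    ```php
    <?php
    $json = json_encode(['a' => 1, 'b' => [2, 3]]);

    $obj = json_decode($json);        // stdClass by default
    $arr = json_decode($json, true);  // associative array via the assoc flag

    var_dump($obj->a);      // int(1)
    var_dump($arr['b'][1]); // int(3)
    ?>
    ```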

    Msgpack

    @hutch mentions msgpack. Pretty website. Let's give it a try, shall we?

    average time: 497 ms; memory use: 32MB; cache file size: 2.8MB

    That's better, but it requires a new extension; compiling sometimes scares people off...

    IgBinary

    @GingerDog mentions igbinary. Note that I've set igbinary.compact_strings=Off because I care more about read performance than file size.

    average time: 411.4 ms; memory use: 36.75MB; cache file size: 3.3MB

    Better than msgpack. Still, this one requires compiling too.

    serialize/unserialize

    average time: 477.2 ms; memory use: 36.25MB; cache file size: 5.9MB
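    For reference, the native round-trip being measured here is just this (a minimal sketch with placeholder data):

    ```php
    <?php
    // serialize/unserialize round-trips PHP values exactly: same key types,
    // same nesting, no stdClass conversion step needed afterwards.
    $value = ['count' => 3, 'tags' => ['php', 'cache'], 'ratio' => 0.5];

    $blob = serialize($value);
    $back = unserialize($blob);

    var_dump($back === $value); // bool(true): identical structure and types
    ?>
    ```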

    Better performance than JSON; the bigger the array, the slower json_decode gets, but you already knew that.

    Those external extensions narrow down the file size and seem great on paper. Numbers don't lie*. What's the point of compiling an extension if you get almost the same results as with a standard PHP function?

    We can also deduce that depending on your needs, you will choose something different than someone else:

    • IgBinary is really nice and performs better than MsgPack.
    • MsgPack is better at compressing your data (note that I didn't try the igbinary compact_strings option).
    • Don't want to compile? Use the standard functions.

    That's it: another serialization-method comparison to help you choose one!

    *Tested with PHPUnit 3.7.31, PHP 5.5.10; decoding only, on a standard hard drive and an old dual-core CPU; numbers averaged over 10 runs of the same use case. Your stats might differ.
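    The timing methodology above can be sketched like this (a hypothetical harness, not the author's actual test code; the payload is a stand-in for the 14355-row table dump):

    ```php
    <?php
    // Average decode time in milliseconds over N runs of the same payload.
    function benchDecode(callable $decode, string $blob, int $runs = 10): float
    {
        $total = 0.0;
        for ($i = 0; $i < $runs; $i++) {
            $start = microtime(true);
            $decode($blob);
            $total += microtime(true) - $start;
        }
        return ($total / $runs) * 1000;
    }

    // Sample payload standing in for the big table.
    $rows = array_fill(0, 1000, ['id' => 1, 'name' => 'x', 'score' => 0.5]);

    $jsonMs      = benchDecode(fn ($b) => json_decode($b, true), json_encode($rows));
    $serializeMs = benchDecode(fn ($b) => unserialize($b), serialize($rows));
    // Compare $jsonMs vs $serializeMs on your own data and hardware.
    ?>
    ```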

  • 2020-11-22 06:49

    I would suggest using Super Cache, a file-cache mechanism that doesn't use json_encode or serialize. It is simple to use and really fast compared to other PHP cache mechanisms.

    https://packagist.org/packages/smart-php/super-cache

    Ex:

    <?php
    require __DIR__.'/vendor/autoload.php';
    use SuperCache\SuperCache as sCache;
    
    //Saving cache value with a key
    // sCache::cache('<key>')->set('<value>');
    sCache::cache('myKey')->set('Key_value');
    
    //Retrieving cache value with a key
    echo sCache::cache('myKey')->get();
    ?>
    