Preferred method to store PHP arrays (json_encode vs serialize)

Asked by 孤独总比滥情好 on 2020-11-22 05:55

I need to store a multi-dimensional associative array of data in a flat file for caching purposes. I might occasionally come across the need to convert it to JSON for use in my web app, but the vast majority of the time I will be using the array directly in PHP. Would it be more efficient to store the array as JSON or as a PHP serialized array in this flat file?

20 answers
  • 2020-11-22 06:36

    I augmented the test to include unserialization performance. Here are the numbers I got.

    Serialize
    
    JSON encoded in 2.5738489627838 seconds
    PHP serialized in 5.2861361503601 seconds
    Serialize: json_encode() was roughly 105.38% faster than serialize()
    
    
    Unserialize
    
    JSON decode in 10.915472984314 seconds
    PHP unserialized in 7.6223039627075 seconds
    Unserialize: unserialize() was roughly 43.20% faster than json_decode() 
    

    So JSON appears to be faster at encoding but slower at decoding. Which format to choose could therefore depend on your application and which operation you expect to perform most often.
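
    For reference, a minimal harness along these lines (a sketch: the payload shape and the iteration count here are my assumptions, not the original test's):

    // build some nested test data
    $testData = array_fill(0, 1000, ['id' => 1, 'name' => 'test', 'tags' => ['a', 'b']]);
    $iterations = 10000;

    // encode
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) { $json = json_encode($testData); }
    printf("JSON encoded in %f seconds\n", microtime(true) - $start);

    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) { $ser = serialize($testData); }
    printf("PHP serialized in %f seconds\n", microtime(true) - $start);

    // decode
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) { json_decode($json, true); }
    printf("JSON decoded in %f seconds\n", microtime(true) - $start);

    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) { unserialize($ser); }
    printf("PHP unserialized in %f seconds\n", microtime(true) - $start);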

  • 2020-11-22 06:37

    JSON is simpler and faster than PHP's serialization format and should be used unless:

    • You're storing deeply nested arrays: json_decode(): "This function will return false if the JSON encoded data is deeper than 127 elements."
    • You're storing objects that need to be unserialized as the correct class (see the sketch after this list)
    • You're interacting with old PHP versions that don't support json_decode
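
    To illustrate the second point, a minimal sketch (the User class is a hypothetical example, not from the original answer):

    class User {
        public $name;
        public function __construct($name) { $this->name = $name; }
    }

    $user = new User('Alice');

    // json_decode() loses the class: you get a stdClass back
    $fromJson = json_decode(json_encode($user));
    var_dump($fromJson instanceof User); // bool(false)

    // unserialize() restores the original class
    $fromSer = unserialize(serialize($user));
    var_dump($fromSer instanceof User);  // bool(true)
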
  • 2020-11-22 06:42

    I know this is late, but the existing answers are pretty old. I thought my benchmarks might help, as I have just tested on PHP 7.4.

    Serialize/unserialize is much faster than JSON, takes less memory and space, and wins outright in PHP 7.4, though I am not sure my test is the most efficient or the best.

    I basically created a PHP file which returns an array, which I then encoded, serialized, decoded, and unserialized.

    $array = include __DIR__.'/../tests/data/dao/testfiles/testArray.php';
    
    //JSON ENCODE
    $json_encode_memory_start = memory_get_usage();
    $json_encode_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $encoded = json_encode($array);
    }
    
    $json_encode_time_end = microtime(true);
    $json_encode_memory_end = memory_get_usage();
    $json_encode_time = $json_encode_time_end - $json_encode_time_start;
    $json_encode_memory = $json_encode_memory_end - $json_encode_memory_start;
    
    
    //SERIALIZE
    $serialize_memory_start = memory_get_usage();
    $serialize_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $serialized = serialize($array);
    }
    
    $serialize_time_end = microtime(true);
    $serialize_memory_end = memory_get_usage();
    $serialize_time = $serialize_time_end - $serialize_time_start;
    $serialize_memory = $serialize_memory_end - $serialize_memory_start;
    
    
    //Write to file time:
    $fpc_memory_start = memory_get_usage();
    $fpc_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $fpc_bytes = file_put_contents(
            __DIR__.'/../tests/data/dao/testOneBigFile',
            '<?php return '.var_export($array, true).';'
        );
    }
    
    $fpc_time_end = microtime(true);
    $fpc_memory_end = memory_get_usage();
    $fpc_time = $fpc_time_end - $fpc_time_start;
    $fpc_memory = $fpc_memory_end - $fpc_memory_start;
    
    
    //JSON DECODE
    $json_decode_memory_start = memory_get_usage();
    $json_decode_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $decoded = json_decode($encoded, true);
    }
    
    $json_decode_time_end = microtime(true);
    $json_decode_memory_end = memory_get_usage();
    $json_decode_time = $json_decode_time_end - $json_decode_time_start;
    $json_decode_memory = $json_decode_memory_end - $json_decode_memory_start;
    
    
    //UNSERIALIZE
    $unserialize_memory_start = memory_get_usage();
    $unserialize_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $unserialized = unserialize($serialized);
    }
    
    $unserialize_time_end = microtime(true);
    $unserialize_memory_end = memory_get_usage();
    $unserialize_time = $unserialize_time_end - $unserialize_time_start;
    $unserialize_memory = $unserialize_memory_end - $unserialize_memory_start;
    
    
    //GET FROM VAR EXPORT:
    $var_export_memory_start = memory_get_usage();
    $var_export_time_start = microtime(true);
    
    for ($i=0; $i < 20000; $i++) { 
        $array = include __DIR__.'/../tests/data/dao/testOneBigFile';
    }
    
    $var_export_time_end = microtime(true);
    $var_export_memory_end = memory_get_usage();
    $var_export_time = $var_export_time_end - $var_export_time_start;
    $var_export_memory = $var_export_memory_end - $var_export_memory_start;
    

    Results:

    String lengths:

    Var export length: 11447
    Serialized length: 11541
    JSON encoded length: 11895
    file_put_contents bytes: 11464

    Encode / write:

    JSON encode time: 1.9197590351105
    Serialize time: 0.160325050354
    file_put_contents time: 6.2793469429016

    JSON encode memory: 12288
    Serialize memory: 12288
    file_put_contents memory: 0

    Decode / read:

    JSON decode time: 1.7493588924408
    Unserialize time: 0.19309520721436
    var_export + include time: 3.1974139213562

    JSON decode memory: 16384
    Unserialize memory: 14360
    var_export + include memory: 192

  • 2020-11-22 06:43

    JSON is better if you want to back up data and restore it on a different machine or transfer it via FTP.

    For example, with serialize, if you store data on a Windows server, download it via FTP, and restore it on a Linux one, it may no longer work due to character re-encoding: serialize stores the byte length of each string, and in a Unicode > UTF-8 transcoding a 1-byte character can become 2 bytes long, making the algorithm crash.
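
    A minimal sketch of that failure mode (the mb_convert_encoding call simulates what the transfer or OS charset conversion would do to the stored file):

    $original = "caf\xE9";           // "café" in ISO-8859-1: é is 1 byte
    $blob = serialize($original);    // s:4:"...", length counted in bytes

    // simulate the charset re-encoding of the stored file:
    $transcoded = mb_convert_encoding($blob, 'UTF-8', 'ISO-8859-1');
    // é is now 2 bytes, but the header still claims 4 -> length mismatch
    var_dump(@unserialize($transcoded)); // bool(false)

    // JSON embeds no byte lengths, so the same round trip survives:
    $json = json_encode(mb_convert_encoding($original, 'UTF-8', 'ISO-8859-1'));
    var_dump(json_decode($json)); // string(5) "café"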

  • 2020-11-22 06:46

    You might also be interested in https://github.com/phadej/igbinary - which provides a different serialization 'engine' for PHP.

    My random/arbitrary 'performance' figures, using PHP 5.3.5 on a 64-bit platform, show:

    JSON :

    • JSON encoded in 2.180496931076 seconds
    • JSON decoded in 9.8368630409241 seconds
    • serialized "String" size : 13993

    Native PHP :

    • PHP serialized in 2.9125759601593 seconds
    • PHP unserialized in 6.4348418712616 seconds
    • serialized "String" size : 20769

    Igbinary :

    • WIN igbinary serialized in 1.6099879741669 seconds
    • WIN igbinary unserialized in 4.7737920284271 seconds
    • WIN serialized "String" Size : 4467

    So, it's quicker to igbinary_serialize() and igbinary_unserialize() and uses less disk space.

    I used the fillArray(0, 3) code as above, but made the array keys longer strings.

    igbinary can store the same data types as PHP's native serialize can (so no problem with objects, etc.), and you can tell PHP 5.3 to use it for session handling if you so wish; a usage sketch follows.
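
    A minimal usage sketch, assuming the igbinary extension is installed and enabled (the data here is an arbitrary example):

    $data = ['user' => ['id' => 42, 'roles' => ['admin', 'editor']]];

    $packed   = igbinary_serialize($data);   // compact binary string
    $restored = igbinary_unserialize($packed);

    var_dump($restored === $data); // bool(true)

    // php.ini, to use igbinary for session data:
    //   session.serialize_handler = igbinary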

    See also http://ilia.ws/files/zendcon_2010_hidden_features.pdf - specifically slides 14/15/16

  • 2020-11-22 06:48

    I made a small benchmark as well. My results were the same, but I need the decode performance. As a few people above noted, unserialize is faster than json_decode: unserialize takes roughly 60-70% of the json_decode time. So the conclusion is fairly simple: when you need performance when encoding, use json_encode; when you need performance when decoding, use unserialize. Because you cannot merge the two functions, you have to make a choice based on where you need more performance.

    My benchmark in pseudo-code (a PHP rendering follows the list):

    • Define array $arr with a few random keys and values
    • for x < 100; x++; serialize and json_encode an array_rand of $arr
    • for y < 1000; y++; json_decode the JSON-encoded string - calc time
    • for y < 1000; y++; unserialize the serialized string - calc time
    • echo the result which was faster
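
    Rendered as actual PHP, that pseudo-code looks roughly like this (a sketch; the array contents, subset size, and helper calls are my choices, not the original author's):

    $arr = [];
    for ($x = 0; $x < 50; $x++) {
        $arr['key' . $x] = bin2hex(random_bytes(8)); // a few random keys/values
    }

    // take a random subset and prepare both encodings
    $keys   = array_rand($arr, 10);
    $subset = array_intersect_key($arr, array_flip($keys));
    $json   = json_encode($subset);
    $ser    = serialize($subset);

    $t = microtime(true);
    for ($y = 0; $y < 1000; $y++) { json_decode($json, true); }
    $jsonTime = microtime(true) - $t;

    $t = microtime(true);
    for ($y = 0; $y < 1000; $y++) { unserialize($ser); }
    $serTime = microtime(true) - $t;

    echo $serTime < $jsonTime ? 'unserialize won' : 'json_decode won', PHP_EOL;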

    On average: unserialize won 96 times, json_decode 4 times, with an average of roughly 1.5 ms versus 2.5 ms.
