Preferred method to store PHP arrays (json_encode vs serialize)

Front-end | open | 20 answers | 1890 views
Asked by 孤独总比滥情好, 2020-11-22 05:55

I need to store a multi-dimensional associative array of data in a flat file for caching purposes. I might occasionally come across the need to convert it to JSON for use in …

20 Answers
  • 2020-11-22 06:32

    Seems like serialize is the one I'm going to use, for two reasons:

    • Someone pointed out that unserialize is faster than json_decode, and a 'read' case sounds more probable than a 'write' case.

    • I've had trouble with json_encode on strings containing invalid UTF-8 characters. When that happens the string ends up empty, causing loss of information.
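    The invalid-UTF-8 failure mode is easy to reproduce. A minimal sketch (the array contents are just illustrative; the substitution flag assumes PHP ≥ 7.2):

    ```php
    <?php
    // A string with a byte sequence that is not valid UTF-8
    // (ISO-8859-1 "café" — the 0xE9 byte is invalid on its own in UTF-8)
    $data = ['name' => "caf\xE9"];

    // By default json_encode() fails outright on malformed UTF-8:
    var_dump(json_encode($data));       // bool(false)
    var_dump(json_last_error_msg());    // describes the malformed-UTF-8 error

    // Since PHP 7.2 you can substitute invalid sequences with U+FFFD
    // instead of losing the whole payload:
    var_dump(json_encode($data, JSON_INVALID_UTF8_SUBSTITUTE));
    ```

    Checking `json_last_error()` after every encode, or passing a substitution flag, avoids silently caching an empty string.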

  • 2020-11-22 06:32

    To sum up what people say here: json_decode/json_encode seems faster than serialize/unserialize, BUT if you var_dump the result, the type of the decoded object is changed. If for some reason you want to keep the type, go with serialize!

    (try for example stdClass vs array)

    serialize/unserialize:

    Array cache:
    array (size=2)
      'a' => string '1' (length=1)
      'b' => int 2
    Object cache:
    object(stdClass)[8]
      public 'field1' => int 123
    This cache:
    object(Controller\Test)[8]
      protected 'view' => 
    

    json encode/decode

    Array cache:
    object(stdClass)[7]
      public 'a' => string '1' (length=1)
      public 'b' => int 2
    Object cache:
    object(stdClass)[8]
      public 'field1' => int 123
    This cache:
    object(stdClass)[8]
    

    As you can see, json_encode/json_decode converts everything to stdClass, which is not that good; the object info is lost. So decide based on your needs, especially if you're dealing with more than just arrays.
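    The type loss shown above can be demonstrated with a minimal round trip (the Config class name is just an illustration):

    ```php
    <?php
    class Config { public $debug = true; }

    $obj = new Config();

    // serialize() preserves the original class on the way back:
    $copy = unserialize(serialize($obj));
    var_dump($copy instanceof Config);   // bool(true)

    // A JSON round trip does not — you get a stdClass back:
    $copy2 = json_decode(json_encode($obj));
    var_dump($copy2 instanceof Config);  // bool(false)

    // For plain arrays, pass true as the second argument so json_decode()
    // returns an array instead of a stdClass:
    $arr = ['a' => 1];
    var_dump(json_decode(json_encode($arr), true) === $arr); // bool(true)
    ```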

  • 2020-11-22 06:33

    I've written a blog post about this subject: "Cache a large array: JSON, serialize or var_export?". The post shows that serialize is the best choice for small to large arrays, while for very large arrays (> 70 MB) JSON is the better choice.
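    The var_export option from the post's title can be sketched like this (a minimal illustration; the cache path and $data contents are placeholders). The idea is to write the array out as PHP source and include it back, so OPcache can serve the compiled file:

    ```php
    <?php
    $data = ['host' => 'localhost', 'port' => 3306];

    // Write the array as a PHP file that returns it
    $file = tempnam(sys_get_temp_dir(), 'cache');
    file_put_contents($file, '<?php return ' . var_export($data, true) . ';');

    // Reading the cache is just an include — no decode step at all
    $restored = include $file;
    var_dump($restored === $data); // bool(true)
    ```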

  • 2020-11-22 06:33

    First, I changed the script to do some more benchmarking (and also do 1000 runs instead of just 1):

    <?php
    
    ini_set('display_errors', 1);
    error_reporting(E_ALL);
    
    // Make a big, honkin test array
    // You may need to adjust this depth to avoid memory limit errors
    $testArray = fillArray(0, 5);
    
    $totalJsonTime = 0;
    $totalSerializeTime = 0;
    $totalJsonWins = 0;
    
    for ($i = 0; $i < 1000; $i++) {
        // Time json encoding
        $start = microtime(true);
        $json = json_encode($testArray);
        $jsonTime = microtime(true) - $start;
        $totalJsonTime += $jsonTime;
    
        // Time serialization
        $start = microtime(true);
        $serial = serialize($testArray);
        $serializeTime = microtime(true) - $start;
        $totalSerializeTime += $serializeTime;
    
        if ($jsonTime < $serializeTime) {
            $totalJsonWins++;
        }
    }
    
    $totalSerializeWins = 1000 - $totalJsonWins;
    
    // Compare them
    if ($totalJsonTime < $totalSerializeTime) {
        printf("json_encode() (wins: $totalJsonWins) was roughly %01.2f%% faster than serialize()\n", ($totalSerializeTime / $totalJsonTime - 1) * 100);
    } else {
        printf("serialize() (wins: $totalSerializeWins) was roughly %01.2f%% faster than json_encode()\n", ($totalJsonTime / $totalSerializeTime - 1) * 100);
    }
    
    $totalJsonTime = 0;
    $totalJson2Time = 0;
    $totalSerializeTime = 0;
    $totalJsonWins = 0;
    
    for ($i = 0; $i < 1000; $i++) {
        // Time json decoding to an associative array
        $start = microtime(true);
        $orig = json_decode($json, true);
        $jsonTime = microtime(true) - $start;
        $totalJsonTime += $jsonTime;
    
        // Time json decoding to a stdClass object
        $start = microtime(true);
        $origObj = json_decode($json);
        $jsonTime2 = microtime(true) - $start;
        $totalJson2Time += $jsonTime2;
    
        // Time unserialization
        $start = microtime(true);
        $unserial = unserialize($serial);
        $serializeTime = microtime(true) - $start;
        $totalSerializeTime += $serializeTime;
    
        if ($jsonTime < $serializeTime) {
            $totalJsonWins++;
        }
    }
    
    $totalSerializeWins = 1000 - $totalJsonWins;
    
    
    // Compare them
    if ($totalJsonTime < $totalSerializeTime) {
        printf("json_decode() was roughly %01.2f%% faster than unserialize()\n", ($totalSerializeTime / $totalJsonTime - 1) * 100);
    } else {
        printf("unserialize() (wins: $totalSerializeWins) was roughly %01.2f%% faster than json_decode()\n", ($totalJsonTime / $totalSerializeTime - 1) * 100);
    }
    
    // Compare them (this second comparison uses the object-decode timing)
    if ($totalJson2Time < $totalSerializeTime) {
        printf("json_decode() was roughly %01.2f%% faster than unserialize()\n", ($totalSerializeTime / $totalJson2Time - 1) * 100);
    } else {
        printf("unserialize() (wins: $totalSerializeWins) was roughly %01.2f%% faster than array json_decode()\n", ($totalJson2Time / $totalSerializeTime - 1) * 100);
    }
    
    function fillArray( $depth, $max ) {
        static $seed;
        if (is_null($seed)) {
            $seed = array('a', 2, 'c', 4, 'e', 6, 'g', 8, 'i', 10);
        }
        if ($depth < $max) {
            $node = array();
            foreach ($seed as $key) {
                $node[$key] = fillArray($depth + 1, $max);
            }
            return $node;
        }
        return 'empty';
    }
    

    I used this build of PHP 7:

    PHP 7.0.14 (cli) (built: Jan 18 2017 19:13:23) ( NTS )
    Copyright (c) 1997-2016 The PHP Group
    Zend Engine v3.0.0, Copyright (c) 1998-2016 Zend Technologies
        with Zend OPcache v7.0.14, Copyright (c) 1999-2016, by Zend Technologies

    And my results were:

    serialize() (wins: 999) was roughly 10.98% faster than json_encode()
    unserialize() (wins: 987) was roughly 33.26% faster than json_decode()
    unserialize() (wins: 987) was roughly 48.35% faster than array json_decode()

    So clearly, serialize/unserialize is the fastest method, while json_encode/json_decode is the most portable.

    If you read or write serialized data 10x or more often than you need to send it to or receive it from a non-PHP system, you are STILL better off, time-wise, using serialize/unserialize and calling json_encode or json_decode only at that boundary.

  • 2020-11-22 06:35

    I just tested serialize and json encode and decode, plus the size the stored string will take.

    JSON encoded in 0.067085981369 seconds. Size (1277772)
    PHP serialized in 0.12110209465 seconds. Size (1955548)
    JSON decode in 0.22470498085 seconds
    PHP serialized in 0.211947917938 seconds
    json_encode() was roughly 80.52% faster than serialize()
    unserialize() was roughly 6.02% faster than json_decode()
    JSON string was roughly 53.04% smaller than Serialized string
    

    We can conclude that JSON encodes faster and results in a smaller string, but unserialize is faster at decoding the string.
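    The size difference is easy to reproduce; here is a small sketch (the test data is arbitrary):

    ```php
    <?php
    // 100 copies of a small record, as a stand-in for real cache data
    $data = array_fill(0, 100, ['id' => 1, 'name' => 'example', 'tags' => ['a', 'b']]);

    $json   = json_encode($data);
    $serial = serialize($data);

    // serialize() embeds type and length metadata for every element
    // (e.g. s:4:"name";), so JSON is usually smaller for the same structure:
    printf("JSON: %d bytes, serialize: %d bytes\n", strlen($json), strlen($serial));
    ```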

  • 2020-11-22 06:35

    Thanks for this benchmark code!

    My results on the array I use for configuration are as follows:

    JSON encoded in 0.0031511783599854 seconds
    PHP serialized in 0.0037961006164551 seconds
    json_encode() was roughly 20.47% faster than serialize()
    JSON encoded in 0.0070841312408447 seconds
    PHP serialized in 0.0035839080810547 seconds
    unserialize() was roughly 97.66% faster than json_encode()

    So - test it on your own data.
