Cloud Functions for Firebase killed due to memory limit exceeded

天涯浪人 2021-01-31 01:53

I keep getting a sporadic error from Cloud Functions for Firebase when converting a relatively small image (2MB). When successful, the function only takes about 2000ms or less to complete.

10 Answers
  •  执念已碎
    2021-01-31 02:37

    It seems the default ImageMagick resource config in Firebase Cloud Functions doesn't match the actual memory allocated to the function.

    Running identify -list resource in the context of a Firebase Cloud Function yields:

    File       Area         Memory        Map       Disk   Thread  Throttle       Time
    --------------------------------------------------------------------------------
     18750    4.295GB       2GiB       4GiB  unlimited        8         0   unlimited  
    
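    To verify what IM thinks it has, you can dump that same resource table from inside a deployed function. A minimal sketch in TypeScript, assuming the firebase-functions v1 API and that the ImageMagick binaries are available on the runtime image; the function name is hypothetical:

    import * as functions from "firebase-functions";
    import { execSync } from "child_process";

    // Hypothetical HTTP endpoint that just echoes ImageMagick's resource limits.
    export const listImResources = functions.https.onRequest((req, res) => {
      const output = execSync("identify -list resource").toString();
      console.log(output); // compare the Memory/Map columns against the function's allocation
      res.status(200).send(output);
    });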

    The default memory allocated to an FCF is 256MB, but the default ImageMagick instance thinks it has 2GB. It therefore doesn't allocate buffers from disk and can easily try to over-allocate memory, causing the function to fail with Error: memory limit exceeded. Function killed.

    One way is to increase the memory allocated to the function, as suggested above, although there's still a risk that IM will try to over-allocate, depending on your use case and outliers.
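    A minimal sketch of raising the allocation, assuming the firebase-functions v1 API; the trigger and resize logic are placeholders:

    import * as functions from "firebase-functions";

    export const resizeImage = functions
      .runWith({ memory: "1GB", timeoutSeconds: 120 }) // default is 256MB
      .storage.object()
      .onFinalize(async (object) => {
        // ... download the image, run ImageMagick, upload the result ...
      });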

    Safer yet is to set the correct memory limit for IM as part of the image manipulation process using -limit memory [your limit]. You can figure out your approximate memory usage by running your IM logic with -debug Cache; it will show all the buffers allocated, their sizes, and whether they were backed by memory or disk.
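    A minimal sketch of passing those limits to the convert call, assuming you shell out to ImageMagick directly via child_process; the limit values, paths, and geometry are placeholders you would tune based on the -debug Cache output:

    import { execFileSync } from "child_process";

    // Resize while capping IM's heap and memory-mapped buffers below the
    // function's allocation, so overflow spills to disk instead of OOM-killing the function.
    function resizeWithLimits(src: string, dst: string): void {
      execFileSync("convert", [
        "-limit", "memory", "200MiB",
        "-limit", "map", "256MiB",
        src,
        "-resize", "1024x1024>", // only shrink images larger than 1024px
        dst,
      ]);
    }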

    If IM hits the memory limit, it will start allocating buffers on disk (memory-mapped first, then regular disk buffers). You'll have to weigh I/O performance against memory cost: Cloud Functions bills the allocated memory tier for every 100ms of execution, so the price of extra memory can grow quickly.
