Firebase Cloud Function [Error: memory limit exceeded. Function invocation was interrupted.] on YouTube video upload

我的未来我决定 submitted on 2019-12-02 18:19:09

Question


I was trying to upload videos to YouTube using a Firebase Cloud Function.

What I need is this: when a user uploads a video to Firebase Cloud Storage, the functions.storage.object().onFinalize event is triggered; in that handler I save the file to a temporary location, upload it from there to YouTube, and then delete both files.

This works fine for small files.

But if I upload a large file, the function is terminated with this error:

Error: memory limit exceeded. Function invocation was interrupted.

Code for uploading the video:

    var requestData = {
        'params': {
            'part': 'snippet,status'
        },
        'properties': {
            'snippet.categoryId': '22',
            'snippet.defaultLanguage': '',
            'snippet.description': docdata.shortDesc, // was the string literal "docdata.shortDesc"
            'snippet.tags[]': '',
            'snippet.title': docdata.title,           // was the string literal "docdata.title"
            'status.embeddable': '',
            'status.license': '',
            'status.privacyStatus': 'public',
            'status.publicStatsViewable': ''
        },
        'mediaFilename': tempLocalFile
    };

    insertVideo(tempLocalFile, oauth2Client, requestData);

The insertVideo function:

function insertVideo(file, oauth2Client, requestData) {
    return new Promise((resolve, reject) => {
        google.options({ auth: oauth2Client });

        var parameters = removeEmptyParameters(requestData['params']);
        parameters['auth'] = oauth2Client;
        // Stream the file from disk instead of buffering it in memory.
        parameters['media'] = { body: fs.createReadStream(requestData['mediaFilename']) };
        parameters['notifySubscribers'] = false;
        parameters['resource'] = createResource(requestData['properties']);

        console.log("INSERT >>> ");
        google.youtube('v3').videos.insert(parameters, (error, received) => {
            // Delete the temp file whether the upload succeeded or failed.
            try {
                fs.unlinkSync(file);
            } catch (err) {
                console.log(err);
            }
            if (error) {
                console.log(error);
                reject(error);
            } else {
                console.log(received.data);
                resolve(received.data);
            }
        });
    });
}

Code for creating the temp local file:

    // Wrapped in a Promise so the resolve/reject calls below are defined.
    return new Promise((resolve, reject) => {
        bucket.file(filePath).createReadStream()
            .on('error', (err) => {
                reject(err);
            })
            .on('response', (response) => {
                console.log(response);
            })
            .on('end', () => {
                console.log("The file is fully downloaded");
                resolve();
            })
            .pipe(fs.createWriteStream(tempLocalFile));
    });

Every file read and write is handled by streams. Any idea why the memory issue is happening?


Answer 1:


The only writable part of the filesystem in Cloud Functions is the /tmp directory. As per the documentation:

This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.

This is why you hit the memory limit with bigger files.

Your options are:

  • Allocate more memory to your function (currently up to 2 GB); see the sketch after this list
  • Execute the upload from an environment where you can write to filesystem. For example, your Cloud Function could call an App Engine Flexible service to execute the upload.
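
For the first option, a minimal sketch of raising the allocation on the trigger itself; runWith with memory and timeoutSeconds is the standard firebase-functions API, while the export name and handler body are illustrative:

    const functions = require('firebase-functions');

    // Give the storage-triggered function the maximum memory and a longer
    // timeout, since anything written to /tmp counts against this limit.
    exports.uploadToYoutube = functions
        .runWith({ memory: '2GB', timeoutSeconds: 540 })
        .storage.object()
        .onFinalize(async (object) => {
            // ... download to /tmp and call insertVideo as before ...
        });

This only raises the ceiling, of course; a video larger than the allocation will still fail, which is what the second option and the next answer address.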



Answer 2:


You can also use a resumable video upload, following a series of steps:

  1. Your GCS-triggered function fires when the video upload to Cloud Storage finishes.
  2. That function starts a resumable upload session, calculates reasonable chunks to upload, and publishes one Pub/Sub message per chunk, carrying the chunk's byte range and the session id.
  3. A new Pub/Sub-triggered function on that topic receives each message, downloads the corresponding chunk from GCS using a range header (undocumented in the JSON API, but I have already reported it), and uploads it to YouTube; see the sketch after this list.
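
A rough sketch of the step-3 worker, assuming step 2 published a JSON message shaped like { bucket, path, sessionUrl, start, end, totalSize } to a topic named youtube-chunks (both the field names and the topic name are illustrative, not part of any API):

    const functions = require('firebase-functions');
    const { Storage } = require('@google-cloud/storage');
    const fetch = require('node-fetch');

    const storage = new Storage();

    // Worker: download one byte range from GCS and PUT it to the
    // resumable upload session URL with a Content-Range header.
    exports.uploadChunk = functions.pubsub
        .topic('youtube-chunks') // hypothetical topic name
        .onPublish(async (message) => {
            const { bucket, path, sessionUrl, start, end, totalSize } = message.json;

            // Read only this chunk of the GCS object into memory.
            const pieces = [];
            await new Promise((resolve, reject) => {
                storage.bucket(bucket).file(path)
                    .createReadStream({ start, end })
                    .on('data', (c) => pieces.push(c))
                    .on('error', reject)
                    .on('end', resolve);
            });
            const body = Buffer.concat(pieces);

            // YouTube answers 308 (Resume Incomplete) for intermediate
            // chunks and 200/201 once the final chunk lands.
            const res = await fetch(sessionUrl, {
                method: 'PUT',
                headers: {
                    'Content-Length': String(body.length),
                    'Content-Range': `bytes ${start}-${end}/${totalSize}`,
                },
                body,
            });
            if (res.status !== 308 && !res.ok) {
                throw new Error(`Chunk upload failed with status ${res.status}`);
            }
        });

Note that every chunk except the last must be a multiple of 256 KB, so step 2 should size the ranges accordingly.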

I have not tried this, but it might even allow parallel uploads to YouTube from different functions uploading different chunks (which would greatly improve performance, although the docs suggest that chunks need to be uploaded in order). You can download an arbitrary chunk of a GCS object, so the GCS side is not a problem for parallelization.

If parallel uploads are not allowed, you can instead publish a new Pub/Sub message when a function finishes uploading its chunk, carrying the last byte uploaded, so the functions execute in order (while still allowing parallel uploads of different videos); a sketch of this follows.
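
For this ordered variant, the worker above would publish the next chunk's message after a successful PUT, instead of step 2 enqueuing everything up front. A minimal sketch, using the same hypothetical topic and message shape:

    const { PubSub } = require('@google-cloud/pubsub');

    const pubsub = new PubSub();
    const CHUNK_SIZE = 8 * 256 * 1024; // must stay a multiple of 256 KB

    // Enqueue the chunk that follows `prev`, so the chunks of one video are
    // uploaded strictly in order while different videos still run in parallel.
    async function publishNextChunk(prev) {
        const nextStart = prev.end + 1;
        if (nextStart >= prev.totalSize) return; // that was the last chunk
        await pubsub.topic('youtube-chunks').publishMessage({
            json: {
                ...prev,
                start: nextStart,
                end: Math.min(nextStart + CHUNK_SIZE - 1, prev.totalSize - 1),
            },
        });
    }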

This is a little more involved, but it lets you upload arbitrary-sized videos (up to the current 128 GB limit on YouTube) from small functions.

Take care to handle failures properly (for example, by re-inserting the failed chunk's message into the Pub/Sub topic).



Source: https://stackoverflow.com/questions/53285089/firebase-cloud-function-error-memory-limit-exceeded-function-invocation-was
