GCP Cloud Function - ERROR fetching storage source during build/deploy

广开言路 2021-02-19 12:23

Running into problems building/deploying functions. When trying to programmatically deploy the function, the builder logs report an ERROR fetching the storage source.


7 Answers
  • 2021-02-19 12:49

    I can confirm the issue came with the node update to 15.6.0. I'm running Manjaro and had just updated my OS; rolling back to node 12 (with nvm) solved the issue!
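
    For anyone in the same spot, a minimal sketch of the rollback with nvm (assuming nvm is already installed; the exact 12.x patch version doesn't matter):

    nvm install 12     # install the latest Node 12.x release
    nvm use 12         # switch the current shell to it
    node --version     # verify; should print v12.x.x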

  • 2021-02-19 12:51

    I had a similar issue today. It turned out that on one function I was using a timeoutSeconds value that exceeded the maximum of 540 seconds specified here.
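
    For illustration, the same ceiling applies when deploying from the gcloud CLI; this answer set timeoutSeconds in code, and --timeout is the CLI equivalent (the function name and flags below are placeholders, not the asker's actual setup):

    gcloud functions deploy myFunction \
        --runtime=nodejs12 \
        --trigger-http \
        --timeout=540s    # 540 seconds (9 minutes) is the documented maximum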

  • 2021-02-19 12:53

    Had the same error message. After spending a day troubleshooting, I finally found a way.

    I noticed I had node@15 installed on my PC but node@12 configured for the firebase functions.

    All I needed to do was:

    • install node@12 on my PC,
    • unlink the previous node (i.e. node@15), and
    • link to the newly installed node@12.

    Boom! It uploaded.
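
    If you're on macOS, the link/unlink wording above suggests Homebrew; a sketch of that sequence, assuming node was installed via brew, would be:

    brew install node@12                     # install Node 12 alongside the existing version
    brew unlink node                         # remove the node@15 symlinks
    brew link --overwrite --force node@12    # point node/npm at the keg-only 12.x install
    node --version                           # should now report v12.x.x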

  • 2021-02-19 12:56

    It has nothing to do with permissions; it must be something else. Maybe Google sets limits on how big a Cloud Function's source can be.

  • 2021-02-19 13:04

    As Prodigy mentioned, something must have broken Cloud Functions deployment in the latest Node v15.6.0 (or another recent release).

    Had the same issue today with firebase deploy. Resolved it by rolling back to Node 12 (and I expect any other recent version besides the broken one works fine too).

  • 2021-02-19 13:10

    I had the same problem deploying cloud functions on my newly created Firebase projects. My local version of node was 15.

    For beginners (like me) who were struggling to downgrade node on a Mac, you can use the following commands:

    sudo npm install -g n    # install the "n" Node version manager globally
    sudo n 12                # switch the active Node to the latest 12.x release
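
    Once node --version reports 12.x, re-running the deploy should go through, for example:

    firebase deploy --only functions    # redeploy just the Cloud Functions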
    