Firebase storage artifacts

Asked by 无人及你 · 2020-11-28 13:15

I'm trying to understand what eu.artifacts.%PROJECT NAME%.appspot.com is. It's currently taking up 800 MB of storage from my daily 5 GB limit. It contains only <

5 answers
  • 2020-11-28 13:37

Adding to @yo1995's response: you can delete the bucket without needing to go into GCP. Staying in Firebase, go to Storage, then "Add a Bucket". From there you will see the option to import the gcp and artifact buckets, and you can then delete the buckets accordingly.

    [Image: Artifact Bucket Location]

  • 2020-11-28 13:47

    If you are using Cloud Functions, the files you're seeing are related to a recent change in how the runtime (for Node 10 and up) is built.

    Cloud Functions now uses Cloud Build to create the runtime for your Cloud Functions, and Cloud Build in turn uses Container Registry to store those runtimes, which places them in a new Cloud Storage bucket under your project.

    For more on this, also see the entry in the Firebase pricing FAQ: "Why will I need a billing account to use Node.js 10 or later for Cloud Functions for Firebase?"

    Also see this thread on the firebase-talk mailing list about these artifacts.
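    To check how much space this bucket is actually using, here is a minimal sketch with `gsutil`. The bucket name below is an assumption based on the question (the `eu.` prefix depends on your deployment region, and `my-project` is a placeholder project ID):

    ```shell
    # Hypothetical project ID; replace with your own.
    PROJECT_ID="my-project"
    # Container Registry stores build artifacts in a bucket named like this
    # (the region prefix, e.g. "eu.", depends on where your functions deploy):
    BUCKET="gs://eu.artifacts.${PROJECT_ID}.appspot.com"
    echo "${BUCKET}"
    # With the Cloud SDK installed and authenticated, this would report total size:
    # gsutil du -sh "${BUCKET}"
    ```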

  • 2020-11-28 13:52

    Adding to @yo1995's answer:
    I consulted Firebase Support, and they confirmed that the artifacts bucket should not be deleted. Basically, the artifacts are used to help build the final image that is stored in the "gcf-sources" bucket.

    To quote them directly:
    "you are free to delete the contents in "XX.artifacts", but please leave the bucket untouched, it will be used in the following deployment cycles."

    There might be some unintended behaviour if you delete the artifacts bucket entirely.
    Also "The team is working to clean up this bucket automatically, but there are some restrictions that they need to solve before publishing the solution."

    For the time being, I set the bucket to auto-delete files older than one day.
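    The auto-delete setting above can also be applied from the command line. A minimal sketch, assuming the `gsutil` CLI is installed and the bucket name is a placeholder:

    ```shell
    # Write a lifecycle policy that deletes objects older than 1 day.
    cat > lifecycle.json <<'EOF'
    {
      "rule": [
        {
          "action": {"type": "Delete"},
          "condition": {"age": 1}
        }
      ]
    }
    EOF
    cat lifecycle.json
    # With credentials configured, this would apply the policy to the
    # artifacts bucket (bucket name is a placeholder):
    # gsutil lifecycle set lifecycle.json gs://eu.artifacts.my-project.appspot.com
    ```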

  • 2020-11-28 13:56

    I consulted GCP support, and here are a few takeaways:

    • Cloud Functions caused the surge in storage usage.
    • Since these artifacts are not stored in the default bucket, you will be charged even if your total stored bytes have not reached the free-tier limit.
    • You can remove the artifact bucket at https://console.cloud.google.com/storage/browser. According to the support staff:

    Regarding the artifacts bucket, you can actually get rid of them, as they are storing previous versions of the function. However, I do not recommend deleting the "gcf-sources..." bucket(s) , as it contains the current image, so deleting this bucket would mess up your function.

    I tried removing it entirely, and so far it has not caused any trouble. I'll update this answer if it breaks anything later.


    Edit 201118: See the comment below; you may need to keep the bucket itself while removing all of its contents.
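    One way to empty the bucket while leaving it in place is to run `gsutil rm` on its contents only. A sketch with a placeholder bucket name (the command is echoed rather than executed, since it requires authenticated access to your project):

    ```shell
    # Placeholder bucket; substitute your own project's artifacts bucket.
    BUCKET="gs://eu.artifacts.my-project.appspot.com"
    # Targeting "${BUCKET}/**" removes every object but leaves the bucket itself,
    # unlike "gsutil rm -r ${BUCKET}", which would also delete the bucket.
    echo gsutil -m rm "${BUCKET}/**"
    # Remove the "echo" above to actually run it (requires gsutil and auth).
    ```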

  • 2020-11-28 13:56

    As an alternative, you can create a lifecycle rule to delete the objects inside the folder. Set the age to 1 day, so it deletes all objects in the folder that are more than one day old.

    [Image: Lifecycle rule]

    [Image: Set Condition]
