Set Google Storage Bucket's default cache control

Front-end · Unresolved · 6 answers · 1876 views

孤城傲影 2021-01-11 16:41

Is there any way to set a bucket's default cache control? I'm trying to override the `public, max-age=3600` set at the bucket level every time a new object is created.


6 Answers
  •  一生所求
    2021-01-11 17:06

    One way is to write a Google Cloud Storage trigger, i.e. a background Cloud Function that runs whenever an object is finalized in the bucket.

    This function sets the Cache-Control metadata field for every new object in a bucket:

    from google.cloud import storage
    
    CACHE_CONTROL = "private"
    
    def set_cache_control_private(data, context):
        """Background Cloud Function to be triggered by Cloud Storage.
           This function changes the Cache-Control metadata.
    
        Args:
            data (dict): The Cloud Functions event payload.
            context (google.cloud.functions.Context): Metadata of triggering event.
        Returns:
            None; the output is written to Stackdriver Logging
        """
    
        print('Setting Cache-Control to {} for: gs://{}/{}'.format(
                CACHE_CONTROL, data['bucket'], data['name']))
        storage_client = storage.Client()
        bucket = storage_client.get_bucket(data['bucket'])
        blob = bucket.get_blob(data['name'])
        blob.cache_control = CACHE_CONTROL
        blob.patch()
    

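    For reference, the handler above only reads two fields from the event payload: the bucket and the object name. A minimal sketch of that lookup, using a made-up payload (real `google.storage.object.finalize` events carry many more keys, such as `contentType` and `timeCreated`):

```python
# Hypothetical example payload; only the two keys the function reads.
data = {"bucket": "my-bucket", "name": "images/logo.png"}

def gs_uri(data):
    """Build the gs:// URI that the function logs for a given event payload."""
    return "gs://{}/{}".format(data["bucket"], data["name"])

print(gs_uri(data))  # gs://my-bucket/images/logo.png
```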
    You also need a requirements.txt file in the same directory so that the storage import resolves. It must list the google-cloud-storage package:

    google-cloud-storage==1.10.0
    

    You have to deploy the function to a specific bucket:

    gcloud beta functions deploy set_cache_control_private \
        --runtime python37 \
        --trigger-resource gs:// \
        --trigger-event google.storage.object.finalize
    

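    Before deploying, the handler's logic can be exercised locally by stubbing out the storage client with `unittest.mock`; the fake payload and the injected-client variant below are assumptions for testing, not part of the deployed function:

```python
from unittest import mock

CACHE_CONTROL = "private"

def set_cache_control_private(data, context, storage_client):
    # Same logic as the deployed function, but with the client passed in
    # so a mock can stand in for google.cloud.storage in local tests.
    bucket = storage_client.get_bucket(data["bucket"])
    blob = bucket.get_blob(data["name"])
    blob.cache_control = CACHE_CONTROL
    blob.patch()

# Wire up a fake client: get_bucket(...).get_blob(...) returns fake_blob.
fake_blob = mock.Mock()
fake_client = mock.Mock()
fake_client.get_bucket.return_value.get_blob.return_value = fake_blob

set_cache_control_private({"bucket": "b", "name": "obj.txt"}, None, fake_client)

assert fake_blob.cache_control == CACHE_CONTROL
assert fake_blob.patch.called
```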
    For debugging purposes you can retrieve the logs with the gcloud command as well:

    gcloud functions logs read --limit 50
    
