I can't find any examples of how to use the Python google-cloud-storage batch functionality. I see it exists here.
I'd love a concrete example. Let's say I want to delete a bunch of blobs in a single batch request.
TL;DR - Just send all the requests within the batch() context manager (available in the google-cloud-python library).
Try this example:
from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
# Accumulate the iterated results in a list prior to issuing
# batch within the context manager
blobs_to_delete = [blob for blob in bucket.list_blobs(prefix="my/prefix/here")]
# Use the batch context manager to delete all the blobs
with storage_client.batch():
    for blob in blobs_to_delete:
        blob.delete()
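The same pattern works for other mutating calls, not just deletes. For instance, here is a sketch of batching metadata updates on the same set of blobs (blobs_to_label and the "processed" key are just hypothetical names):

# Patch a metadata key onto every matching blob in one batched round trip
blobs_to_label = list(bucket.list_blobs(prefix="my/prefix/here"))

with storage_client.batch():
    for blob in blobs_to_label:
        blob.metadata = {"processed": "true"}  # hypothetical metadata key
        blob.patch()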
You only need to worry about the 100-requests-per-batch limit if you're calling the REST API directly. The batch() context manager automatically takes care of this restriction and will issue multiple batch requests if needed.
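If you'd still rather cap the batch size explicitly (mirroring the REST limit yourself), a minimal sketch along these lines works, reusing storage_client and blobs_to_delete from the snippet above; chunked is just a hypothetical helper, not part of the library:

from itertools import islice

def chunked(iterable, size):
    # Hypothetical helper: yield successive lists of at most `size` items
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

# Issue one batch request per group of 100 deletes
for group in chunked(blobs_to_delete, 100):
    with storage_client.batch():
        for blob in group:
            blob.delete()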