Question
The only docs I can find about using GCF+GCS are https://cloud.google.com/functions/docs/tutorials/storage. AFAICT they just show how to use GCS events to trigger GCF.
The docs for GCF dependencies only mention node modules. Is it possible for GCF code to read from a GCS bucket? Is it simply a case of requiring a node module that knows how to communicate with GCS, and if so, are there any examples of that?
Answer 1:
Yes, but note that /tmp is the only writable path in Cloud Functions and is backed by memory, so you'll need enough RAM allocated to your function to hold the downloaded file.
// Create the client and file handle outside the handler so they are reused across invocations.
var storage = require('@google-cloud/storage');
const gcs = storage({projectId: '<your_project>'});
const bucket = gcs.bucket('<your_bucket>');
const file = bucket.file('<path/to/your_file>');

exports.gcstest = (event, callback) => {
  // /tmp is an in-memory filesystem, so the download counts against the function's RAM.
  file.download({destination: '/tmp/test'}, (err) => {
    if (err) {
      console.error(err);
      callback(err);
    } else {
      callback();
    }
  });
};
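If you only need the contents in memory rather than on disk, a minimal variation of the sketch above (same bucket and file placeholders) omits the destination, and the callback receives the contents as a Buffer:

exports.gcstest = (event, callback) => {
  // Without a destination, download() hands back the object's contents directly.
  file.download((err, contents) => {
    if (err) return callback(err);
    console.log(contents.toString());
    callback();
  });
};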
Answer 2:
Google Cloud Functions will just execute the code you upload. If your code includes a library that talks to Google Cloud Storage, then you will be able to connect to Cloud Storage the same way you would connect to any other API or service.
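For example, declaring the storage client as an ordinary node module dependency in package.json is enough for Cloud Functions to install it at deploy time (a minimal sketch; the function name and version shown are only illustrative):

{
  "name": "gcstest",
  "version": "1.0.0",
  "dependencies": {
    "@google-cloud/storage": "^1.0.0"
  }
}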
There are several ways to connect to Google Cloud Storage, such as the REST API, OAuth, or signed URLs. All of these methods are usable from Google Cloud Functions, so I would recommend having a look at the Google Cloud Storage documentation to find the best fit for your case.
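As one illustration of the signed-URL option, here is a minimal sketch using the Node.js client (the bucket and object names are placeholders, and the 15-minute expiry is an arbitrary choice; generating signed URLs requires signing credentials, which may need extra setup in some environments):

const storage = require('@google-cloud/storage')();

// Ask GCS for a time-limited URL that grants read access to the object.
storage
  .bucket('<your_bucket>')
  .file('<path/to/your_file>')
  .getSignedUrl({action: 'read', expires: Date.now() + 15 * 60 * 1000}, (err, url) => {
    if (err) return console.error(err);
    console.log(url);
  });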
Answer 3:
Yes, you can read from and write to a storage bucket:
const storage = require('@google-cloud/storage')();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file');

// Stream the object's contents directly, without writing to disk.
file.createReadStream()
  .on('data', (chunk) => {
    // do something with each chunk
  })
  .on('error', (err) => console.error(err));
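For the write direction, a minimal sketch (reusing the myBucket client above; the object name and contents are placeholders) uses file.save():

// Upload a string or Buffer as the object's contents.
myBucket.file('my-output-file').save('hello from a Cloud Function', (err) => {
  if (err) console.error(err);
});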
For more information, check the Google Cloud Storage documentation.
Source: https://stackoverflow.com/questions/49201011/can-a-cloud-function-read-from-cloud-storage