How to upload in-memory file data to Google Cloud Storage using Node.js?

Asked 2020-12-14 10:57 by 鱼传尺愫

I am reading an image from a URL and processing it. I need to upload this data to a file in Cloud Storage. Currently I am writing the data to a local file and then uploading that file; I would like to upload the in-memory data directly.
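
Roughly what I am doing now (a minimal sketch; the bucket name and temporary file path below are placeholders):

    const fs = require('fs');
    const axios = require('axios');
    const { Storage } = require('@google-cloud/storage');

    const storage = new Storage();

    // Current approach: buffer the image, write it to disk, upload the file, clean up.
    // '<bucket-name>' and './tmp-image.jpg' are placeholders.
    async function uploadViaTempFile(imageUrl) {
        const response = await axios.get(imageUrl, { responseType: 'arraybuffer' });
        fs.writeFileSync('./tmp-image.jpg', response.data);
        await storage.bucket('<bucket-name>').upload('./tmp-image.jpg');
        fs.unlinkSync('./tmp-image.jpg');
    }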

3 Answers
  • 2020-12-14 11:13

    Yes, it's possible to retrieve an image from a URL, edit it, and upload it to Google Cloud Storage (or Firebase Storage) using Node.js, without ever saving the file locally.

    This builds on Akash's answer with a complete function that worked for me, including the image manipulation step.

    Steps

    • Use axios to retrieve a stream of the image from a remote URL.
    • Use sharp to make your changes to the image.
    • Use the Google Cloud Storage library to create a file and save the image data to it in Google Cloud Storage.

    If you are a Firebase user using Firebase Storage, you must still use this library: the Firebase web SDK for Storage does not work in Node. If you created your storage bucket in Firebase, you can still access everything through the Google Cloud Storage console; it is the same underlying storage.
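
    If you do come from Firebase, a minimal sketch of getting the same bucket through the firebase-admin SDK (assuming `storageBucket` is set at initialization; the bucket name below is a placeholder) looks like this; the full example that follows uses @google-cloud/storage directly:

    const admin = require('firebase-admin');

    admin.initializeApp({
        storageBucket: '<project-id>.appspot.com'   // placeholder default bucket
    });

    // admin.storage() wraps @google-cloud/storage, so bucket() returns the same
    // Bucket object used by the Google Cloud Storage code below.
    const bucket = admin.storage().bucket();
    const gcFile = bucket.file('my-file.jpg');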

    const axios = require('axios');
    const sharp = require('sharp');
    const { Storage } = require('@google-cloud/storage');
    
    const processImage = (imageUrl) => {
        return new Promise((resolve, reject) => {
    
            // Your Google Cloud Platform project ID
            const projectId = '<project-id>';
    
            // Creates a client
            const storage = new Storage({
                projectId: projectId,
            });
    
            // Configure axios to receive a response type of stream, and get a readableStream of the image from the specified URL
            axios({
                method:'get',
                url: imageUrl,
                responseType:'stream'
            })
            .then((response) => {
    
                // Create the image manipulation pipeline: resize to 300px wide and output JPEG
                const transformer = sharp()
                    .resize(300)
                    .jpeg();

                const gcFile = storage.bucket('<bucket-path>').file('my-file.jpg');
    
                // Pipe the axios response data through the image transformer and to Google Cloud
                response.data
                .pipe(transformer)
                .pipe(gcFile.createWriteStream({
                    resumable  : false,
                    validation : false,
                    contentType: "auto",
                    metadata   : {
                        // Cloud Storage expects the camelCase `cacheControl` key here
                        cacheControl: 'public, max-age=31536000'
                    }
                }))
                .on('error', (error) => { 
                    reject(error) 
                })
                .on('finish', () => { 
                    resolve(true)
                });
            })
            .catch(err => {
                // reject() takes a single value, so wrap the cause in an Error
                reject(new Error(`Image transfer error: ${err.message}`));
            });
        })
    }
    
    processImage("<url-to-image>")
    .then(res => {
      console.log("Complete.", res);
    })
    .catch(err => {
      console.log("Error", err);
    });
    
  • 2020-12-14 11:18

    The data can be uploaded without writing to a file by using Node's streams.

    const stream = require('stream');
    const { Storage } = require('@google-cloud/storage');

    const cloudStorage = new Storage();             // Cloud Storage client
    const dataStream   = new stream.PassThrough();  // in-memory pass-through stream
    const gcFile       = cloudStorage.bucket(bucketName).file(fileName);  // bucketName and fileName defined elsewhere

    // Note: the `await` below must run inside an async function.
    
    dataStream.push('content-to-upload')
    dataStream.push(null)
    
    await new Promise((resolve, reject) => {
      dataStream.pipe(gcFile.createWriteStream({
        resumable  : false,
        validation : false,
        metadata   : {'Cache-Control': 'public, max-age=31536000'}
      }))
      .on('error', (error) => { 
        reject(error) 
      })
      .on('finish', () => { 
        resolve(true)
      })
    })
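
    If the whole payload is already in memory as a string or Buffer, the library's `File#save()` method is a shorter alternative to wiring up a PassThrough stream. A minimal sketch, reusing `cloudStorage`, `bucketName`, and `fileName` from above:

    // File#save() returns a Promise and handles the stream plumbing internally.
    const gcFile = cloudStorage.bucket(bucketName).file(fileName)

    await gcFile.save('content-to-upload', {
      resumable: false,
      metadata : { cacheControl: 'public, max-age=31536000' }
    })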
    
  • 2020-12-14 11:29

    You can also upload multiple files, for example from a NestJS controller method using AnyFilesInterceptor:

    // Requires: import * as stream from 'stream';
    //           import { Storage } from '@google-cloud/storage';
    //           import { Post, UploadedFiles, UseInterceptors } from '@nestjs/common';
    //           import { AnyFilesInterceptor } from '@nestjs/platform-express';
    @Post('upload')
    @UseInterceptors(AnyFilesInterceptor())
    async uploadFiles(@UploadedFiles() files: Array<Express.Multer.File>) {
        const storage = new Storage();
        for (const file of files) {
            const dataStream = new stream.PassThrough();
            const gcFile = storage.bucket('upload-lists').file(file.originalname);
            dataStream.push(file.buffer);
            dataStream.push(null);
            await new Promise((resolve, reject) => {
                dataStream.pipe(gcFile.createWriteStream({
                    resumable: false,
                    validation: false,
                    // Enable long-lived HTTP caching headers
                    // Use only if the contents of the file will never change
                    // (If the contents will change, use cacheControl: 'no-cache')
                    metadata: { cacheControl: 'public, max-age=31536000' }
                })).on('error', (error: Error) => {
                    reject(error);
                }).on('finish', () => {
                    resolve(true);
                });
            });
        }
    }
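
    Note that the uploads above run one at a time because each write is awaited inside the loop; if the files are independent, you can instead collect the promises and wait for them together with `Promise.all`.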
    