untarring files to S3 fails, not sure why

Backend · Unresolved · 2 answers · 497 views

一个人的身影 2021-01-07 03:53

(new information below) I am trying to set up a Lambda function that reacts to uploaded .tgz files by uncompressing them and writing the results back to S3. The unzip and unt…

2 Answers
  • 2021-01-07 04:25

    Your Body value here is a Stream object; in that case you will need to use .toString()

    var aws = require('aws-sdk');
    var s3 = new aws.S3({apiVersion: '2006-03-01'});
    var zlib = require('zlib');
    var tar = require('tar');
    var fstream = require('fstream');
    
    fstream.Reader({'path': 'testdata.tar.gz'})
        .pipe(zlib.createGunzip())   // decompress the gzip layer
        .pipe(tar.Parse())           // emits one 'entry' event per tar member
        .on('entry', function(entry) {
            var filename = entry.path;
            console.log('got ' + entry.type + ' ' + filename);
            if (entry.type == 'File') {
                if (1) { // switch between working and nonworking cases
                    s3.upload({Bucket: 'my_bucket', Key: 'gunzip-test/' + filename, Body: entry.toString()}, {},
                              function(err, data) {
                                  if (err)
                                      console.log('ERROR!', err);
                                  else
                                      console.log('OK');
                              });
                }
                else {
                    entry.pipe(fstream.Writer({ 'path': '/tmp/mytest/' + filename }));
                }
            }
        });
    
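    Note that calling .toString() directly on a stream object may not give you the file contents; a more robust pattern is to buffer the entry's data first and pass the resulting Buffer as Body. A minimal sketch using only Node's stream module — the stand-in entry stream and the collectStream helper below are hypothetical, not part of the original code:

    ```javascript
    var stream = require('stream');

    // Collect all chunks from a readable stream into a single Buffer,
    // which can then be passed as Body to s3.upload().
    function collectStream(readable, callback) {
        var chunks = [];
        readable.on('data', function(chunk) { chunks.push(chunk); });
        readable.on('end', function() { callback(null, Buffer.concat(chunks)); });
        readable.on('error', callback);
    }

    // Stand-in for a tar entry stream (hypothetical test data).
    var entry = new stream.PassThrough();
    collectStream(entry, function(err, body) {
        if (err) throw err;
        console.log(body.toString());   // the buffered file contents
    });
    entry.end('hello from the tar entry');
    ```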
  • 2021-01-07 04:31

    In my case, running the stream through stream.PassThrough helped.

    var PassThrough = require('stream').PassThrough;
    
    var stream = getStreamSomeHow();
    var passthrough = new PassThrough();
    
    stream.pipe(passthrough);
    
    s3.upload({..., Body: passthrough});
    
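    The idea is that PassThrough hands s3.upload a plain readable stream it can consume. A self-contained sketch with a stand-in source stream — the source data is hypothetical, and the actual upload call is left as a comment so the snippet runs without the SDK:

    ```javascript
    var PassThrough = require('stream').PassThrough;
    var Readable = require('stream').Readable;

    // Stand-in for the original source stream (e.g. a tar entry).
    var source = Readable.from(['file ', 'contents']);

    // Re-pipe through PassThrough so the consumer sees an ordinary stream.
    var passthrough = new PassThrough();
    source.pipe(passthrough);

    // In the real code, s3.upload({Bucket: ..., Key: ..., Body: passthrough}, cb)
    // would consume passthrough here; we just collect it to show it works.
    var chunks = [];
    passthrough.on('data', function(c) { chunks.push(c); });
    passthrough.on('end', function() {
        console.log(Buffer.concat(chunks).toString());   // prints "file contents"
    });
    ```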