Node.js POST File to Server

Backend · Open · 1 answer · 1511 views
遇见更好的自我 · asked 2021-02-10 01:43

I am trying to write an app that will allow my users to upload files to my Google Cloud Storage account. In order to prevent overwrites and to do some custom handling and logging…

1 Answer
  •  太阳男子
     answered 2021-02-10 02:04

    I believe that if you want to do a POST, you have to use a Content-Type: multipart/form-data; boundary=myboundary header. Then, in the body, write() something like this for each string field (line breaks must be \r\n):

    --myboundary
    Content-Disposition: form-data; name="field_name"
    
    field_value
    

    And then for the file itself, write() something like this to the body:

    --myboundary
    Content-Disposition: form-data; name="file"; filename="urlencoded_filename.jpg"
    Content-Type: image/jpeg
    Content-Transfer-Encoding: binary
    
    binary_file_data
    

    The binary_file_data is where you use pipe():

    var fileStream = fs.createReadStream("path/to/my/file.jpg");
    fileStream.pipe(requestToGoogle, {end: false});
    fileStream.on('end', function() {
        requestToGoogle.end("\r\n--myboundary--\r\n\r\n");
    });
    

    The {end: false} prevents pipe() from automatically closing the request because you need to write one more boundary after you're finished sending the file. Note the extra -- on the end of the boundary.
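
    Putting those pieces together, a rough end-to-end sketch of the POST could look like the following. This uses Node's built-in https module; the host, path, bearer token, field values and filename are placeholders (assumptions for illustration), not a verified Google upload endpoint:

    var https = require('https');
    var fs = require('fs');

    var boundary = 'myboundary';

    // Placeholder request options -- substitute your real upload URL and credentials.
    var requestToGoogle = https.request({
        method: 'POST',
        host: 'storage.googleapis.com',   // assumption
        path: '/my-bucket',               // assumption
        headers: {
            'Content-Type': 'multipart/form-data; boundary=' + boundary,
            'Authorization': 'Bearer MY_ACCESS_TOKEN'   // placeholder
        }
    }, function(res) {
        console.log('Google responded with status', res.statusCode);
    });

    // One plain string field.
    requestToGoogle.write(
        '--' + boundary + '\r\n' +
        'Content-Disposition: form-data; name="field_name"\r\n' +
        '\r\n' +
        'field_value\r\n'
    );

    // The file part's headers, then the raw bytes via pipe().
    requestToGoogle.write(
        '--' + boundary + '\r\n' +
        'Content-Disposition: form-data; name="file"; filename="file.jpg"\r\n' +
        'Content-Type: image/jpeg\r\n' +
        'Content-Transfer-Encoding: binary\r\n' +
        '\r\n'
    );

    var fileStream = fs.createReadStream('path/to/my/file.jpg');
    fileStream.pipe(requestToGoogle, { end: false });
    fileStream.on('end', function() {
        // CRLF to end the file data, then the closing boundary (note the trailing --).
        requestToGoogle.end('\r\n--myboundary--\r\n\r\n');
    });

    Without a Content-Length header, Node will send this with chunked transfer encoding, which leads straight into the gotcha below.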

    The big gotcha is that Google may require a content-length header (very likely). If that is the case, then you cannot stream a POST from your user into a POST to Google, because you won't reliably know what the content-length is until you've received the entire file.

    The content-length header's value should be a single number for the entire body. The simple way to do this is to call Buffer.byteLength(body) on the entire body, but that gets ugly quickly if you have large files, and it also kills the streaming. An alternative would be to calculate it like so:

    var body_before_file = "..."; // string fields + boundary and metadata for the file
    var body_after_file = "\r\n--myboundary--\r\n\r\n"; // closing boundary, preceded by the CRLF that ends the file data
    var fs = require('fs');
    fs.stat(local_path_to_file, function(err, file_info) {
        var content_length = Buffer.byteLength(body_before_file) +
                file_info.size +
                Buffer.byteLength(body_after_file);
        // create the request to Google, write content-length and the other headers,
        // write() the body_before_file part,
        // and then pipe the file and end the request like we did above
    });

    But that still kills your ability to stream from the user to Google: the file has to be saved to the local disk first so that you can determine its length.
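
    If you stick with POST, the pragmatic workaround is to let formidable buffer the upload to a temporary file first, and only then compute the length and forward it. A minimal sketch, assuming the form's file field is named "file", that userRequest is the incoming HTTP request, and that postFileToGoogle() is a hypothetical helper wrapping the POST shown above:

    var formidable = require('formidable');

    var form = new formidable.IncomingForm();
    form.parse(userRequest, function(err, fields, files) {
        if (err) return console.error(err);
        // formidable has already written the upload to a temp file, so its size
        // on disk gives you the file portion of the content-length calculation.
        var upload = files.file;                     // assumes the field is named "file"
        postFileToGoogle(upload.path, upload.size);  // hypothetical helper
    });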

    Alternate option

    ...now, after going through all of that, PUT might be your friend here. According to https://developers.google.com/storage/docs/reference-methods#putobject you can use a Transfer-Encoding: chunked header, so you don't need to know the file's length. And I believe that the entire body of the request is just the file, so you can use pipe() and just let it end the request when it's done. If you're using https://github.com/felixge/node-formidable to handle uploads, then you can do something like this:

    incomingForm.onPart = function(part) {
        if (part.filename) {
            var req = ... // create a PUT request to google and set the headers
            part.pipe(req);
        } else {
            // let formidable handle all non-file parts
            incomingForm.handlePart(part);
        }
    }
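
    For the elided var req = ... part, a minimal sketch of that PUT request, assuming the XML API's PUT Object endpoint at BUCKET.storage.googleapis.com and a bearer token (both placeholders you would need to adapt):

    var https = require('https');

    incomingForm.onPart = function(part) {
        if (part.filename) {
            // Placeholder bucket, object name and credentials.
            var req = https.request({
                method: 'PUT',
                host: 'my-bucket.storage.googleapis.com',
                path: '/' + encodeURIComponent(part.filename),
                headers: {
                    'Content-Type': part.mime || 'application/octet-stream',
                    'Authorization': 'Bearer MY_ACCESS_TOKEN'
                    // No Content-Length, so Node sends Transfer-Encoding: chunked.
                }
            }, function(res) {
                console.log('PUT finished with status', res.statusCode);
            });
            part.pipe(req);   // pipe() ends the request when the part ends
        } else {
            // let formidable handle all non-file parts
            incomingForm.handlePart(part);
        }
    };

    This way the user's upload streams straight through to Google without ever touching the local disk.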
    
