Question
I'm running tasks on Microsoft Azure Batch, where each task creates a set of files on the node. I have to copy these files to Blob Storage.
The tasks are created and managed from a VM which is not part of the Batch pool.
I'm able to access the node files and I can write their content to Blob Storage; however, this means I read each file as a string on my driving VM and then upload it to Blob Storage:
// Read the node file into memory on the driving VM, then upload it.
var container = BlobClient.GetContainerReference(containerName);
container.CreateIfNotExists();

var content = nodeFile.ReadAsString();
var blob = container.GetBlockBlobReference(nodeFile.Name);
blob.UploadText(content);
To avoid this extra traffic, does anybody know a way to upload the files directly to Blob Storage?
I have no control over the exe in the task, so uploading from the task itself is not an option.
Answer 1:
Updated answer (2017-10-27):
You can now upload artifacts directly from your task using task output files, with API versions 2017-05-01 and later on Virtual Machine Configuration pools.
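A minimal sketch of declaring an output file on a task with the Batch .NET SDK, so the Batch service itself uploads results to Blob Storage when the task succeeds. The command line, file pattern, and `containerSasUrl` (a pre-generated container SAS URL with write permission) are assumptions for illustration:

```csharp
// Sketch: the Batch service uploads files matching the pattern to the
// given blob container after the task completes successfully.
var task = new CloudTask("myTask", "myprogram.exe arg1 arg2")
{
    OutputFiles = new List<OutputFile>
    {
        new OutputFile(
            filePattern: @"output\*.txt",
            destination: new OutputFileDestination(
                new OutputFileBlobContainerDestination(containerSasUrl)),
            uploadOptions: new OutputFileUploadOptions(
                OutputFileUploadCondition.TaskSuccess))
    }
};
```

With this approach the files never pass through the driving VM: the compute node uploads them to storage directly.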
Original answer:
You can upload to storage directly from the compute nodes if you are able to wrap your executable in a bat/cmd file or shell script. You can use AzCopy if your nodes run Windows, or blobxfer if they run Linux (or Windows), to transfer your files after your program has exited. You will need to install the program as part of your compute node start task, install it as part of a job preparation task, or include it as part of your resource files (if AzCopy), so it is available to your task.
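The start-task installation mentioned above could look like the following sketch for a Windows pool, staging AzCopy onto each node via resource files. The `azCopyResourceUrl` (a blob URL for the AzCopy executable) is an assumption for illustration:

```csharp
// Sketch: stage AzCopy.exe onto every compute node before tasks run.
pool.StartTask = new StartTask
{
    CommandLine = "cmd /c echo AzCopy staged via resource files",
    ResourceFiles = new List<ResourceFile>
    {
        // Downloaded into the start task working directory on each node.
        ResourceFile.FromUrl(azCopyResourceUrl, filePath: "AzCopy.exe")
    },
    WaitForSuccess = true
};
```

On Linux pools you would instead install blobxfer (for example via pip) in the start task or job preparation task command line.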
For example on Windows nodes:
@echo off
myprogram.exe arg1 arg2
set /a rc=%ERRORLEVEL%
REM assuming return code of 0 is success
IF %rc% EQU 0 (
AzCopy.exe <azcopy args>
)
exit /b %rc%
For example on Linux nodes:
#!/usr/bin/env bash
set -e
# your program below
myprogram arg1 arg2
# invoke blobxfer to transfer output data to storage, see docs for more info
blobxfer <blobxfer args>
Source: https://stackoverflow.com/questions/33089562/azure-batch-nodefiles-to-blob-stotage