Question
I am trying to access a blob file in PowerShell and save it directly to an FTP server. How can this be done? Can it be done via the blob file's URL? Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
A related question about this same download: is there a way to copy just the file, without recreating its subdirectories? For example, I have a blob such as dataload/files/my_blob_file. When I run Get-AzureStorageBlobContent -Destination $destination_path, the file is saved with the same subfolder structure. Can I instead use a custom path, or drop the subfolders and save the file as c:\myfolder\my_blob_file rather than c:\myfolder\dataload\files\my_blob_file? I would want to do the same on the FTP server above.
Answer 1:
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows-based, you can simply run the script on the FTP server itself and download the blob to a local path on that server.
Can this be done via the blob file URL?
The Get-AzureStorageBlobContent cmdlet doesn't accept a URL as a parameter, so you need to parse the URL yourself. Here is a simple demo:
# Parse a blob URL into its parts with System.Uri
# (String.Split with a "://" separator behaves differently across
# PowerShell versions, so System.Uri is the more reliable choice)
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
$uri = [System.Uri]$url
$Protocol  = $uri.Scheme                                 # https
$HostName  = $uri.Host                                   # accountname.blob.core.windows.net
$Container = $uri.Segments[1].TrimEnd('/')               # testcontainer
$BlobName  = ($uri.Segments | Select-Object -Skip 2) -join ''   # Test01.txt (keeps any "virtual folder" prefix)
Or can I somehow have the file created in Powershell memory and then use $webclient.UploadFile to save it to FTP folder?
Storing the file in memory is not a good idea, even if it can be done. As mentioned above, if your FTP server is Windows-based, please run the script on the FTP server directly.
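If you still want to push the blob to FTP from the machine running the script, a common pattern is to download the blob to a temporary local file first and then upload that file with WebClient.UploadFile, as the question suggests. A minimal sketch, assuming the classic Azure.Storage module; the account, container, blob, credentials, and FTP address below are all placeholders:

```powershell
# Download the blob to a temp file, then upload that file to the FTP server.
# All names and credentials here are placeholders for illustration.
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "<key>"
$local = Join-Path $env:TEMP "my_blob_file"
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" `
    -Destination $local -Context $ctx -Force

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppassword")
$webclient.UploadFile("ftp://ftp.example.com/myfolder/my_blob_file", $local)
$webclient.Dispose()
```

Note that the intermediate file still touches disk; this only avoids keeping the blob in a PowerShell variable.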
If the script can't be run on the server for any reason, then try sharing the folder used by the FTP service and mapping it as a network drive on the computer that will run the script. You will then be able to store the file on that network drive.
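The share-and-map approach can be sketched as follows; the server name, share name, and storage context ($ctx, created as in the demo above) are assumptions:

```powershell
# Map the FTP server's shared folder as a network drive, then download
# the blob straight into it. \\ftpserver\ftpshare is a placeholder UNC path;
# $ctx is an Azure storage context created beforehand.
New-PSDrive -Name "Z" -PSProvider FileSystem -Root "\\ftpserver\ftpshare" `
    -Credential (Get-Credential) -Persist
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" `
    -Destination "Z:\my_blob_file" -Context $ctx
```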
but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file?
Of course. Just pass the full path, including the file name you want, as the -Destination parameter.
Note: there is actually no concept of a "folder" in Azure Blob storage; the path is simply part of the blob name. When you download a blob, you can rename it by specifying a file name in the destination path, so the extra folders are not created locally.
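Downloading the nested blob straight to a flat local path can be sketched like this (account name, key, and container are placeholders):

```powershell
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "<key>"
# Because -Destination names a file rather than a folder, no
# dataload\files subfolders are created under C:\myfolder.
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" `
    -Destination "C:\myfolder\my_blob_file" -Context $ctx
```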
=========================================================================
Update:
This script is to be run in Azure as part of Azure Automation. But when I try to call the FTP server (which is currently my local machine), I get an "Unable to connect to remote server" error.
You may need Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default port 21, and I also tried using my public IP address.
Exposing your FTP server to the Internet is not recommended. I would suggest using Hybrid Runbook Worker.
Also, how can I get the content of my blob file into a Powershell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent cannot return the blob content as an in-memory object. You need to download the blob first and then read it with Get-Content. If you use a Hybrid Runbook Worker, you will be able to store the file locally.
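Reading the downloaded blob into a variable can then be sketched as (the path, container, and storage context $ctx are assumptions):

```powershell
# Download first, then read the file contents into a variable.
# $ctx is an Azure storage context created beforehand.
$local = "C:\myfolder\my_blob_file"
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" `
    -Destination $local -Context $ctx -Force
# -Raw returns the whole file as a single string instead of an array of lines.
$content = Get-Content -Path $local -Raw
```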
==============================================================================
Update 2:
I am trying to understand how to call an external FTP server (which is currently on my machine for dev/test purposes, but may reside on any other external server in production), and I have to do it from a PowerShell script in Azure Automation. So your reply "You may need Hybrid Runbook Worker to achieve your goal" will not work for me, right?
The Hybrid Runbook Worker does work for you, and it makes things easier: with a Hybrid Runbook Worker, the runbook runs on your local machine.
I'm able to download the blobs into my local machine and upload them to the public FTP server without any issue.
Are you saying that currently there is no way to upload and download files from external FTP server from Azure Powershell Automation?
I didn't manage to upload the blob to a public FTP server from Azure Automation. An exception occurs when I try to upload, and only empty files with the blob's name arrive on the FTP server. It might be a permission issue, since the PowerShell script runs in a sandbox. That's why I said a Hybrid Runbook Worker makes things easier.
Finally, please note: FTP authenticates users and transfers data in plain text, which can cause security issues. FTPS and SFTP are more secure alternatives.
Source: https://stackoverflow.com/questions/43209366/how-to-download-a-blob-file-from-azure-storage-and-save-it-to-an-ftp-server-usin