FTP to Azure Storage Blob (triggered processing)

醉酒成梦 2021-01-21 11:06

I want to transfer encrypted files from an FTP server to an Azure Blob Storage container.

Here is the workflow in question:

[workflow diagram: CSV]

5 Answers
  • 2021-01-21 11:24

    I would recommend using an Azure Function or a WebJob.

    Here are two patterns:

    - Using Docker containers to perform a transform (a copy, in this case): https://azure.microsoft.com/en-us/blog/microsoft-azure-block-blob-storage-backup/
    - Using a function to perform an operation after a blob-created event: https://cmatskas.com/copy-azure-blob-data-between-storage-accounts-using-functions/
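    As a minimal sketch of the second pattern (reacting to a blob-created event with a function), here is roughly what a blob-triggered Azure Function looks like in Python; the container name, binding names and connection setting are placeholders, not details from the question.

    ```python
    # __init__.py -- blob-triggered Azure Function (Python v1 programming model).
    # The trigger itself is declared in function.json, roughly:
    # { "bindings": [ { "name": "inputblob", "type": "blobTrigger", "direction": "in",
    #                   "path": "incoming/{name}", "connection": "AzureWebJobsStorage" } ] }
    import logging

    import azure.functions as func


    def main(inputblob: func.InputStream) -> None:
        # Runs every time a new blob lands in the "incoming" container.
        logging.info("Processing blob: %s (%s bytes)", inputblob.name, inputblob.length)
        data = inputblob.read()
        # ... copy / transform / decrypt the bytes here ...
    ```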

    Please let me know if you have any additional questions.

  • 2021-01-21 11:25

    There are a few key factors you should take into consideration when choosing between Azure WebJobs and Azure Functions.

    Azure Functions have two types of billing schemes: Consumption Plan and App Service Plan.

    On the Consumption plan you pay only for the time your function is actually running; however, under the Consumption plan a function execution cannot run longer than 10 minutes. This means that if your jobs take more than 10 minutes, the Consumption plan is not for you.

    The App Service plan is the same plan used by Azure WebJobs; there is no execution time limit there (as per the documentation).
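    For reference, the execution timeout is configured in host.json; a minimal sketch is below (the 10-minute value shown is the maximum the Consumption plan allows, per the documentation):

    ```json
    {
      "version": "2.0",
      "functionTimeout": "00:10:00"
    }
    ```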

    In general, Azure Functions are good when you need flexible logic with different triggers, etc.

  • 2021-01-21 11:26

    After some research, and based on the answer from evilSnobu and the comments from Johns-305, I figured out that the best way to do this is the following.

    Note: I already have an Azure API App developed to do the content decryption.

    Based on this comparison grid, the best choice here is clearly Logic Apps for designing my workflow:

    Inside my Logic App

    1. Create an FTP trigger: when a file is added on the FTP server -> create a blob in Azure Storage and delete the file from the FTP server.
    2. Create an Azure Function

      (Azure Functions vs. WebJobs comparison in the grid below)

      Based on a blob-creation trigger: when a blob is created, call the decryption API App.

    3. For granularity reasons, and so that each Azure Function does only one elementary job, I create a second Azure Function that parses the file and creates per-version folders based on the version field in the file's content (a sketch follows after this list).
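    A rough sketch of what that second function could look like, assuming the decrypted files are CSVs with a version column; the container name, field name and connection-string variable are placeholder assumptions, not details from the original post.

    ```python
    # Second blob-triggered function: parse the decrypted CSV and re-upload it under
    # a per-version "folder" (virtual directory) in another container.
    import csv
    import io
    import os

    import azure.functions as func
    from azure.storage.blob import BlobServiceClient


    def main(decryptedblob: func.InputStream) -> None:
        data = decryptedblob.read()

        # Read the "version" field from the first data row of the CSV (assumed column name).
        reader = csv.DictReader(io.StringIO(data.decode("utf-8")))
        first_row = next(reader)
        version = first_row["version"]

        # Re-upload the file under a virtual folder named after the version.
        service = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
        blob_name = decryptedblob.name.split("/")[-1]
        target = service.get_blob_client(container="processed", blob=f"{version}/{blob_name}")
        target.upload_blob(data, overwrite=True)
    ```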

    And based on the following grid, we can see why Azure Functions fit better than WebJobs in my case.

    Finally, to summarize: in my case I need a developer's view of my solution, which is mainly why I needed the Logic App. Then I have two elementary tasks that are trigger-based rather than continuous, so they are better suited to Azure Functions and are a lot cheaper (since the files are not big and processing will be very quick).

  • 2021-01-21 11:29

    You can achieve this with a Logic App and a Function App as follows:

    1. Create an FTP trigger (when the file arrives).
    2. If it is a simple encode/decode, you can use the corresponding shape; otherwise, create an Azure Function on the Consumption plan (so pricing follows usage) that implements the encryption/decryption, with the data passed in from the FTP trigger shape. This requires coding; you can develop it with VS Code or Visual Studio (a sketch follows after this list).
    3. Then you can parse the output of the Azure Function with the Parse shape, or use the Transform shape for your data formats (XML, JSON, etc.), and you can decrypt again using the same Azure Function you wrote above, just with different methods inside it.
    4. Finally, use the Blob shape to push the output of the decryption to the blob storage container.
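    As a sketch of the Azure Function mentioned in step 2, here is an HTTP-triggered function the Logic App could call with the file content coming from the FTP trigger shape. Fernet is used purely as an illustration (the real scheme depends on how the files were encrypted), and the key's environment-variable name is an assumption.

    ```python
    # HTTP-triggered Azure Function: receives ciphertext from the Logic App and
    # returns the decrypted bytes so the workflow can pass them to the next shape.
    import os

    import azure.functions as func
    from cryptography.fernet import Fernet  # illustrative symmetric scheme only


    def main(req: func.HttpRequest) -> func.HttpResponse:
        ciphertext = req.get_body()  # raw bytes forwarded by the Logic App
        key = os.environ["DECRYPTION_KEY"].encode()  # assumed app setting name

        plaintext = Fernet(key).decrypt(ciphertext)

        return func.HttpResponse(body=plaintext, status_code=200,
                                 mimetype="application/octet-stream")
    ```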

    Logic Apps gives you a wide range of connectors, making it easy to connect to different artefacts with a workflow approach; you can also do transformations with XSLT or Liquid using an Integration Account if needed.

    Hope this helps. Cheers!

  • 2021-01-21 11:33

    Don't overengineer it.

    Use a Logic App to poll the FTP server, detect new files, and place them in blob storage.

    Create a blob-triggered Azure Function (Consumption Plan, v2 runtime) and do your data transformation in code (in v2 you have a choice between TypeScript, JavaScript, C# and Python). Write the results to blob storage with a blob output binding.
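    A minimal sketch of that blob-triggered function with a blob output binding, in Python (one of the v2-runtime language options mentioned above); container paths and binding names are placeholders.

    ```python
    # function.json would declare both bindings, roughly:
    # { "bindings": [
    #     { "name": "inblob",  "type": "blobTrigger", "direction": "in",
    #       "path": "incoming/{name}",  "connection": "AzureWebJobsStorage" },
    #     { "name": "outblob", "type": "blob",        "direction": "out",
    #       "path": "processed/{name}", "connection": "AzureWebJobsStorage" } ] }
    import azure.functions as func


    def main(inblob: func.InputStream, outblob: func.Out[bytes]) -> None:
        data = inblob.read()

        # ... do the data transformation in code here ...
        transformed = data

        # The runtime writes whatever is set on the output binding to the target blob.
        outblob.set(transformed)
    ```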

    Optional: have a second Logic App trigger on the resulting blobs and e-mail/text you notifications.
