azure-data-factory-2

Parameterize self-hosted integration runtime in ADF ARM Template

Submitted by 核能气质少年 on 2021-01-29 10:49:19

Question: We have different ADF environments such as TST, UAT and PROD. In each environment the self-hosted IR name is different. Is it possible to parameterize the integration runtime in the ADF ARM template, so that when the ARM template is deployed through CI/CD we can pass the IR name in the DevOps variable group during the deployment? I have tried changing the parameterization template in ADF with the setting below, but the IR name is still not available in the ARM template parameter JSON 'arm_template_parameters…
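For orientation, here is a minimal sketch of a custom parameterization template entry, assuming the self-hosted IR is referenced from linked services through connectVia.referenceName (an assumption about where the name appears, not the asker's actual template). In this template syntax, "=" keeps the current value as the default of the generated ARM parameter:

```json
{
  "Microsoft.DataFactory/factories/linkedServices": {
    "*": {
      "properties": {
        "connectVia": {
          "referenceName": "="
        }
      }
    }
  }
}
```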

Azure data factory data flow silently NULLing date column

Submitted by 只愿长相守 on 2021-01-29 09:57:35

Question: I'm trying to use Azure Data Factory to upsert a CSV into an Azure SQL table. All seemed well until I checked the results. One of the columns is a nullable date. The CSV contains a value like 1/2/2020 12:00:00 AM. The data flow silently inserts a NULL instead of throwing an error because it didn't like the input. So how can I get my data flow to convert the string to a datetime properly, and then to error out on issues like this in the future? I really don't want silent failures and bad…
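A hedged sketch of mapping data flow expressions that address this, assuming the incoming string column is named myDate (illustrative). A Derived Column can parse the value with an explicit format, and the same expression wrapped in isNull() can feed a Conditional Split or Assert so rows that fail to parse are surfaced instead of landing as NULL:

```
toTimestamp(trim(myDate), 'M/d/yyyy hh:mm:ss a')

isNull(toTimestamp(trim(myDate), 'M/d/yyyy hh:mm:ss a'))
```

The format string is an assumption matching the sample value shown; adjust it if the CSV actually uses a different layout.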

Oracle Stored Procedure in Azure Data Factory V2

Submitted by 生来就可爱ヽ(ⅴ<●) on 2021-01-28 19:06:30

Question: I have created a stored procedure in Oracle. Can anyone let me know whether there is a way to execute this Oracle stored procedure from Azure Data Factory V2?
Answer 1: You could try preCopyScript in the Copy activity. You could try a Custom activity.
Source: https://stackoverflow.com/questions/51591123/oracle-store-procedure-in-azure-data-factory-v2
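A minimal sketch of the preCopyScript suggestion, assuming the Copy activity writes into Oracle (preCopyScript runs against the sink before data is written). The dataset names, source type and procedure call below are illustrative, not part of the original answer:

```json
{
  "name": "CopyWithOraclePreScript",
  "type": "Copy",
  "inputs": [ { "referenceName": "StagingDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "OracleTableDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": {
      "type": "OracleSink",
      "preCopyScript": "BEGIN my_schema.my_proc(); END;"
    }
  }
}
```

If the procedure does not need to run as part of a copy at all, the Custom activity route the answer mentions (code hosted on Azure Batch that connects to Oracle itself) is the other option.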

Retrieve blob file name in Copy Data activity

Submitted by 一笑奈何 on 2021-01-28 11:16:54

Question: I download JSON files from a web API and store them in blob storage using a Copy Data activity and binary copy. Next I would like to use another Copy Data activity to extract a value from each JSON file in the blob container and store the value together with its ID in a database. The ID is part of the filename, but is there some way to extract the filename?
Answer 1: You can do the following set of activities: 1) A Get Metadata activity: configure a dataset pointing to the blob folder, and add the…
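The answer is cut off above, but the usual shape of this pattern is sketched below as an assumption: a Get Metadata activity that lists the folder's child items, and a ForEach that loops over them so each file name is available inside the loop. Activity and dataset names are illustrative:

```json
{
  "name": "Get Metadata1",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": { "referenceName": "BlobFolderDataset", "type": "DatasetReference" },
    "fieldList": [ "childItems" ]
  }
},
{
  "name": "ForEachFile",
  "type": "ForEach",
  "dependsOn": [ { "activity": "Get Metadata1", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "items": { "value": "@activity('Get Metadata1').output.childItems", "type": "Expression" },
    "activities": []
  }
}
```

Inside the loop, @item().name returns the blob file name, which can be parsed with expression functions and passed along to the copy that writes to the database.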

Azure Data Factory V2 Copy Activity - Save List of All Copied Files

Submitted by 旧街凉风 on 2021-01-28 11:16:42

Question: I have pipelines that copy files from on-premises to different sinks, such as on-premises and SFTP. I would like to save a list of all files that were copied in each run, for reporting. I tried using Get Metadata and For Each, but I am not sure how to save the output to a flat file or even a database table. Alternatively, is it possible to find the list of objects that were copied somewhere in the Data Factory logs? Thank you.
Answer 1: Update: Items: @activity('Get Metadata1').output.childItems If you…
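Building on the answer's Items expression, one hedged way to collect the names is an Append Variable activity inside the ForEach, accumulating into an Array pipeline variable (copiedFiles below is an illustrative name) that a later activity can write to a file or a table:

```json
{
  "name": "ForEachCopiedFile",
  "type": "ForEach",
  "typeProperties": {
    "items": { "value": "@activity('Get Metadata1').output.childItems", "type": "Expression" },
    "activities": [
      {
        "name": "AppendFileName",
        "type": "AppendVariable",
        "typeProperties": {
          "variableName": "copiedFiles",
          "value": "@item().name"
        }
      }
    ]
  }
}
```

The copiedFiles variable has to be declared as an Array variable on the pipeline for the append to work.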

Copy different types of files from an Azure lake Gen1 to an Azure Gen2 lake with attributes (like last updated)

Submitted by 余生颓废 on 2021-01-28 06:24:51

Question: I need to migrate all my data from Azure Data Lake Gen1 to Gen2. In my lake we have different types of files mixed together (.txt, .zip, .json and many others). We want to move them as-is to the Gen2 lake. Along with that, we also want to maintain the last-updated time of all files from the Gen1 lake. I was looking at using ADF for this use case, but for that we need to define a dataset, and to define a dataset we have to define the data format (Avro, JSON, XML, binary, etc.). As we have different types of data mixed, I tried…
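A Binary dataset treats every file as opaque bytes, so mixed file types can flow through a single source/sink pair. Below is a minimal sketch of such a source dataset; the linked-service name and folder path are illustrative:

```json
{
  "name": "Gen1BinarySource",
  "properties": {
    "type": "Binary",
    "linkedServiceName": {
      "referenceName": "AdlsGen1LinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureDataLakeStoreLocation",
        "folderPath": "raw"
      }
    }
  }
}
```

This only addresses the mixed-format part of the question; carrying over the last-updated time is a separate copy-settings concern not covered by this sketch.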

Enable Publishing in 'Data Factory' mode

Submitted by 蓝咒 on 2021-01-27 22:17:26

Question: I have enabled Git on my Azure Data Factory and I have also created my Git repository. When I want to create a new pipeline I get this message: "You have GIT enabled in your data factory. Publishing in 'Data Factory' mode is disabled. Please switch back to GIT mode to make further changes." When I try to choose GitHub, a popup says "You do not have access to the repository". How can I give access to the repository? Right now I have given the access below to my user:
Answer 1: I had the same issue. What I…

Pagination with oauth azure data factory

Submitted by 旧城冷巷雨未停 on 2021-01-05 07:26:50

Question: Inside Azure Data Factory I make a call to Microsoft Graph through a REST Copy activity, using REST to get an access token for the service. The Graph API returns at most 200 results, and therefore I am interested in using the pagination rules that can be defined on the source. In Postman I can see that my response structure is { "@odata.context": <some context>, "@odata.nextLink": <the link to the next page>, "value": [<the response data on the current page>] } I have read in the documentation that…
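A hedged sketch of pagination rules on the REST source that follow the nextLink value. AbsoluteUrl is the documented rule for "the next request URL is in the response body"; the bracketed JSONPath is an assumption to cope with the '@' and '.' in the property name, not the asker's final configuration:

```json
"typeProperties": {
  "source": {
    "type": "RestSource",
    "paginationRules": {
      "AbsoluteUrl": "$['@odata.nextLink']"
    }
  }
}
```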

Execute azure data factory foreach activity with start date and end date

Submitted by 半城伤御伤魂 on 2021-01-04 14:45:23

Question: I have a JSON file and it contains a start date and an end date. I need to iterate from that start date to the end date with an Azure Data Factory ForEach activity. As far as I know, ForEach expects items (a collection/array), but in my case I only have two items, the start date and the end date. I want to run the data factory to process some historic data. I don't have the collection of dates, so how can I iterate with a start date and an end date? If someone can help me figure it out, it…
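One hedged way to turn a start/end pair into something ForEach can loop over is to build an integer range of day offsets with range() and convert each offset back to a date with addDays(). The parameter names below are illustrative; 864000000000 is the number of ticks in one day:

```json
{
  "name": "ForEachDay",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@range(0, add(div(sub(ticks(pipeline().parameters.endDate), ticks(pipeline().parameters.startDate)), 864000000000), 1))",
      "type": "Expression"
    },
    "activities": []
  }
}
```

Inside the loop, @addDays(pipeline().parameters.startDate, item()) yields the date for the current iteration.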