azure-data-factory

Pagination with OAuth in Azure Data Factory

旧城冷巷雨未停 submitted on 2021-01-05 07:26:50
Question: Inside Azure Data Factory I make a call to Microsoft Graph through a REST copy activity, using REST to get an access token to the service. The Graph API returns at most 200 results per page, so I am interested in using the pagination rules that can be configured in the source. In Postman I can see that my response structure is { "@odata.context": <some context>, "@odata.nextLink": <the link to next page>, "value": [<the response data on the current page>] } I have read in the documentation that…
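The documentation being referred to is the REST connector's pagination support. A minimal sketch, not taken from the thread, of how the pagination rule might look in the copy activity's REST source, given the nextLink structure shown above:

"source": {
    "type": "RestSource",
    "paginationRules": {
        "AbsoluteUrl": "$['@odata.nextLink']"
    }
}

With AbsoluteUrl pointing at the @odata.nextLink property of each response body, the copy activity keeps requesting pages until a response no longer contains that property; the bracketed JSONPath form for the @-prefixed property name is an assumption here.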

Execute Azure Data Factory ForEach activity with start date and end date

半城伤御伤魂 submitted on 2021-01-04 14:45:23
Question: I have a JSON file that contains a start date and an end date. I need to iterate over this start date and end date with the Azure Data Factory ForEach activity. As far as I know, ForEach expects items (a collection/array), but in my case I only have two items, the start and end date. I want to run the data factory to process some historic data. I don't have a collection of dates, so how can I iterate between the start date and the end date? If someone can help me figure it out, it…
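One way to turn the two dates into something ForEach can iterate, sketched here rather than taken from the thread, is to build an index range whose length is the number of days between the dates and derive each date inside the loop. The parameter names startDate and endDate are assumptions:

"items": {
    "value": "@range(0, add(div(sub(ticks(pipeline().parameters.endDate), ticks(pipeline().parameters.startDate)), 864000000000), 1))",
    "type": "Expression"
}

Here 864000000000 is the number of ticks in one day, so the expression produces one index per day, inclusive of both endpoints. Inside the loop the date for the current iteration would then be @addDays(pipeline().parameters.startDate, item(), 'yyyy-MM-dd').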

Azure Data Factory Get Metadata to get blob filenames and transfer them to Azure SQL database table

心不动则不痛 submitted on 2020-12-15 08:34:51
Question: I am trying to use the Get Metadata activity in Azure Data Factory to get blob filenames and copy them to an Azure SQL database table. I am following this tutorial: https://www.mssqltips.com/sqlservertip/6246/azure-data-factory-get-metadata-example/ Here is my pipeline: the Copy Data > Source is the location of the blob files in my Blob storage. I need to specify my source files as binary because they are *.jpeg files. For my Copy Data > Sink, it's the Azure SQL database, and I enable the option…
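The pattern the tutorial describes is a Get Metadata activity that lists the folder's child items, followed by a loop that writes each name to SQL. A rough sketch of the Get Metadata part, with BlobFolderDataset as a placeholder dataset name:

{
    "name": "GetBlobFileNames",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": { "referenceName": "BlobFolderDataset", "type": "DatasetReference" },
        "fieldList": [ "childItems" ]
    }
}

A ForEach can then iterate @activity('GetBlobFileNames').output.childItems, and inside the loop each filename is available as @item().name, for example as a parameter to a stored procedure that inserts the row into the SQL table.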

How to remove escape characters from a JSON string array?

橙三吉。 submitted on 2020-12-15 05:16:30
Question: In Azure Data Factory I have a Lookup activity that reads JSON data from a SQL DB (more than one row) and brings it into the ADF pipeline. The Lookup activity output contains escape characters, for example: {\"resourceType\": \"counter\", \"id\": \"9i5W6tp-JTd-24252\"… How do I remove the escape characters? Any help is appreciated. Answer 1: Since your query result is a JSON string array, we need to do a bit more to remove the escape characters. Here are my steps: first, we can define two array-type…
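The answer's steps are cut off above; as a general note (not the thread's solution), the backslashes are usually just how the Lookup output renders a JSON string held in a SQL column, and ADF's json() expression function parses the string back into an object. A sketch, with Lookup1 and the column name jsonColumn as assumed names:

@json(activity('Lookup1').output.firstRow.jsonColumn)
@json(activity('Lookup1').output.firstRow.jsonColumn).resourceType

For the multi-row case, the same expression can be applied per item inside a ForEach over @activity('Lookup1').output.value.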

How do I store run-time data in Azure Data Factory between pipeline executions?

北慕城南 submitted on 2020-12-15 03:38:44
Question: I have been following Microsoft's tutorial on incrementally/delta loading data from a SQL Server database. It uses a watermark (timestamp) to keep track of rows changed since the last run. The tutorial stores the watermark in an Azure SQL database using the Stored Procedure activity in the pipeline so it can be reused in the next execution. It seems overkill to have an Azure SQL database just to store that tiny bit of meta information (my source database is read-only, by the way). I'd rather just store…
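The question is truncated, but one commonly used alternative to a watermark table, offered here as an assumption rather than the thread's answer, is a small JSON file in Blob Storage: read it with a Lookup activity at the start of the run and overwrite it at the end. A sketch of the read side, with WatermarkJsonBlob as a placeholder dataset over that blob file:

{
    "name": "ReadWatermark",
    "type": "Lookup",
    "typeProperties": {
        "source": { "type": "JsonSource" },
        "dataset": { "referenceName": "WatermarkJsonBlob", "type": "DatasetReference" },
        "firstRowOnly": true
    }
}

Downstream activities can then reference @activity('ReadWatermark').output.firstRow.watermark, and a Copy activity at the end of the pipeline writes the new watermark value back to the same blob.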

Azure Data Factory - filter MongoDB source dataset by date

萝らか妹 submitted on 2020-12-12 07:19:31
Question: This scenario is pretty straightforward: as described in the ADF v2 docs and samples, I've created a copy pipeline to get the data from a MongoDB collection and write it to an Azure SQL database. The full collection data is successfully transferred and all the mappings are set correctly. The problem starts when I try to filter the source dataset to get only the last n days from MongoDB. I've tried several queries and cross-checked with MongoDB Compass to see if they're actually executed on the Mongo side, which…
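A sketch of how a date filter can be pushed to the MongoDB source of the copy activity; the field name modifiedAt and the cutoff value are illustrative, and the $date form is used on the assumption (not the thread's conclusion) that the filter string is parsed as MongoDB extended JSON rather than shell syntax, which is a frequently reported reason ISODate()-style queries appear to be ignored:

"source": {
    "type": "MongoDbV2Source",
    "filter": "{ \"modifiedAt\": { \"$gte\": { \"$date\": \"2020-12-01T00:00:00Z\" } } }"
}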

How to transform XML data using a Data Factory pipeline

不打扰是莪最后的温柔 submitted on 2020-11-30 12:07:27
Question: How do we save data inside an XML payload to blob storage? Input: <root> <alexIsAwesome>yes he is</alexIsAwesome> <bytes>sdfsdfjijOIJOISJDFQPWORPJkjsdlfkjlksdf==</bytes> </root> Desired result: <root> <alexIsAwesome>yes he is</alexIsAwesome> <bytes>/blob/path/toSavedPayload</bytes> </root> Save the bytes somewhere in blob storage and replace the bytes node with the URI of where they were saved. How do we use Data Factory to extract a node from XML and save it to blob? Answer 1: Currently, ADF doesn't support XML natively.…
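Since the answer notes that ADF (at the time) has no native XML support, one workaround often suggested, sketched here as an assumption rather than the thread's solution, is to hand the payload to an Azure Function that parses the XML, writes the bytes to blob storage, and returns the rewritten document. The function name, linked service, and pipeline parameter below are placeholders:

{
    "name": "OffloadXmlBytes",
    "type": "AzureFunctionActivity",
    "linkedServiceName": { "referenceName": "AzureFunctionLinkedService", "type": "LinkedServiceReference" },
    "typeProperties": {
        "functionName": "ParseXmlAndOffloadBytes",
        "method": "POST",
        "body": { "value": "@pipeline().parameters.xmlPayload", "type": "Expression" }
    }
}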
