Question
I have a Copy activity in my pipeline that copies files from Azure Data Lake Storage Gen2. The source location may contain thousands of files, but we need to set a limit on the number of files to be copied. Is there any option available in ADF to achieve this, barring a custom activity?
E.g.: I have 2000 files available in the data lake, but while running the pipeline I should be able to pass a parameter to copy only 500 files.
Regards, Sandeep
Answer 1:
I think you can use a Lookup activity with a ForEach loop and a Copy activity to achieve this. You will also have to use a counter variable (this makes the process slow, as you will have to copy one file at a time). The Lookup activity has a limit of 5,000 rows at this time, so you will have to keep that in mind.
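For illustration, here is a minimal sketch of what that counter check could look like inside a sequential ForEach. All names are hypothetical and not from the answer: files_copied and files_copied_tmp are String pipeline variables (the counter is kept as a String and converted with int() for the comparison), and limit is an Int pipeline parameter. A Set Variable activity cannot reference the variable it is setting, which is why the increment goes through the helper variable first:

```json
{
  "name": "IfUnderLimit",
  "type": "IfCondition",
  "description": "Sketch only. Hypothetical names: files_copied / files_copied_tmp are String variables, limit is an Int pipeline parameter. Place inside a ForEach with isSequential set to true.",
  "typeProperties": {
    "expression": {
      "value": "@less(int(variables('files_copied')), pipeline().parameters.limit)",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "CopyOneFile",
        "type": "Copy",
        "description": "Copy the single file referenced by item().name (source/sink settings omitted here)."
      },
      {
        "name": "IncrementCounterTmp",
        "type": "SetVariable",
        "description": "Set Variable cannot self-reference, so increment into a helper variable first.",
        "typeProperties": {
          "variableName": "files_copied_tmp",
          "value": {
            "value": "@string(add(int(variables('files_copied')), 1))",
            "type": "Expression"
          }
        }
      },
      {
        "name": "IncrementCounter",
        "type": "SetVariable",
        "typeProperties": {
          "variableName": "files_copied",
          "value": {
            "value": "@variables('files_copied_tmp')",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
```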
Answer 2:
I would use the Get Metadata activity to get a list of all items in your data lake: https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity
After that, you can use a ForEach step to loop through the list of files and copy them. In order to set a limit, you can create two variables/parameters: limit and files_copied. At the beginning of each iteration, check whether files_copied is less than limit; if it is, perform the copy operation and add 1 to files_copied.
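If the requirement is simply "copy the first N files", one variant of this (not spelled out in the answer) is to skip the per-file counter and slice the Get Metadata output directly with ADF's take() collection function in the ForEach items expression. A rough sketch, assuming a Get Metadata activity named GetFileList with childItems in its field list and an Int pipeline parameter named limit (both hypothetical names):

```json
{
  "name": "ForEachFile",
  "type": "ForEach",
  "description": "Sketch only. Hypothetical names: GetFileList is a Get Metadata activity returning childItems; limit is an Int pipeline parameter. take() keeps only the first 'limit' entries.",
  "typeProperties": {
    "items": {
      "value": "@take(activity('GetFileList').output.childItems, pipeline().parameters.limit)",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyOneFile",
        "type": "Copy",
        "description": "Pass @item().name to a parameterized dataset as the file name (source/sink settings omitted here)."
      }
    ]
  }
}
```

This avoids a sequential counter, so the ForEach can keep its default parallelism.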
Alternatively, after the first step you can load the names of all the files into a database table and then use Lookup and ForEach steps, just like @HimanshuSinha-msft mentioned. In the Lookup step you can use a SQL OFFSET/FETCH query in combination with your limit parameter to process only a certain number of files. That also works around the 5,000-row limit of the Lookup activity.
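A rough sketch of what that Lookup could look like, assuming a hypothetical table dbo.file_inventory with a file_name column, a dataset named FileInventoryDataset pointing at the database, and Int pipeline parameters named offset and limit (none of these names are from the answer). OFFSET/FETCH requires an ORDER BY:

```json
{
  "name": "LookupFileBatch",
  "type": "Lookup",
  "description": "Sketch only. Hypothetical names: dbo.file_inventory / file_name, dataset FileInventoryDataset, Int pipeline parameters offset and limit.",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": {
        "value": "SELECT file_name FROM dbo.file_inventory ORDER BY file_name OFFSET @{pipeline().parameters.offset} ROWS FETCH NEXT @{pipeline().parameters.limit} ROWS ONLY",
        "type": "Expression"
      }
    },
    "dataset": {
      "referenceName": "FileInventoryDataset",
      "type": "DatasetReference"
    },
    "firstRowOnly": false
  }
}
```

The downstream ForEach would then iterate @activity('LookupFileBatch').output.value and copy each returned file_name, and the offset parameter lets you process the files in batches beyond the Lookup row limit.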
Source: https://stackoverflow.com/questions/61520585/azure-data-factory-set-a-limit-to-copy-number-of-files-using-copy-activity