Transform data in Azure Data Factory using Python Databricks
I have the task of transforming and consolidating millions of single JSON files into big CSV files. The operation would be very simple using a copy activity and mapping the schemas; I have already tested this. The problem is that a massive number of the files have badly formatted JSON. I know what the error is, and the fix is very simple too. I figured I could use a Python Databricks activity to fix the strings and then pass the output to a copy activity that consolidates the records into a big CSV file.
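The question doesn't say what the actual JSON error is, so as a minimal sketch only: assuming a common malformation where records are concatenated back-to-back with no separators, a notebook cell run by the Databricks activity might repair each file like this (the `fix_json` helper and the specific repair are hypothetical, not the asker's actual fix):

```python
import json


def fix_json(text: str) -> str:
    """Repair a malformed JSON string.

    Hypothetical repair: assumes objects were concatenated with no
    separator ("{...}{...}"); inserts commas and wraps the result in
    an array so it parses as valid JSON. Replace with whatever fix
    your files actually need.
    """
    return "[" + text.replace("}{", "},{") + "]"


def repair(text: str) -> list:
    """Parse a malformed JSON string into a list of records."""
    return json.loads(fix_json(text))
```

After repairing, the notebook would write the cleaned files to a staging location (e.g. Blob Storage or ADLS) that the downstream copy activity reads from to produce the consolidated CSV.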