I have a requirement to save a large amount (>100GB per day) of transactional data to an Azure Data Lake Storage Gen2 account. The data arrives as many small JSON transactions, so I was planning to batch them into larger files before writing.
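
To make the idea concrete, here is a minimal sketch of the batching I have in mind, assuming Python and the `azure-storage-file-datalake` SDK. The account URL, credential, filesystem name, batch size, and path layout are all placeholders, not a finished design:

```python
# Sketch: buffer small JSON transactions in memory and flush them to ADLS Gen2
# as larger JSON Lines files. All names below are illustrative placeholders.
import json
from datetime import datetime, timezone
from uuid import uuid4

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",  # placeholder
    credential="<account-key-or-token-credential>",                # placeholder
)
filesystem = service.get_file_system_client("transactions")        # assumed container name

BATCH_SIZE = 5000   # flush after this many transactions; would need tuning
_buffer: list[dict] = []


def add_transaction(txn: dict) -> None:
    """Accumulate one transaction; flush once the batch is large enough."""
    _buffer.append(txn)
    if len(_buffer) >= BATCH_SIZE:
        flush_batch()


def flush_batch() -> None:
    """Write the buffered transactions as a single JSON Lines file, then clear the buffer."""
    if not _buffer:
        return
    now = datetime.now(timezone.utc)
    # Date-partitioned path so downstream readers can prune by day.
    path = f"year={now:%Y}/month={now:%m}/day={now:%d}/batch-{uuid4().hex}.jsonl"
    payload = "\n".join(json.dumps(t) for t in _buffer).encode("utf-8")
    filesystem.get_file_client(path).upload_data(payload, overwrite=True)
    _buffer.clear()
```

The intent is simply to turn a stream of tiny JSON documents into a much smaller number of larger, date-partitioned files, rather than writing one file per transaction.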