What is the data size limit of DBFS in Azure Databricks?
Question

I read here that the storage limit on AWS Databricks is 5 TB per individual file, and that we can store as many files as we want. Does the same limit apply to Azure Databricks, or is there some other limit on Azure Databricks?

Update: @CHEEKATLAPRADEEP Thanks for the explanation, but can someone please share the reason behind: "we recommend that you store data in mounted object storage rather than in the DBFS root"? I need to use DirectQuery in Power BI (because of the huge data size) together with ADLS.
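For context on what "mounted object storage" means here: a minimal sketch of mounting an ADLS Gen2 container into DBFS with `dbutils.fs.mount` (runs only inside a Databricks notebook). All angle-bracket values, the secret scope/key names, and the `/mnt/adls` mount point are placeholders you would replace with your own; the OAuth config keys follow the documented ABFS client-credentials pattern.

```python
# Sketch: mount an ADLS Gen2 container so data lives in your own storage
# account rather than the DBFS root. Databricks-only (dbutils is provided
# by the notebook runtime); all <...> values are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    # Read the service-principal secret from a Databricks secret scope,
    # never hard-code it in the notebook.
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope-name>", key="<key-name>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/adls",   # hypothetical mount point for this example
    extra_configs=configs,
)
```

Once mounted, paths like `/mnt/adls/...` behave like DBFS paths, but the data physically resides in your ADLS account, where you control access, lifecycle, and direct connectivity from tools such as Power BI.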