Can we create two parquet files using Spark SQL (DataFrame)?

醉话见心 2021-02-12 02:40

I am loading a table from Netezza into Hive. The table is huge, and the load produces a Parquet file larger than 1 GB.

So is there a way that, while loading the table, I can split the output into two (or more) smaller Parquet files?
