Writing a large Spark DataFrame as Parquet to an S3 bucket

南笙 2021-01-22 01:08

My Scenario

  • I have a Spark DataFrame in an AWS Glue job with 4 million records
  • I need to write it as a SINGLE Parquet file to AWS S3 (a minimal sketch follows below)
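
In plain PySpark, the usual way to force a single output file is to collapse the DataFrame to one partition before writing. Below is a minimal sketch assuming the Glue job already holds the 4-million-row DataFrame; the `spark.range` stand-in and the `s3://my-output-bucket/single-file/` path are hypothetical placeholders, not details from the question:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("single-file-parquet").getOrCreate()

# Stand-in for the 4M-record DataFrame the Glue job already has;
# in a real Glue script this would come from the job's source
# (e.g. a DynamicFrame converted with .toDF()).
df = spark.range(4_000_000).withColumnRenamed("id", "record_id")

# coalesce(1) narrows the plan to a single partition without a full
# shuffle, so the write emits exactly one part-*.parquet file.
# All rows flow through one executor task, so that task needs enough
# memory; for ~4M records this is usually manageable.
(
    df.coalesce(1)
    .write
    .mode("overwrite")
    .parquet("s3://my-output-bucket/single-file/")  # hypothetical bucket/prefix
)
```

Two caveats worth knowing: even with `coalesce(1)`, Spark writes a directory containing one `part-00000-*.parquet` file (plus a `_SUCCESS` marker), not a bare file with a name of your choosing, so if an exact file name is required the part file has to be renamed or copied afterwards, e.g. with boto3. And `repartition(1)` is an alternative that forces a full shuffle, which costs a data move but keeps the upstream stages parallel, whereas `coalesce(1)` can collapse upstream parallelism as well.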