Spark: rdd.count() and rdd.write() are executing transformations twice

Asked by 不思量自难忘° on 2020-12-30 16:41

I am using Apache Spark to fetch records from a database and, after some transformations, write them to AWS S3. I also want to count the number of records I am writing to S3, but when I call both rdd.count() and rdd.write(), the whole chain of transformations is executed twice.
