mapToPair() in Spark Java job

Backend · Open · 0 answers · 1105 views
礼貌的吻别 2021-02-03 13:54

Our aim is to obtain a pair RDD so that we can apply the reduceByKey() method to aggregate the data separately for each key. Why do we need to call a special version of Spark's function, mapToPair(), instead of the ordinary map()?
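The reason is that the Java RDD API is statically typed: map() always returns a JavaRDD&lt;T&gt;, even when T is a tuple, and reduceByKey() is only defined on JavaPairRDD&lt;K, V&gt;. mapToPair() is the call that produces that pair type. A minimal word-count sketch illustrating this (class name, input data, and local master are illustrative assumptions):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class PairRddExample {

    // Counts words across the given lines using mapToPair() + reduceByKey().
    static Map<String, Integer> wordCounts(JavaSparkContext sc, List<String> lines) {
        JavaRDD<String> words = sc.parallelize(lines)
                .flatMap(line -> Arrays.asList(line.split(" ")).iterator());

        // map() here would yield JavaRDD<Tuple2<String, Integer>>, which does
        // not expose reduceByKey(); mapToPair() yields JavaPairRDD<String, Integer>,
        // the type on which the by-key aggregations are defined.
        JavaPairRDD<String, Integer> pairs =
                words.mapToPair(word -> new Tuple2<>(word, 1));

        return pairs.reduceByKey(Integer::sum).collectAsMap();
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("PairRddExample")
                .setMaster("local[*]"); // local mode, just for a quick demo
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            System.out.println(wordCounts(sc, Arrays.asList("a b", "b b")));
        }
    }
}
```

Without the pair type, the compiler has no way to know which element of each record is the key, so reduceByKey() could not be offered on a plain JavaRDD.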
