In Apache Spark, is it possible to specify a partition's preferred location for a shuffled RDD or a cogrouped RDD?

太阳男子 2021-01-12 05:00

As of Spark 1.6+, the only API that supports customizing partition locations is the one used when the RDD is created:

  /** Distribute a local Scala collection to form an RDD, with one or more
    * location preferences (Spark node hostnames) for each object.
    * Create a new partition for each collection item. */
  def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T]
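For context, here is a minimal sketch of how that creation-time API is typically used. The local master and the hostnames host-a / host-b are placeholder assumptions for illustration, not values from the question:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setAppName("preferred-locations-demo").setMaster("local[2]"))

    // Pair each element with the hosts it should preferably be computed on.
    // makeRDD creates one partition per element and records those hosts as
    // that partition's preferred locations.
    val data: Seq[(Int, Seq[String])] = Seq(
      (1, Seq("host-a")),
      (2, Seq("host-b")))

    val rdd = sc.makeRDD(data)

    // preferredLocations reports the hosts recorded for each partition;
    // the scheduler uses them only as locality hints when launching tasks.
    rdd.partitions.foreach(p => println(p.index + " -> " + rdd.preferredLocations(p)))

    sc.stop()

Note that the hostnames are just locality hints: the scheduler may ignore them, and on a local master they have no practical effect. The question is whether anything comparable exists for an RDD produced by a shuffle or a cogroup, where this creation-time hook is not available.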