I'm running this snippet to sort an RDD of points, ordering the RDD and taking the K nearest points to a given point:
def getKNN(sparkContext:SparkContext, k:
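The snippet is cut off above, but a minimal sketch of the sort-and-take approach it describes could look like the following. Everything beyond the truncated signature is an assumption: the Point case class, the points and query parameters, and the Euclidean distance metric are illustrative, not from the original post.

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical Point type; the real one is not shown in the question.
case class Point(x: Double, y: Double)

def getKNN(sparkContext: SparkContext, k: Int,
           points: RDD[Point], query: Point): Array[Point] =
  // sparkContext is only safe to use here on the driver, never inside
  // the sortBy closure below, which is shipped to the worker JVMs.
  points
    .sortBy(p => math.hypot(p.x - query.x, p.y - query.y))
    .take(k)

Note that sortBy shuffles and sorts the entire RDD just to keep k elements; points.takeOrdered(k)(Ordering.by(distance)) computes the same top-k with a per-partition selection and is usually much cheaper.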
I was also facing the same issue. After a lot of googling I found the cause: I had made a singleton class for SparkContext initialization, which guarantees a single instance only within a single JVM. In Spark, however, that singleton is also invoked from each worker node, and each worker runs in its own JVM, so every worker ended up constructing its own SparkContext object.
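To make the failure mode concrete, here is a hedged sketch; the object and value names are illustrative, not from the original. A Scala object is not serialized with a closure: each worker JVM loads its own copy of the object on first use, so a lazy SparkContext inside it gets re-initialized on every worker.

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative singleton wrapper; the name is an assumption.
object SparkContextHolder {
  // One instance per JVM. On the driver that is what you want; on a
  // worker JVM the lazy val runs again and builds another SparkContext.
  lazy val sc: SparkContext =
    new SparkContext(new SparkConf().setAppName("knn-example"))
}

val sc = SparkContextHolder.sc        // fine: driver-side only
val points = sc.parallelize(1 to 10)

// Wrong: the closure references the singleton, so each worker JVM loads
// SparkContextHolder and initializes its own SparkContext.
// points.map(n => SparkContextHolder.sc.defaultParallelism + n)

// Right: keep SparkContext usage on the driver and pass plain values in.
val parallelism = sc.defaultParallelism
val shifted = points.map(n => n + parallelism) // closure holds only an Int

The fix, then, is to create the SparkContext exactly once in the driver and make sure no transformation closure touches the singleton that wraps it.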