How to use custom classes with Apache Spark (pyspark)?


Probably the simplest solution is to use the pyFiles argument when you create the SparkContext:

from pyspark import SparkContext

# master (e.g. "local[*]" or a cluster URL) and app_name are placeholders
sc = SparkContext(master, app_name, pyFiles=['/path/to/BoTree.py'])

Every file placed there will be shipped to workers and added to PYTHONPATH.
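
For instance, with the context above, a function that runs on the workers can import the shipped module directly. A minimal sketch (BoTree is assumed to be the module name of the shipped file):

def classify(record):
    import BoTree  # resolves on the workers because BoTree.py was shipped via pyFiles
    return BoTree.__name__, record

print(sc.parallelize(range(3)).map(classify).collect())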

If you're working in interactive mode you have to stop the existing context with sc.stop() before you create a new one.
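
In the PySpark shell that looks roughly like this (sc is the context the shell created for you; master and app_name are placeholders as above):

from pyspark import SparkContext

sc.stop()   # stop the context that already exists in the session
sc = SparkContext(master, app_name, pyFiles=['/path/to/BoTree.py'])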

Also make sure that the Spark workers are actually using the Anaconda distribution and not the default Python interpreter; based on your description, that is most likely the problem. To set PYSPARK_PYTHON you can use the conf/spark-env.sh file.
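
If editing conf/spark-env.sh is not an option, a rough sketch of an alternative (my assumption, not part of the original answer): PYSPARK_PYTHON is read from the environment when the Python workers are launched, so exporting it from the driver before the context is created usually has the same effect.

import os

# /opt/anaconda/bin/python is a hypothetical path; point it at the Anaconda
# interpreter (which must exist under the same path on every worker node),
# and do this before the SparkContext is created
os.environ["PYSPARK_PYTHON"] = "/opt/anaconda/bin/python"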

On a side note, copying the file to lib is a rather messy solution. If you want to avoid pushing files with pyFiles, I would recommend creating either a plain Python package or a Conda package and doing a proper installation. This way you can easily keep track of what is installed, remove unnecessary packages, and avoid some hard-to-debug problems.
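
For a single module like BoTree.py, the plain-package route can be as small as the following sketch (all names here are assumptions based on the file name):

# setup.py
from setuptools import setup

setup(
    name="botree",
    version="0.1.0",
    py_modules=["BoTree"],   # installs BoTree.py as a top-level importable module
)

Installed into the Anaconda environment that the workers use (for example with pip install .), import BoTree then works everywhere without shipping the file at all.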

Once the SparkContext has been created, you can also use addPyFile to ship a module to each worker after the fact:

sc.addPyFile('/path/to/BoTree.py')

See the pyspark.SparkContext.addPyFile(path) documentation for details.
