Worker doesn't recognize “python3.6” even though it is installed, when starting a Spark standalone cluster with the start-all.sh script

Asked by 广开言路 on 2021-01-16 09:30

I set PYSPARK_PYTHON=python3.6. I also checked the Python versions on both the master and the worker, and both have python3.6 available as a runnable program. However, when I start the standalone cluster with the start-all.sh script, the worker still does not recognize python3.6.
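For reference, a minimal sketch of the kind of configuration involved, assuming a typical standalone setup where conf/spark-env.sh exists on every node (the interpreter path below is illustrative, not taken from the question):

    # conf/spark-env.sh -- loaded by the Spark daemons on each node
    # Point PySpark at the Python 3.6 interpreter; the path is an assumed
    # example, use the output of `which python3.6` on each machine.
    export PYSPARK_PYTHON=/usr/bin/python3.6
    # Optionally pin the driver-side interpreter as well
    export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.6

Putting the variable in conf/spark-env.sh, which the Spark start scripts source when launching each daemon, avoids relying on the workers' login-shell environment being picked up when start-all.sh starts them remotely.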
