pyspark interpreter not found in Apache Zeppelin

Submitted by 旧巷老猫 on 2019-12-22 10:46:04

Question


I am having an issue using PySpark in an Apache Zeppelin (version 0.6.0) notebook. Running the following simple code gives me a pyspark interpreter not found error:

%pyspark
a = 1+3

Running sc.version gave me res2: String = 1.6.0, which is the version of Spark installed on my machine, and running z returns res0: org.apache.zeppelin.spark.ZeppelinContext = {}.
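For context, those checks run in a Scala (%spark) paragraph; a minimal sketch of the paragraph that produced the output above:

%spark
sc.version   // res2: String = 1.6.0
z            // res0: org.apache.zeppelin.spark.ZeppelinContext = {}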

  1. PySpark works from the CLI (using Spark 1.6.0 and Python 2.6.6).

  2. The default Python on the machine is 2.6.6; Anaconda Python 3.5 is also installed but is not set as the default.

  3. Based on this post, I updated the zeppelin-env.sh file located at /usr/hdp/current/zeppelin-server/lib/conf and added the Anaconda Python 3 path:

export PYSPARK_PYTHON=/opt/anaconda3/bin/python
export PYTHONPATH=/opt/anaconda3/bin/python
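For reference, a fuller zeppelin-env.sh for this kind of setup usually points PYTHONPATH at Spark's Python libraries rather than at the interpreter binary. A minimal sketch, assuming an HDP-style SPARK_HOME and the py4j version that ships with Spark 1.6 (both paths below are assumptions, not taken from the question):

# Sketch of zeppelin-env.sh; SPARK_HOME path and py4j version are assumptions
export SPARK_HOME=/usr/hdp/current/spark-client
export PYSPARK_PYTHON=/opt/anaconda3/bin/python
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip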

After that, I stopped and restarted Zeppelin many times using

/usr/hdp/current/zeppelin-server/lib/bin/zeppelin-daemon.sh stop
/usr/hdp/current/zeppelin-server/lib/bin/zeppelin-daemon.sh start

But I still can't get the pyspark interpreter to work in Zeppelin.


Answer 1:


If you find that pyspark is not responding, try restarting the Spark interpreter in Zeppelin; it may resolve the pyspark interpreter not found error.
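If restarting the interpreter from the Interpreter page in the Zeppelin UI does not help, restarting the whole Zeppelin daemon should have the same effect; a sketch using the HDP path from the question:

/usr/hdp/current/zeppelin-server/lib/bin/zeppelin-daemon.sh restart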



Source: https://stackoverflow.com/questions/38203177/pyspark-interpreter-not-found-in-apache-zeppelin
