I have two versions of Python. When I launch a Spark application using spark-submit, the application uses the default version of Python, but I want to use the other one. How can I make spark-submit use that version instead?
You can either specify the version of Python by putting the full path to your install in a shebang line at the top of your script:
myfile.py:
#!/full/path/to/specific/python2.7
or by calling it on the command line without a shebang line in your script:
/full/path/to/specific/python2.7 myfile.py
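Note that the shebang only takes effect when the script is executable and invoked directly; as a rough sketch (the chmod step is an assumption about how you run the file):

chmod +x myfile.py   # make the script executable so the shebang is honored
./myfile.py          # runs under /full/path/to/specific/python2.7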
However, I'd recommend looking into Python's excellent virtual environments, which let you create a separate, isolated environment for each version of Python. Virtual environments more or less work by taking care of the path specification for you once you activate them, allowing you to just type python myfile.py
without worrying about conflicting dependencies or knowing the full path to a specific version of Python.
There are excellent guides to getting started with virtual environments, and the official Python 3 documentation covers them as well.
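As a minimal sketch using the virtualenv tool (the env name myenv and the interpreter path are just placeholders, and this assumes virtualenv is installed):

virtualenv -p /full/path/to/specific/python2.7 myenv   # create an env bound to that interpreter
source myenv/bin/activate                              # puts the env's python first on your PATH
python myfile.py                                       # now runs the 2.7 interpreter
deactivate                                             # restores your original PATH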
If you do not have access to the nodes and you're running this using PySpark, you can specify the Python version in your spark-env.sh:
Spark_Install_Dir/conf/spark-env.sh:
export PYSPARK_PYTHON=/full/path/to/python_executable   # e.g. /full/path/to/specific/python2.7
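You can also set the same variable for a single submit instead of editing spark-env.sh; a rough sketch (the paths are placeholders, and PYSPARK_DRIVER_PYTHON is only needed if the driver should use the same interpreter):

PYSPARK_PYTHON=/full/path/to/specific/python2.7 \
PYSPARK_DRIVER_PYTHON=/full/path/to/specific/python2.7 \
spark-submit myfile.py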