I want to run a Spark Streaming application on a YARN cluster on a remote server. The default Java version is 1.7, but I want to use 1.8 for my application, which is also installed on the cluster.
Although you can force the driver code to run on a particular Java version (`export JAVA_HOME=/path/to/jre/ && spark-submit ...`), the workers will execute the code with the default Java version from the `yarn` user's `PATH` on each worker machine.
What you can do is set each Spark instance to use a particular `JAVA_HOME` by editing the `spark-env.sh` file on every node (documentation).
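A minimal sketch of what that edit looks like; the JDK path below is an assumption, so point it at wherever Java 8 actually lives on your nodes:

```shell
# conf/spark-env.sh on each worker node:
# make Spark daemons and executors pick up Java 8 instead of the default 1.7.
export JAVA_HOME=/usr/lib/jvm/java-1.8.0   # hypothetical path -- adjust to your install
export PATH="$JAVA_HOME/bin:$PATH"
```

Alternatively, on YARN you can pass the environment to the executors and the application master per submission, without touching every node's `spark-env.sh`, via the `spark.executorEnv.*` and `spark.yarn.appMasterEnv.*` configuration properties, e.g. `spark-submit --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0 --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-1.8.0 ...` (again assuming that path).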