Is there any technical reason why Spark 2.3 does not work with Java 10 (as of July 2018)?
Here is the output when I run the SparkPi example using spark-submit:
Spark depends on memory APIs that were changed in JDK 9 and are no longer available from JDK 9 onward, and that is the reason for the failure.
Please check the issue:
https://issues.apache.org/jira/browse/SPARK-24421
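As a concrete illustration of the kind of change involved: sun.misc.Cleaner, which pre-Java-9 code commonly used to free direct buffers eagerly, no longer exists under that name in JDK 9+. The sketch below is illustrative only (the class name DirectBufferCleanup is made up) and is not Spark's actual code.

```java
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

// Pre-Java-9 idiom for releasing a direct buffer's native memory immediately
// instead of waiting for garbage collection.
public class DirectBufferCleanup {
    public static void main(String[] args) throws Exception {
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024);

        // On Java 8, DirectByteBuffer.cleaner() returns a sun.misc.Cleaner.
        Method cleanerMethod = buffer.getClass().getMethod("cleaner");
        cleanerMethod.setAccessible(true);
        Object cleaner = cleanerMethod.invoke(buffer);

        // Calling clean() frees the off-heap memory right away.
        Method cleanMethod = cleaner.getClass().getMethod("clean");
        cleanMethod.setAccessible(true);
        cleanMethod.invoke(cleaner);

        // On JDK 9+, sun.misc.Cleaner is gone (replaced by an internal class), and the
        // setAccessible calls above are warned about or refused outright, depending on
        // the JDK version and any --add-opens flags passed to the JVM.
    }
}
```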
The primary technical reason is that Spark depends heavily on direct access to native memory via sun.misc.Unsafe, which Java 9's module system restricts as part of encapsulating the JDK's internal APIs.
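To make that dependency concrete, here is a minimal sketch of the kind of off-heap access meant here. It is illustrative only, not Spark's actual Platform code; the class name UnsafeMemoryDemo is made up.

```java
import java.lang.reflect.Field;
import sun.misc.Unsafe;

// Typical pattern for using sun.misc.Unsafe to manage native (off-heap) memory.
public class UnsafeMemoryDemo {
    public static void main(String[] args) throws Exception {
        // Unsafe has no public constructor; the usual idiom is to read its private
        // static "theUnsafe" field reflectively.
        Field field = Unsafe.class.getDeclaredField("theUnsafe");
        field.setAccessible(true);
        Unsafe unsafe = (Unsafe) field.get(null);

        // Allocate 16 bytes outside the Java heap, write and read a long, then free it.
        long address = unsafe.allocateMemory(16);
        unsafe.putLong(address, 42L);
        System.out.println(unsafe.getLong(address)); // prints 42
        unsafe.freeMemory(address);
    }
}
```

On Java 8 this compiles with an internal-API warning; on Java 9+ the class is only reachable through the jdk.unsupported module and its use is discouraged, which is part of why the upgrade is non-trivial.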
Committer here. It's actually a fair bit of work to support Java 9+: SPARK-24417
It's also almost done and should be ready for Spark 3.0, which should run on Java 8 through 11 and beyond.
The goal (well, mine) is to make it work without opening up module access. The key issues include:
sun.misc.Unsafe usage has to be removed or worked around (one possible direction is sketched below)
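As a rough idea of what "worked around" can mean, the sketch below does the same kind of off-heap read/write through the supported java.nio.ByteBuffer API instead of Unsafe. This is only one possible direction, shown for illustration; it is not the approach actually taken in Spark, and the class name ByteBufferAlternative is made up.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Off-heap access through supported API only: a direct ByteBuffer allocates native
// memory without touching sun.misc.Unsafe.
public class ByteBufferAlternative {
    public static void main(String[] args) {
        ByteBuffer buffer = ByteBuffer.allocateDirect(16).order(ByteOrder.nativeOrder());

        buffer.putLong(0, 42L);                 // absolute write at byte offset 0
        System.out.println(buffer.getLong(0));  // prints 42
        // The native memory is released when the buffer is garbage collected.
    }
}
```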