Why does Apache Spark not work with Java 10? We get an illegal reflective access warning, then java.lang.IllegalArgumentException

梦如初夏 2021-02-05 09:51

Is there any technical reason why Spark 2.3 does not work with Java 10 (as of July 2018)?

Here is the output when I run the SparkPi example using spark-submit:

3 Answers
  • 2021-02-05 10:26

    Spark depends on memory APIs that were changed in JDK 9, so they are no longer accessible starting with JDK 9.

    That is the reason for this failure.

    Please check the issue:

    https://issues.apache.org/jira/browse/SPARK-24421
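    The "illegal reflective access" message in the question's title is a symptom of exactly this: on JDK 9+, reflecting into `java.base` internals, which Spark's memory-management code does, is warned about or blocked outright. A minimal sketch (my own illustration, not Spark's actual code) that pokes at the private `address` field of a direct buffer; the exact behavior depends on the JDK version:

    ```java
    import java.lang.reflect.Field;
    import java.nio.ByteBuffer;

    public class ReflectiveAccessDemo {
        public static void main(String[] args) {
            ByteBuffer buf = ByteBuffer.allocateDirect(16);
            try {
                // "address" is a private field of java.nio.Buffer in java.base.
                // JDK 8: just works. JDK 9-16: works but prints the
                // "illegal reflective access" warning. JDK 17+: throws
                // InaccessibleObjectException unless --add-opens is passed.
                Field addr = java.nio.Buffer.class.getDeclaredField("address");
                addr.setAccessible(true);
                System.out.println("direct buffer address: " + addr.getLong(buf));
            } catch (RuntimeException | NoSuchFieldException e) {
                System.out.println("blocked: " + e.getClass().getSimpleName());
            }
        }
    }
    ```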

  • 2021-02-05 10:33

    The primary technical reason is that Spark depends heavily on direct access to native memory via sun.misc.Unsafe, which was encapsulated in Java 9.

    • https://issues.apache.org/jira/browse/SPARK-24421
    • http://apache-spark-developers-list.1001551.n3.nabble.com/Java-9-td20875.html
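    As an illustration (not Spark's actual source), the canonical way libraries like Spark obtain and use sun.misc.Unsafe looks roughly like this. The class's constructor is private, so the usual trick is to read the static `theUnsafe` field reflectively; on JDK 9+ the class survives only in the `jdk.unsupported` module:

    ```java
    import java.lang.reflect.Field;
    import sun.misc.Unsafe;

    public class UnsafeDemo {
        public static void main(String[] args) throws Exception {
            // Reflectively grab the singleton Unsafe instance.
            Field f = Unsafe.class.getDeclaredField("theUnsafe");
            f.setAccessible(true);
            Unsafe unsafe = (Unsafe) f.get(null);

            // Allocate 8 bytes of off-heap memory, write and read a long,
            // then free it -- the kind of raw access Spark's Tungsten
            // memory manager relies on.
            long addr = unsafe.allocateMemory(8);
            unsafe.putLong(addr, 42L);
            System.out.println(unsafe.getLong(addr)); // 42
            unsafe.freeMemory(addr);
        }
    }
    ```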
  • 2021-02-05 10:35

    Committer here. It's actually a fair bit of work to support Java 9+: SPARK-24417

    It's also almost done and should be ready for Spark 3.0, which should run on Java 8 through 11 and beyond.

    The goal (well, mine) is to make it work without opening up module access. The key issues include:

    • sun.misc.Unsafe usage has to be removed or worked around
    • Changes to the structure of the boot classloader
    • Scala support for Java 9+
    • A bunch of dependency updates to work with Java 9+
    • JAXB is no longer automatically available
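    On the last point, here is a quick way to see whether JAXB is present (this probe is my own illustration, not Spark code). JDK 8 bundles javax.xml.bind in the JDK itself, while JDK 11 removed it, so on modern JDKs it has to come from an explicit dependency such as the jaxb-api jar:

    ```java
    public class JaxbCheck {
        public static void main(String[] args) {
            try {
                // Present on JDK 8, or when a JAXB jar is on the classpath.
                Class.forName("javax.xml.bind.JAXBContext");
                System.out.println("JAXB available");
            } catch (ClassNotFoundException e) {
                // Typical on JDK 11+ without an explicit JAXB dependency.
                System.out.println("JAXB missing");
            }
        }
    }
    ```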