How to add spark dependencies in spring-boot multi module Java 11 project

Submitted by ♀尐吖头ヾ on 2020-01-25 05:43:05

Question


Whenever I add a module-info.java to my multi-module project, I can no longer import my Spark dependencies; everything else seems to work:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>

IntelliJ tries to re-add the Maven dependency, without any result.

My module-info looks like:

module common {
    exports [...] 
    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}

It is not possible to add a requires for org.apache.* in my module-info.java either.

Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

Thanks


Answer 1:


Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

It does hold true for Spark. Two immediate reasons I can vouch for:

  1. They do not have an entry for

    Automatic-Module-Name: <module-name> 
    

    in the artifact's MANIFEST.MF file.

  2. If you try describing their artifacts using the jar tool

    jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar
    

    This fails to derive a module descriptor, for a reason similar to the one described in this answer.
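For contrast, a library opts in to a stable automatic module name by declaring that manifest entry in its build. A minimal maven-jar-plugin sketch, in the same style as the POM snippets above (the module name `com.example.common` is a placeholder for one of your own modules, not anything Spark ships):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <archive>
            <manifestEntries>
                <!-- placeholder name; pick a reverse-DNS name you control -->
                <Automatic-Module-Name>com.example.common</Automatic-Module-Name>
            </manifestEntries>
        </archive>
    </configuration>
</plugin>
```

With this entry present, consumers on the module path can write `requires com.example.common;` even before the library ships a real module-info.class.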


A few resources that might be useful once you reach this point:

  • The reason why deriving automatic module name fails for spark artifacts
  • A way to update a jar manually with the MANIFEST entry
  • Spark's progress to Build and Run on JDK-11


Source: https://stackoverflow.com/questions/59844195/how-to-add-spark-dependencies-in-spring-boot-multi-module-java-11-project
