ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition

Question


I am using

Spark version 2.2.1
Using Scala version 2.11.8
OpenJDK 64-Bit Server VM, 1.8.0_131

I have added the jar dependency using the following code:

JavaSparkContext sc = new JavaSparkContext(conf);
sc.addJar("./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar");
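For comparison, a common way to get the connector classes onto both the driver and the executors is to ship the jar at submit time rather than calling addJar after the context already exists. A sketch, assuming the jar path and main class shown in this question (the connector version in the second variant is an example coordinate, not taken from this question):

```shell
# Ship the fat jar to driver and executors at startup
spark-submit \
  --class com.test.spark.poc.CassandraSparkMockStats \
  --jars ./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar \
  ./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar

# Or let Spark resolve the connector from Maven Central instead of bundling it
spark-submit \
  --class com.test.spark.poc.CassandraSparkMockStats \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.7 \
  ./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar
```

With --jars (or --packages), the jar is distributed before any task is scheduled, whereas addJar only takes effect for jobs submitted after the call.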

Executing the code below, I am facing ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition:

Dataset<org.apache.spark.sql.Row> dataset = sparksession.read().format("org.apache.spark.sql.cassandra")
        .options(new HashMap<String, String>() {
            {
                put("keyspace", "test_db");
                put("table", "stats");
            }
        }).load().where("timestamp = '2018-01-09 00:00:00'");
System.out.println(" Result " + dataset.collectAsList().size());

Here is my pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.test.spark.poc</groupId>
    <artifactId>CassandraSparkJava</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.3.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>18.0</version>
        </dependency>
        <dependency>
            <groupId>io.reactivex</groupId>
            <artifactId>rxjava</artifactId>
            <version>1.0.0-rc.8</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
        <!-- A plugin for compiling Java code -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <archive>
                        <manifest>
                            <mainClass>com.test.spark.poc.CassandraSparkMockStats</mainClass>
                        </manifest>
                    </archive>
            </configuration>
        </plugin>

        <!-- Or, a plugin for compiling Scala code -->
        <!-- Make sure you are not using "maven-scala-plugin", which is the older version -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>             
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <!-- get all project dependencies -->
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <!-- The mainClass entry in the manifest makes an executable jar -->
                    <archive>
                      <manifest>
                        <mainClass>com.test.spark.poc.CassandraSparkMockStats</mainClass>
                      </manifest>
                    </archive>

                </configuration>
                <executions>
                  <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                  </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
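For reference, the connector version pinned above (2.0.0-M3) is a pre-release milestone of the 2.0.x line. Later GA releases exist in that line; a fragment such as the following (2.0.7 is given only as an example of a GA release, so check the connector's published compatibility matrix against Spark 2.2.1 before adopting it):

```xml
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.0.7</version>
</dependency>
```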

Please help me resolve this issue.

Source: https://stackoverflow.com/questions/49002925/classnotfoundexception-com-datastax-spark-connector-rdd-partitioner-cassandrapa
