ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
Question: I am using Spark version 2.2.1, Scala version 2.11.8, and OpenJDK 64-Bit Server VM 1.8.0_131. I have added the jar dependency in code:

    JavaSparkContext sc = new JavaSparkContext(conf);
    sc.addJar("./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar");

When executing the code below, I am facing ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition

    Dataset<org.apache.spark.sql.Row> dataset = sparksession.read().format("org.apache.spark.sql.cassandra")
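For reference, here is a minimal sketch of how a Cassandra read via org.apache.spark.sql.cassandra is typically completed, assuming spark-cassandra-connector_2.11 is available to both the driver and the executors. The keyspace "test_ks", table "test_table", and the contact point 127.0.0.1 are hypothetical placeholders, and passing the fat jar through spark.jars is shown only as an alternative way of shipping the dependency alongside sc.addJar.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class CassandraReadSketch {
        public static void main(String[] args) {
            SparkSession sparksession = SparkSession.builder()
                    .appName("CassandraSparkJava")
                    // Cassandra contact point for the connector (placeholder host).
                    .config("spark.cassandra.connection.host", "127.0.0.1")
                    // Ship the fat jar to driver and executors at session creation,
                    // an alternative to calling sc.addJar(...) after startup.
                    .config("spark.jars",
                            "./target/CassandraSparkJava-1.0-SNAPSHOT-jar-with-dependencies.jar")
                    .getOrCreate();

            // Read a Cassandra table through the connector's DataSource format.
            Dataset<Row> dataset = sparksession.read()
                    .format("org.apache.spark.sql.cassandra")
                    .option("keyspace", "test_ks")   // hypothetical keyspace
                    .option("table", "test_table")   // hypothetical table
                    .load();

            dataset.show();
        }
    }

This sketch only illustrates the usual shape of the call chain (format, keyspace/table options, load); it does not claim to reproduce the original code beyond the truncated line shown above.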