Why does Spark Cassandra Connector fail with NoHostAvailableException?

Asked by 夕颜 on 2021-01-19 04:32

I am having problems getting Spark Cassandra Connector working in Scala.

I'm using these versions:

  • Scala 2.10.4
  • spark-core 1.0.2
  • cas
1 Answer
  • 2021-01-19 05:10

    "local" in this context specifies the Spark master (telling Spark to run in local mode); it is not the Cassandra connection host.

    To set the Cassandra connection host, you have to set a different property in the Spark config:

    import org.apache.spark._
    
    val conf = new SparkConf(true)
            .set("spark.cassandra.connection.host", "IP Cassandra Is Listening On")
            .set("spark.cassandra.username", "cassandra") //Optional            
            .set("spark.cassandra.password", "cassandra") //Optional
    
    val sc = new SparkContext("spark://Spark Master IP:7077", "test", conf)
    

    https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md
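    Once connected, a minimal sketch of actually reading data through the connector might look like the following. The keyspace `test` and table `kv` are assumed to exist, and the matching spark-cassandra-connector artifact must be on the classpath; this is illustrative, not tied to the asker's schema:

    ```scala
    // The import enriches SparkContext with cassandraTable()
    import com.datastax.spark.connector._

    // Assumes a keyspace "test" containing a table "kv" already exists in Cassandra
    val rdd = sc.cassandraTable("test", "kv")

    // Each row comes back as a CassandraRow; count() forces the read,
    // which is where a NoHostAvailableException would surface if the
    // connection host is still wrong
    println(rdd.count)
    println(rdd.first)
    ```

    If the host property is missing or wrong, the failure typically appears only here, when the first action runs, not when the SparkContext is created.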
