Spark 2 session for Cassandra, SQL queries
Question: In Spark 2.0, what is the best way to create a Spark session for querying Cassandra? The APIs were reworked in both Spark 2.0 and the Spark Cassandra Connector, essentially deprecating SQLContext (and with it CassandraSQLContext). So to execute SQL I see two options: create a Cassandra Session (com.datastax.driver.core.Session) and call its execute("...") method, or create a SparkSession (org.apache.spark.sql.SparkSession) and call its sql(String sqlText) method. I don't know the SQL limitations of either approach; can someone explain?
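For context, here is a minimal sketch of the SparkSession route, assuming the Spark Cassandra Connector is on the classpath and a cluster is reachable at the given host; the hostname, keyspace ("ks"), and table ("users") are placeholders. Note the two options are not equivalent: the DataStax driver Session executes CQL, which has no joins and restricted WHERE clauses, while SparkSession.sql runs full Spark SQL (joins, aggregations, subqueries) over data pulled from Cassandra into Spark.

```scala
import org.apache.spark.sql.SparkSession

object CassandraSqlExample {
  def main(args: Array[String]): Unit = {
    // Build a SparkSession; spark.cassandra.connection.host tells the
    // Spark Cassandra Connector which cluster to contact (host is a placeholder).
    val spark = SparkSession.builder()
      .appName("cassandra-sql-example")
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Register a Cassandra table as a temporary view so Spark SQL can query it.
    spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "ks", "table" -> "users"))
      .load()
      .createOrReplaceTempView("users")

    // Full Spark SQL is available here, including constructs CQL cannot
    // express, such as GROUP BY over arbitrary columns.
    spark.sql("SELECT name, COUNT(*) AS n FROM users GROUP BY name").show()

    spark.stop()
  }
}
```

The trade-off is where the work runs: the driver's execute pushes a CQL statement to the Cassandra nodes (fast, but limited to CQL semantics), whereas spark.sql executes on the Spark cluster and may scan and shuffle table data, so it is more expressive but potentially much heavier.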