Getting BusyPoolException in com.datastax.spark.connector.writer.QueryExecutor: what am I doing wrong?
Question: I am using spark-sql-2.4.1 and spark-cassandra-connector_2.11-2.4.1 with Java 8 and Apache Cassandra 3.0. My spark-submit / Spark cluster environment for loading 2 billion records is:

    --executor-cores 3 --executor-memory 9g --num-executors 5 --driver-cores 2 --driver-memory 4g

I am using the following configuration:

    cassandra.concurrent.writes=1500
    cassandra.output.batch.size.rows=10
    cassandra.output.batch.size.bytes=2048
    cassandra.output.batch.grouping.key=partition
    cassandra.output
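For context, spark-cassandra-connector write options are documented with the full `spark.cassandra.output.` prefix, so a sketch of how such a job is typically submitted might look like the following (the driver class and jar name are hypothetical placeholders, not from the question):

```shell
# Sketch of a spark-submit invocation with the connector's documented
# write-tuning properties (names per the spark-cassandra-connector 2.4 docs).
# com.example.LoadJob and load-job.jar are hypothetical.
spark-submit \
  --class com.example.LoadJob \
  --executor-cores 3 \
  --executor-memory 9g \
  --num-executors 5 \
  --driver-cores 2 \
  --driver-memory 4g \
  --conf spark.cassandra.output.concurrent.writes=1500 \
  --conf spark.cassandra.output.batch.size.rows=10 \
  --conf spark.cassandra.output.batch.size.bytes=2048 \
  --conf spark.cassandra.output.batch.grouping.key=partition \
  load-job.jar
```

Note that the keys shown in the question omit that prefix; whether the unprefixed forms are being picked up at all is worth verifying, since `concurrent.writes=1500` is far above the connector's default and a likely factor in a `BusyPoolException` if it is taking effect.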