spark-shell

“Task not serializable” with java.time in spark-shell (or Zeppelin) but not in spark-submit

̄綄美尐妖づ Submitted on 2020-12-15 08:59:16
Question: Weirdly, I have found several times that there is a difference between running with spark-submit and running with spark-shell (or Zeppelin), though at first I didn't believe it. With some code, spark-shell (or Zeppelin) throws this exception while spark-submit works fine: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:345) at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner
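A common cause of this REPL-only failure (a hedged sketch, not necessarily this asker's exact code): spark-shell wraps every input line in a generated wrapper object, so a `java.time` formatter defined at the top level drags the whole non-serializable REPL line object into any closure that references it. Copying the value into a local `val` keeps the closure small. The `DateParsing` name below is hypothetical.

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter

// Hypothetical sketch: in spark-shell, a top-level val lives on the REPL's
// generated line object; a closure referencing it captures that whole object,
// which may not be serializable. A local copy avoids the capture.
object DateParsing {
  val fmt: DateTimeFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd")

  def parseAll(dates: Seq[String]): Seq[LocalDate] = {
    val localFmt = fmt // the lambda below captures only this local val
    dates.map(s => LocalDate.parse(s, localFmt))
  }
}
```

In a real job the `map` would run over an RDD or Dataset, but the same local-val trick applies; alternatively, move the definitions into a top-level `object` compiled with spark-submit, which is why the submitted jar works while the shell fails.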

Ignoring non-spark config property: hive.exec.dynamic.partition.mode

人盡茶涼 Submitted on 2020-07-06 10:59:30
Question: How can I run spark-shell with hive.exec.dynamic.partition.mode=nonstrict? I tried (as suggested here) export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf but got the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict". PS: using Spark version 2.2.0.2.6.4.0-91, Scala version 2.11.8. NOTE: the need arose after an error on df.write.mode("overwrite").insertInto("db.partitionedTable
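Spark only forwards configuration properties whose names start with `spark.` to its own configuration, which is exactly what the warning says; Hadoop/Hive settings need the `spark.hadoop.` prefix, or can be set from inside the session. A hedged sketch of both routes:

```shell
# Route 1: prefix the Hive property so Spark passes it through to the
# underlying Hadoop configuration instead of ignoring it.
spark-shell --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict"

# Route 2: set it from inside the running session instead:
#   scala> spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
```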

Execute the scala script through spark-shell in silent mode

半腔热情 Submitted on 2020-02-28 13:54:18
Question: I need to execute a Scala script through spark-shell in silent mode. When I use spark-shell -i "file.scala", after the script runs I am dropped into the interactive scala> prompt, and I don't want that. How do I execute the script in silent mode? Updating (October 2019) for a script that terminates This
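Two commonly used workarounds (a sketch; `file.scala` stands for the asker's script): end the script with `System.exit(0)` so the REPL terminates after `-i` runs it, or feed the script on stdin so the shell exits at end-of-input instead of prompting.

```shell
# Option 1: run the script, then exit instead of dropping to scala>.
# (file.scala ends with the line: System.exit(0))
spark-shell -i file.scala

# Option 2: feed the script on stdin; the REPL quits at end-of-input.
spark-shell < file.scala
```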

Parsing data in Apache Spark (Scala): org.apache.spark.SparkException: Task not serializable when trying to use textinputformat.record.delimiter

只谈情不闲聊 Submitted on 2019-12-25 03:28:08
Question: Input file:

___DATE___
2018-11-16T06:3937
Linux hortonworks 3.10.0-514.26.2.el7.x86_64 #1 SMP Fri Jun 30 05:26:04 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
06:39:37 up 100 days, 1:04, 2 users, load average: 9.01, 8.30, 8.48
06:30:01 AM all 6.08 0.00 2.83 0.04 0.00 91.06
___DATE___
2018-11-16T06:4037
Linux cloudera 3.10.0-514.26.2.el7.x86_64 #1 SMP Fri Jun 30 05:26:04 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
06:40:37 up 100 days, 1:05, 28 users, load average: 8.39, 8.26, 8.45
06:40:01 AM all 6.92
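Each record in a dump like this starts at a `___DATE___` marker. A frequent trigger of "Task not serializable" here is creating a Hadoop `Configuration` locally and letting a closure capture it; using `sc.hadoopConfiguration` avoids that. A hedged sketch, with the Spark calls in comments and the record-split logic itself in plain Scala (the `RecordSplit` name is hypothetical):

```scala
// In Spark (sketch, assuming the usual Hadoop imports):
//   sc.hadoopConfiguration.set("textinputformat.record.delimiter", "___DATE___")
//   val records = sc.newAPIHadoopFile(path, classOf[TextInputFormat],
//       classOf[LongWritable], classOf[Text]).map(_._2.toString)
// The plain-Scala equivalent of the record split the delimiter performs:
object RecordSplit {
  def split(dump: String): List[String] =
    dump.split("___DATE___").map(_.trim).filter(_.nonEmpty).toList
}
```

Setting the delimiter on `sc.hadoopConfiguration` (rather than a fresh `Configuration` captured by a closure) is the usual way to sidestep the serialization error.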

Apache Spark method not found sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner;

百般思念 Submitted on 2019-12-25 00:21:19
Question: I encounter this problem while running an automated data-processing script in spark-shell. The first couple of iterations work fine, but sooner or later it always hits this error. I googled the issue but haven't found an exact match; other similar issues are outside a Spark context. I guess it may have something to do with the JVM version, but I cannot figure out how to solve it. I used 2 machines in a Spark standalone cluster. Machine No.1 Java information: java 10.0.2 2018-07
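`sun.misc.Cleaner` was removed in Java 9, so the `sun.nio.ch.DirectBuffer.cleaner()` lookup that older Spark builds perform fails on the Java 10 runtime shown above. Spark 2.x targets Java 8, so the usual fix is pinning the JVM. A hedged sketch (the JDK path is hypothetical):

```shell
# Point Spark at a Java 8 runtime on every machine in the cluster;
# /usr/lib/jvm/java-8-openjdk is a hypothetical install path.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
spark-shell --version   # should now report a 1.8 JVM
```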

Failed to execute Cassandra CQL statement, while reading from Ignite Cache

左心房为你撑大大i Submitted on 2019-12-11 15:28:43
Question: I am trying to integrate Ignite with Cassandra. I set up the configuration and started the Ignite node, but I cannot insert or read data through the Ignite cache backed by the Cassandra DB. I created a keyspace and a table in Cassandra and inserted some values, but when I tried to read the values back, an exception arose; the same thing happened when I tried to insert values. My Ignite version is 2.6 and cqlsh 5.0.1 | Cassandra 3.11.4 | CQL spec 3.4.4 | Spark version is 2.3.0 | Scala version is 2.11.8 | cassandra
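A frequent cause of "Failed to execute Cassandra CQL statement" in this setup is a mismatch between Ignite's Cassandra persistence descriptor and the CQL schema that was created by hand. A hedged sketch of the ignite-cassandra persistence-settings format (keyspace, table, and column names here are hypothetical; the CQL types must match the declared Java classes):

```xml
<!-- Hypothetical ignite-cassandra persistence descriptor: maps a
     Long -> String cache onto a two-column Cassandra table. The keyspace
     and table must exist (or be creatable by Ignite) with matching types. -->
<persistence keyspace="my_ks" table="my_table">
  <keyPersistence class="java.lang.Long" strategy="PRIMITIVE" column="id"/>
  <valuePersistence class="java.lang.String" strategy="PRIMITIVE" column="value"/>
</persistence>
```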