I ran into a problem when using sbt to run a Spark job. Compilation finishes fine, but when I execute the run command I get the error below:
[error] (run-main-0) java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
at akka.actor.RootActorPath.$div(ActorPath.scala:152)
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:191)
Does anyone know what I should do?
I ran into the same error when I used the scala-library-2.11 jar. When I replaced it with the scala-library-2.10 jar, it ran fine.
It is probably caused by using incompatible versions of Scala. When I downgraded from Scala 2.11 to 2.10, I forgot to change the version of one package (so one package still used 2.11 while the rest used 2.10), which resulted in the same error.
Note: I only had this problem when using IntelliJ.
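If the project is built with sbt, a minimal build.sbt sketch like the one below (the versions and artifact names are only illustrative, not taken from the question) shows how keeping a single scalaVersion and using %% for dependencies avoids mixing binary versions:

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.10 here), so it always matches scalaVersion
  "com.typesafe.akka" %% "akka-actor" % "2.3.11"
  // Hard-coding a different suffix, e.g. "com.typesafe.akka" % "akka-actor_2.11" % "2.3.11",
  // mixes Scala binary versions and produces exactly this kind of NoSuchMethodError at runtime.
)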
If you are getting this error and are here because you cannot run Jupyter notebooks with Spark 2.1 and Scala 2.11, below is how I was able to make it work. This assumes you have installed Jupyter and Toree.
Prerequisites - make sure Docker is running and gpg is installed, otherwise Make fails.
Build steps -
export SPARK_HOME=/Users/<path>/spark-2.1.0-hadoop2.7/
git clone https://github.com/apache/incubator-toree.git
cd incubator-toree
make clean release APACHE_SPARK_VERSION=2.1.0
pip install --upgrade ./dist/toree-pip/toree-0.2.0.dev1.tar.gz
pip freeze |grep toree
jupyter toree install --spark_home=$SPARK_HOME
========================================================================
To start the notebook - SPARK_OPTS='--master=local[4]' jupyter notebook
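Once the notebook is up, one quick sanity check (a sketch that assumes the Toree Scala kernel exposes the usual sc SparkContext) is to print the Scala and Spark versions from a cell:

// Run in a notebook cell with the Toree Scala kernel
println(scala.util.Properties.versionString)  // expect something like "version 2.11.8"
println(sc.version)                           // expect "2.1.0" for the build above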
I used these versions and everything works now.
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.6</version>
</dependency>
<dependency>
  <groupId>com.typesafe.akka</groupId>
  <artifactId>akka-actor_2.11</artifactId>
  <version>2.3.11</version>
</dependency>
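For an sbt project, the rough equivalent of those Maven coordinates (same versions, an untested sketch) would be:

scalaVersion := "2.11.6"

// %% resolves to akka-actor_2.11, matching scalaVersion above
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.11"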
The issue could be reproduced with version 2.11.8. At the moment, no downgrade is required; just update the scala-library version to 2.12.0.
I had exactly the same problem and fixed it by downgrading Scala from 2.11.8 to 2.10.6.
I have the same issue, but where do I alter the scala-library version?
Installation (on Ubuntu 16.04):
sudo apt-get install oracle-java8-installer
wget http://d3kbcqa49mib13.cloudfront.net/spark-2.0.2-bin-hadoop2.7.tgz && tar xvf spark-2.0.2-bin-hadoop2.7.tgz
pip install toree && jupyter toree install
So when I start a notebook it tells me that I am using a different Scala version, but I haven't installed anything else (screenshot + scala version). My Spark jars folder contains a scala-library-2.11.8.jar file. But how do I tell Toree to use that (or another) file for Scala?
Source: https://stackoverflow.com/questions/29339005/run-main-0-java-lang-nosuchmethoderror