Question
I'm trying to consume a Kafka 0.8 topic with Spark Streaming 2.0.0, and I'm trying to identify the required dependencies. I have tried using these dependencies in my build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8_2.11" % "2.0.0"
When I run sbt package, I get unresolved dependency errors for all three of these jars, but the jars do exist:
https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-8_2.11/2.0.0
Please help me debug this issue. I'm new to Scala, so please let me know if I'm doing something wrong.
Answer 1:
The problem is that you're specifying the Scala version suffix (_2.11) in the artifact name while also using %%, which appends the Scala version automatically, so sbt ends up looking for artifacts like spark-streaming_2.11_2.11. Either drop one %:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.0.0"
Or remove the Scala version suffix from the artifact names:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
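For completeness, a minimal build.sbt tying this together might look like the sketch below. The project name and exact Scala patch version are assumptions; the key point is that %% derives the "_2.11" suffix from the scalaVersion setting, so the artifact names must be written without it:

```scala
// build.sbt — minimal sketch; project name and Scala patch version are assumptions.
name := "kafka-streaming-example"

// %% appends the binary Scala version ("_2.11") taken from this setting,
// which is why the artifact names below carry no version suffix.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                % "2.0.0",
  "org.apache.spark" %% "spark-streaming"           % "2.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)
```

Using one Seq with ++= also keeps the three related dependencies grouped, which is easier to maintain than three separate += lines.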
Source: https://stackoverflow.com/questions/39516992/how-to-define-kafka-data-source-dependencies-for-spark-streaming