does pyspark support spark-streaming-kafka-0-10 lib?

Submitted by 徘徊边缘 on 2020-07-08 02:03:39

Question


My Kafka cluster version is 0.10.0.0, and I want to use PySpark Streaming to read Kafka data. However, the Spark Streaming + Kafka Integration Guide (http://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html) has no Python code example. So, can PySpark use spark-streaming-kafka-0-10 to integrate with Kafka?

Thank you in advance for your help!


Answer 1:


I also use Spark Streaming with a Kafka 0.10.0 cluster. After adding the following line to your configuration, you are good to go.

spark.jars.packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.0.0
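Equivalently, the dependency can be passed on the command line when submitting the job. This is a minimal sketch: the package coordinates mirror the config line above, `your_streaming_app.py` is a hypothetical script name, and the Scala/Spark versions (2.11 / 2.0.0) should be adjusted to match your installation:

```shell
# Pull the Kafka 0.8 connector at submit time.
# Versions here are examples for a Spark 2.0.0 / Scala 2.11 build;
# match them to your own cluster.
spark-submit \
  --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.0.0 \
  your_streaming_app.py
```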

And here a sample in python:

# Imports needed for the snippet below
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

# Initialize SparkContext
sc = SparkContext(appName="sampleKafka")

# Initialize the streaming context with a 10-second batch interval
batchInterval = 10
ssc = StreamingContext(sc, batchInterval)

# Kafka topic, mapped to the number of partitions to consume
topic = {"myTopic": 1}

# Set application groupId
groupId = "myTopic"

# Zookeeper quorum (createStream uses the Zookeeper-based receiver API)
zkQuorum = "zookeeperhostname:2181"

# Create the Kafka stream; each element is a (key, value) pair
kafkaStream = KafkaUtils.createStream(ssc, zkQuorum, groupId, topic)

# Do as you wish with your stream, e.g. print the message values
kafkaStream.map(lambda kv: kv[1]).pprint()

# Start the stream and block until termination
ssc.start()
ssc.awaitTermination()



Answer 2:


You can use spark-streaming-kafka-0-8 when your brokers are 0.10 or later: spark-streaming-kafka-0-8 supports newer broker versions, while spark-streaming-kafka-0-10 does not support older brokers. As of now, spark-streaming-kafka-0-10 is still experimental and has no Python support.



Source: https://stackoverflow.com/questions/45522153/does-pyspark-support-spark-streaming-kafka-0-10-lib
