Question
I'm trying to hook up an Apache Spark Structured Streaming query to an MQTT topic (IBM Watson IoT Platform on IBM Bluemix in this case).
I'm creating the structured stream as follows:
val df = spark.readStream
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("username","<username>")
.option("password","<password>")
.option("clientId","a:vy0z2s:a-vy0z2s-zfzzckrnqf")
.option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
.load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")
So far so good; in the REPL I get back this df object:
df: org.apache.spark.sql.DataFrame = [value: string, timestamp: timestamp]
But when I start reading from the stream with the following query:
val query = df.writeStream
.outputMode("append")
.format("console")
.start()
I get the following error:
scala> 17/02/03 07:32:23 ERROR StreamExecution: Query query-1 terminated with error
java.lang.ClassCastException: scala.Tuple2 cannot be cast to scala.runtime.Nothing$
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1$$anonfun$3.apply(MQTTStreamSource.scala:156)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1$$anonfun$3.apply(MQTTStreamSource.scala:156)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.concurrent.TrieMap.getOrElse(TrieMap.scala:633)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply$mcZI$sp(MQTTStreamSource.scala:156)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply(MQTTStreamSource.scala:155)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply(MQTTStreamSource.scala:155)
    at scala.collection.immutable.Range.foreach(Range.scala:160)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource.getBatch(MQTTStreamSource.scala:155)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$5.apply(StreamExecution.scala:332)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$5.apply(StreamExecution.scala:329)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at org.apache.spark.sql.execution.streaming.StreamProgress.foreach(StreamProgress.scala:25)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at org.apache.spark.sql.execution.streaming.StreamProgress.flatMap(StreamProgress.scala:25)
    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatch(StreamExecution.scala:329)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1.apply$mcZ$sp(StreamExecution.scala:194)
    at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:43)
    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches(StreamExecution.scala:184)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:120)
17/02/03 07:32:24 WARN MQTTTextStreamSource: Connection to mqtt server lost.
Connection lost (32109) - java.io.EOFException
    at org.eclipse.paho.client.mqttv3.internal.CommsReceiver.run(CommsReceiver.java:146)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readByte(DataInputStream.java:267)
    at org.eclipse.paho.client.mqttv3.internal.wire.MqttInputStream.readMqttWireMessage(MqttInputStream.java:65)
    at org.eclipse.paho.client.mqttv3.internal.CommsReceiver.run(CommsReceiver.java:107)
    ... 1 more
17/02/03 07:32:28 WARN MQTTTextStreamSource: Connection to mqtt server lost.
My gut feeling is that something is wrong with the schema, so I've added one:
import org.apache.spark.sql.types._

val schema = StructType(
  StructField("count", LongType, true) ::
  StructField("flowrate", LongType, true) ::
  StructField("fluidlevel", StringType, true) ::
  StructField("frequency", LongType, true) ::
  StructField("hardness", LongType, true) ::
  StructField("speed", LongType, true) ::
  StructField("temperature", LongType, true) ::
  StructField("ts", LongType, true) ::
  StructField("voltage", LongType, true) :: Nil)
val df = spark.readStream
.schema(schema)
.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
.option("username","<username>")
.option("password","<password>")
.option("clientId","a:vy0z2s:a-vy0z2s-zfzzckrnqf")
.option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
.load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")
But this doesn't help. Any ideas?
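For completeness, since the MQTT source always delivers records as [value: string, timestamp: timestamp], I'd expect the schema to be applied by parsing the JSON payload out of the value column with from_json. A minimal sketch of what I mean (column names are taken from the schema above and assumed to match the actual event payload):

import org.apache.spark.sql.functions.from_json
import spark.implicits._

// Sketch: parse the JSON string carried in the source's `value` column
// with the schema defined above, then flatten the resulting struct.
val parsed = df
  .select(from_json($"value", schema).as("event"), $"timestamp")
  .select("event.*", "timestamp")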
Answer 1:
It seems your issue is that you are re-using the same client ID for subsequent connections:
Closing TCP connection: ClientID="a:vy0z2s:a-vy0z2s-xxxxxxxxxx" Protocol=mqtt4-tcp Endpoint="mqtt" RC=288 Reason="The client ID was reused."
Only one connection is allowed per client ID; you cannot have two concurrent connections using the same ID.
Please check the client ID and make sure that multiple instances of the same app use unique client IDs. Applications can share the same API key, but MQTT requires that the client ID always be unique.
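One way to avoid reusing a fixed ID is to derive a fresh client ID per application instance, for example by appending a random suffix. A minimal sketch, assuming the same connection options as in the question (whether an arbitrary suffix is accepted depends on the Watson IoT Platform's client ID rules; the point is simply that each concurrent connection needs its own ID):

import java.util.UUID

// Sketch: keep the "a:<orgId>:" prefix but randomize the suffix so that
// no two running instances connect with the same client ID.
val clientId = s"a:vy0z2s:${UUID.randomUUID().toString.take(8)}"

val df = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("username", "<username>")
  .option("password", "<password>")
  .option("clientId", clientId)
  .option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
  .load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")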
Source: https://stackoverflow.com/questions/42018183/schema-issue-with-apachebahir-stuctured-streaming-connector-on-apachespark-strea