Spark with Flume (configuration/classpath?)
Question: I am trying to get Spark working with Flume. My Flume configuration is below:

#Declare
log.sources = src
log.sinks = spark
log.channels = chs

#Define Source
log.sources.src.type = exec
log.sources.src.command = sh /home/user/shell/flume.sh

#Define Sink
log.sinks.spark.type = org.apache.spark.streaming.flume.sink.SparkSink
log.sinks.spark.hostname = localhost
log.sinks.spark.port = 9999
log.sinks.spark.channel = chs

#Define Channels
log.channels.chs.type = memory

#Tie Source and Sink to Channel
log.sinks
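For context, here is a minimal sketch of the Spark Streaming side that would pair with a SparkSink like the one configured above, assuming the pull-based (polling) approach from the spark-streaming-flume module. The application name, batch interval, and the printed transformation are illustrative assumptions; the hostname and port match the sink configuration in the question.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumePollingExample {
  def main(args: Array[String]): Unit = {
    // Illustrative app name and batch interval
    val conf = new SparkConf().setAppName("FlumePollingExample")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Pull events from the SparkSink listening on localhost:9999
    // (matching log.sinks.spark.hostname / log.sinks.spark.port above)
    val stream = FlumeUtils.createPollingStream(ssc, "localhost", 9999)

    // Decode each Flume event body as a string and print a sample per batch
    stream.map(event => new String(event.event.getBody.array())).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

With this pull-based setup, Flume itself must be able to load the SparkSink class, which is where the classpath question in the title comes in: the spark-streaming-flume-sink jar (and its dependencies) has to be visible to the Flume agent, not only to the Spark application.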