Spark Streaming not able to use Spark SQL

Submitted by 孤者浪人 on 2019-12-12 01:09:48

Question


I am facing an issue with Spark Streaming. I am getting empty records after the stream is read and passed to the "parse" method.

My code:

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming._
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.sql._
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.Encoders
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.streaming.Trigger
import java.util.regex.Pattern
import java.util.regex.Matcher
import spark.implicits._

val conf = new SparkConf().setAppName("streamHive").setMaster("local[*]").set("spark.driver.allowMultipleContexts", "true")

val ssc = new StreamingContext(conf, Seconds(5))    

val sc=ssc.sparkContext

val lines = ssc.textFileStream("file:///home/sadr/testHive")

case class Prices(name: String, age: String,sex: String, location: String)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

def parse(rdd: org.apache.spark.rdd.RDD[String]) = {
  val l = rdd.map(_.split(","))
  val prices = l.map(p => Prices(p(0), p(1), p(2), p(3)))
  val pricesDf = sqlContext.createDataFrame(prices)
  pricesDf.registerTempTable("prices")
  pricesDf.show()
  val x = sqlContext.sql("select count(*) from prices")
  x.show()
}
lines.foreachRDD { rdd => parse(rdd) }
lines.print()
ssc.start()

My input file:

cat test1.csv

Riaz,32,M,uk
tony,23,M,india
manu,33,M,china
imart,34,F,AUS

I am getting this output:

scala> +----+---+---+--------+
|name|age|sex|location|
+----+---+---+--------+
+----+---+---+--------+

I am using Spark version 2.3. I am getting the following error after adding x.show():


Answer 1:


I'm not sure you are actually able to read the stream.

textFileStream reads only new files added to the directory after the program starts, not pre-existing ones. Was the file already there? If so, remove it from the directory, start the program, and then copy the file in again.
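To see why the pre-existing file is skipped: textFileStream selects files by modification time relative to the stream's start. Below is a simplified, pure-Scala sketch of that selection rule, using a hypothetical helper (`newFilesSince` is not a Spark API; Spark's real logic lives in FileInputDStream and additionally applies a "remember window" and an ignore threshold):

```scala
import java.io.File

// Hypothetical helper mimicking textFileStream's selection rule:
// only files whose modification time is later than the stream's
// start time are considered "new" and get processed.
def newFilesSince(dir: File, startMillis: Long): Seq[String] =
  Option(dir.listFiles)
    .getOrElse(Array.empty[File])
    .filter(_.lastModified > startMillis)
    .map(_.getName)
    .toSeq
```

Under this rule, a file copied into the directory before ssc.start() has a modification time older than the start of the stream and is never picked up, which matches the empty DataFrame you see.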



Source: https://stackoverflow.com/questions/53872626/spark-streaming-not-able-to-use-spark-sql
