Escape quotes is not working in spark 2.2.0 while reading csv

Posted by 断了今生、忘了曾经 on 2021-02-07 10:34:18

Question


I am trying to read my tab-separated delimited file, but I am not able to read all of the records.

Here are my input records:

head1   head2   head3
a   b   c
a2  a3  a4
a1  "b1 "c1

My code:

var inputDf = sparkSession.read
                  .option("delimiter","\t")
                  .option("header", "true")
//                  .option("inferSchema", "true")
                  .option("nullValue", "")
                  .option("escape","\"")
                  .option("multiLine", true)
                  .option("nullValue", null)
                  .option("nullValue", "NULL")
                  .schema(finalSchema)
                  .csv("file:///C:/Users/prhasija/Desktop/retriedAddresses_4.txt")
//                  .csv(inputPath)
                  .na.fill("")
//                  .repartition(4)

println(inputDf.count)

Output:

2 records

Why is it not returning 3 as the count?


Answer 1:


I think you need to add the following options to your read: .option("escape", "\\") and .option("quote", "\\"). Setting both to a backslash, a character that never appears in your data, effectively disables quote processing, so an embedded double quote is kept as a literal character instead of opening a quoted field that swallows the rest of the input.

val test = spark.read
    .option("header", true)
    .option("quote", "\\")
    .option("escape", "\\")
    .option("delimiter", ",")
    .csv(".../test.csv")

Here is the test CSV I used it on:

a,b,c
1,b,a
5,d,e
5,"a,"f

Full output:

scala> val test = spark.read.option("header", true).option("quote", "\\").option("escape", "\\").option("delimiter", ",").csv("./test.csv")
test: org.apache.spark.sql.DataFrame = [a: string, b: string ... 1 more field]

scala> test.show
+---+---+---+
|  a|  b|  c|
+---+---+---+
|  1|  b|  a|
|  5|  d|  e|
|  5| "a| "f|
+---+---+---+


scala> test.count
res11: Long = 3
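To see why the original read collapses rows, here is a deliberately simplified sketch, in plain Scala, of how a quote character affects tokenizing. This is an illustration only, not Spark's actual parser (Spark delegates to the univocity-parsers library), and the splitLine helper is hypothetical. With the default quote ", an unbalanced quote opens a quoted region that swallows delimiters (and, with multiLine, even newlines); pointing quote at a character that never occurs in the data, as the answer does, keeps every " literal.

```scala
// Simplified single-line field splitter mimicking how a CSV tokenizer
// reacts to the `quote` character. Hypothetical helper for illustration;
// not Spark's real (univocity-based) implementation.
def splitLine(line: String, delimiter: Char, quote: Char): Vector[String] = {
  val fields = Vector.newBuilder[String]
  val current = new StringBuilder
  var inQuotes = false
  for (ch <- line) {
    if (ch == quote) inQuotes = !inQuotes      // toggle quoted state
    else if (ch == delimiter && !inQuotes) {   // delimiter only splits outside quotes
      fields += current.toString
      current.clear()
    } else current += ch
  }
  fields += current.toString
  fields.result()
}

// Default quote '"': the stray quote opens a quoted region, so the
// second comma is swallowed into the field.
splitLine("5,\"a,\"f", ',', '"')   // Vector("5", "a,f")

// Quote '\\' (never present in the data): every '"' stays literal and the
// line splits into three fields, matching the answer's test.show output.
splitLine("5,\"a,\"f", ',', '\\')  // Vector("5", "\"a", "\"f")
```

The same toggle explains the question's tab-separated row a1 "b1 "c1: with the default quote character the first " hides the following tab, merging fields, while a never-occurring quote character leaves all three fields intact.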


Source: https://stackoverflow.com/questions/52995878/escape-quotes-is-not-working-in-spark-2-2-0-while-reading-csv
