Converting pipe-delimited file to spark dataframe to CSV file

Submitted by 送分小仙女 on 2019-12-24 06:44:34

Question


I have a CSV file with a single column, whose rows look like this:

123 || food || fruit
123 || food || fruit || orange 
123 || food || fruit || apple

I want to create a CSV file with a single column containing the distinct row values:

orange
apple

I tried using the following code:

 val data = sc.textFile("fruits.csv")
 val rows = data.map(_.split("||"))
 val rddnew = rows.flatMap( arr => {
   val text = arr(0)
   val words = text.split("||")
   words.map( word => ( word, text ) )
 } )

But this code does not give me the result I want.
Can anyone help me with this?


Answer 1:


You need to escape the special characters when splitting, because split takes a regular expression:

.split("\\|\\|")
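To see why the escaping matters, here is a quick sketch of the difference in plain Scala (no Spark needed):

```scala
// As a regex, "||" is an empty alternation that matches at every
// position, so split breaks the string into single characters:
val wrong = "123 || food || fruit".split("||")
// wrong(0) is "1", wrong(1) is "2", and so on

// Escaping each pipe makes "||" a literal delimiter; trimming
// removes the surrounding spaces present in the sample data:
val right = "123 || food || fruit".split("\\|\\|").map(_.trim)
// right is Array("123", "food", "fruit")
```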

Converting to CSV is tricky because data strings may contain your delimiter (in quotes), newlines, or other parse-sensitive characters, so I'd recommend using spark-csv:

 val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("delimiter", "||")
  .option("header", "true")
  .option("inferSchema", "true")
  .load("words.csv")

and for writing:

 words.write
  .format("com.databricks.spark.csv")
  .option("delimiter", "||")
  .option("header", "true")
  .save("words.csv")



Answer 2:


You can solve this problem along these lines (note that rows with only three fields will contribute their last field too, so you may want to filter on the field count):

val text = sc.textFile("fruit.csv")
val word = text.map( l => l.split("\\|\\|") )
val last = word.map( w => w(w.size - 1).trim )
last.distinct.collect
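Putting the pieces together, the per-line logic can be sketched in plain Scala without Spark; this sketch assumes the fruit name is the fourth "||"-delimited field, which is why only rows with four fields are kept and the result is just orange and apple:

```scala
val lines = Seq(
  "123 || food || fruit",
  "123 || food || fruit || orange",
  "123 || food || fruit || apple"
)

// Split each line on the literal "||" delimiter, trim the fields,
// keep only rows that have a fourth field (the fruit name),
// and deduplicate the result.
val fruits = lines
  .map(_.split("\\|\\|").map(_.trim))
  .collect { case cols if cols.length == 4 => cols(3) }
  .distinct
// fruits == Seq("orange", "apple")
```

The same transformation carries over to an RDD by replacing `collect { ... }` with a `filter` on the field count followed by a `map`.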


Source: https://stackoverflow.com/questions/36948440/converting-pipe-delimited-file-to-spark-dataframe-to-csv-file
