Spark SQL - How to write DataFrame to text file?

Backend · unresolved · 2 answers · 1679 views
Asked by 借酒劲吻你 on 2021-02-08 05:52

I am using Spark SQL to read and write Parquet files.

But in some cases, I need to write the DataFrame as a text file instead of JSON or

2 Answers
  • 2021-02-08 06:24
    df.repartition(1).write.option("header", "true").csv("filename.csv")
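
    Since the question asks for a text file rather than CSV: Spark's built-in text data source writes exactly one string column, so the row has to be collapsed into a single string first. A Scala sketch (assuming an existing DataFrame `df`; the output path is made up):

    ```scala
    import org.apache.spark.sql.functions.{col, concat_ws}

    // The text source requires a single string column, so join all
    // columns into one comma-separated string per row.
    df.select(concat_ws(",", df.columns.map(col): _*).as("value"))
      .repartition(1)       // single output part file, as in the CSV example
      .write
      .text("filename_txt") // output is a directory containing part files
    ```

    Like the CSV writer, this produces a directory of part files rather than a single named file.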
    
  • 2021-02-08 06:28

    Using Databricks Spark-CSV you can save directly to a CSV file, and load it back from a CSV file afterwards, like this:

    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;
    
    SQLContext sqlContext = new SQLContext(sc);
    DataFrame df = sqlContext.read()
        .format("com.databricks.spark.csv")
        .option("inferSchema", "true")
        .option("header", "true")
        .load("cars.csv");
    
    df.select("year", "model").write()
        .format("com.databricks.spark.csv")
        .option("header", "true")
        .option("codec", "org.apache.hadoop.io.compress.GzipCodec")
        .save("newcars.csv");
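
    Note that since Spark 2.0 the CSV source is built in, so the external com.databricks.spark.csv package is no longer needed. A Scala sketch of the same read/write using the built-in source (file names taken from the snippet above):

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("csv-builtin").getOrCreate()

    // Built-in CSV reader replaces format("com.databricks.spark.csv").
    val df = spark.read
      .option("inferSchema", "true")
      .option("header", "true")
      .csv("cars.csv")

    // "compression" replaces the old "codec" option.
    df.select("year", "model").write
      .option("header", "true")
      .option("compression", "gzip")
      .csv("newcars.csv")
    ```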
    