Spark SQL SaveMode.Overwrite, getting java.io.FileNotFoundException and requiring 'REFRESH TABLE tableName'

孤独总比滥情好 2020-12-08 11:37

For Spark SQL, how should we fetch data from one folder in HDFS, do some modifications, and save the updated data back to the same folder in HDFS via the Overwrite save mode?

4 Answers
  • 2020-12-08 11:51

I solved this: first I write my DataFrame to a temp directory, then delete the source directory I was reading from, and finally rename the temp directory to the source name.
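A minimal sketch of that approach using the Hadoop `FileSystem` API (the paths, the filter, and the Parquet format are my own assumptions, not from the answer; a production version should also check the return values of `delete` and `rename`):

```scala
import org.apache.hadoop.fs.{FileSystem, Path}
import spark.implicits._

// Hypothetical paths -- adjust to your own layout.
val srcDir = new Path("/data/events")
val tmpDir = new Path("/data/events_tmp")

val df      = spark.read.parquet(srcDir.toString)
val updated = df.filter($"status" === "active")   // example modification

// 1. Write the result to a temp directory, not to the source.
updated.write.mode("overwrite").parquet(tmpDir.toString)

// 2. Delete the source directory we read from (recursive delete).
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
fs.delete(srcDir, true)

// 3. Rename the temp directory to the source name.
fs.rename(tmpDir, srcDir)
```

Because the write in step 1 targets a different directory, Spark never reads and overwrites the same files in one job, which is what triggers the `FileNotFoundException`.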

  • 2020-12-08 12:03

I faced a similar issue. I was writing a DataFrame to a Hive table using the code below:

    dataframe.write.mode("overwrite").saveAsTable("mydatabase.tablename")   
    

When I tried to query this table, I was getting the same error. I then added the line below after writing the table, which refreshes it and solved the issue.

    spark.catalog.refreshTable("mydatabase.tablename")
    
  • 2020-12-08 12:07
    val dfOut = df.filter(r => r.getAs[Long]("dsctimestamp") > (System.currentTimeMillis() - 1800000))
    

    In the above line of code, df had an underlying Hadoop partition. Once I had made this transformation (i.e., to dfOut), I could not find a way to delete, rename, or overwrite the underlying partition until dfOut had been garbage collected.

    My solution was to keep the old partition, create a new partition for dfOut, flag the new partition as current and then delete the old partition some given time later, after dfOut had been garbage collected.

It may not be an ideal solution, and I would love to learn a less tortuous way of addressing this issue, but it works.
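One way to sketch that scheme (the directory layout and the `_CURRENT` marker-file convention are my own assumptions, not from the answer):

```scala
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical layout: each run writes a fresh directory, and a
// "_CURRENT" marker file records which directory is live.
val newDir = new Path(s"/data/events/run_${System.currentTimeMillis()}")

// 1. Write the transformed data to a brand-new partition directory,
//    leaving the old partition untouched.
dfOut.write.parquet(newDir.toString)

// 2. Flag the new partition as current by rewriting the marker file.
val fs     = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val marker = new Path("/data/events/_CURRENT")
val out    = fs.create(marker, true)   // overwrite any existing marker
out.writeUTF(newDir.toString)
out.close()

// 3. Delete the old partition later, once nothing references it:
// fs.delete(oldDir, true)
```

Readers resolve the marker first and then load whichever directory it points to, so the old partition can be cleaned up on any schedule without racing the job that produced `dfOut`.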

  • 2020-12-08 12:11

Why don't you just cache it after reading it? Saving it to another directory and then moving that directory back might entail some extra permissions. I also force an action, like a show(), so the data is actually materialized before the overwrite.

    val myDF = spark.read.format("csv")
        .option("header", "false")
        .option("delimiter", ",")
        .load("/directory/tofile/")
    myDF.cache()
    myDF.show(2)
    