Saving dataframe to local file system results in empty results

南方客 2020-12-02 01:39

We are running Spark 2.3.0 on AWS EMR. The following DataFrame "df" is non-empty and of modest size:

    scala> df.co
2 Answers
  • 2020-12-02 02:23

    This error usually occurs when you try to read an empty directory as Parquet. You could check: 1. whether the DataFrame is empty, with outcome.rdd.isEmpty(), before writing it; 2. whether the path you are giving is correct. See the sketch below.
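
    As a minimal sketch of both checks before the write (assuming df is the DataFrame from the question; the output path is a placeholder):

        // Guard against writing an empty DataFrame; rdd.isEmpty() stops
        // at the first non-empty partition instead of counting all rows.
        if (df.rdd.isEmpty()) {
          println("DataFrame is empty, nothing to write")
        } else {
          // Placeholder path: double-check the scheme (hdfs:// vs file://)
          df.write.mode("overwrite").parquet("hdfs:///tmp/output")
        }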

    Also, in what mode are you running your application? If you are running in cluster mode, try client mode.
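
    For reference, the deploy mode is chosen when submitting the job; for example (the class name and jar are placeholders):

        spark-submit --deploy-mode client --class com.example.Main app.jar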

  • 2020-12-02 02:29

    That is not a bug; it is the expected behavior. Spark does not really support writes to non-distributed storage (it works in local mode only because you then have a shared file system).

    A local path is not interpreted (only) as a path on the driver (that would require collecting the data) but as a local path on each executor. Therefore each executor writes its own chunk to its own local file system.

    Not only is the output not readable back (to load the data, each executor and the driver would have to see the same state of the file system), but, depending on the commit algorithm, it might not even be finalized (moved from the temporary directory).
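
    As a sketch of two common workarounds, assuming HDFS is available and, for the second option, that the result is small enough to fit on the driver (paths are hypothetical):

        // Option 1: write to distributed storage, then copy the result to the
        // driver's local disk, e.g. with `hdfs dfs -get hdfs:///tmp/out /tmp/out`.
        df.write.mode("overwrite").parquet("hdfs:///tmp/out")

        // Option 2: small results only -- collect the rows to the driver and
        // write them with ordinary local file I/O on the driver machine.
        import java.io.PrintWriter
        val pw = new PrintWriter("/tmp/out.csv")
        try df.collect().foreach(row => pw.println(row.mkString(",")))
        finally pw.close()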
