How to overwrite/reuse the existing output path for Hadoop jobs again and again

既然无缘 2021-02-12 10:29

I want to overwrite/reuse the existing output directory when I run my Hadoop job daily. The output directory will store the summarized output of each day's job run.
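
A common way to reuse the same output path for a daily MapReduce job is to delete the previous run's output directory from the driver before submitting the job. The sketch below illustrates that pattern under my own assumptions (the class name DailySummaryDriver and the argument layout are hypothetical, not taken from the question):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class DailySummaryDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path input = new Path(args[0]);
            Path output = new Path(args[1]);   // hypothetical path, e.g. the daily summary directory

            // Remove the previous run's output so the job can reuse the same path.
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(output)) {
                fs.delete(output, true);       // true = delete recursively
            }

            Job job = Job.getInstance(conf, "daily summary");
            job.setJarByClass(DailySummaryDriver.class);
            // set mapper, reducer, and output key/value classes here ...
            FileInputFormat.addInputPath(job, input);
            FileOutputFormat.setOutputPath(job, output);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }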

10 Answers
  •  长发绾君心
    2021-02-12 11:22

    If one is loading the input file (e.g., with appended entries) from the local file system to the Hadoop Distributed File System like this:

    hdfs dfs -put  /mylocalfile /user/cloudera/purchase
    

    then the existing output directory can also be overwritten/reused with the -f flag; there is no need to delete or re-create the folder first:

    hdfs dfs -put -f  /updated_mylocalfile /user/cloudera/purchase
    
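    For completeness, a programmatic equivalent of `-put -f` is to overwrite the destination through the Hadoop FileSystem API. The sketch below is only an illustration of that idea (the class name and paths are placeholders reusing the answer's example, not code from the answer):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class OverwritePut {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // delSrc = false (keep the local file), overwrite = true (replace the existing HDFS copy)
            fs.copyFromLocalFile(false, true,
                    new Path("/updated_mylocalfile"),
                    new Path("/user/cloudera/purchase"));
            fs.close();
        }
    }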
