How to overwrite existing files using the hadoop fs -copyToLocal command

小蘑菇 2021-02-01 13:08

Is there any way to overwrite existing files while copying from HDFS using:

hadoop fs -copyToLocal  
         


        
8 Answers
  • 2021-02-01 13:13

    There is no force option for either of these commands (get / copyToLocal).

    Below are three options (a sketch of the first two follows after the list):

    1. Remove the file on the local machine with the rm command, then use copyToLocal/get.

    2. Rename your local file to a new name so that the file coming from the cluster can keep its name. Use the mv command for that, then use get/copyToLocal.

    3. Rename the file on the cluster itself and use copyToLocal:

      hadoop fs -mv [oldpath] [newpath]
      hadoop fs -copyToLocal [newpath] .
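
    A minimal sketch of the first two options (the local and HDFS paths and file names here are placeholders, not from the original answer):

      # Option 1: delete the existing local copy first, then fetch from HDFS
      rm ./books.csv
      hadoop fs -get /user/hduser/books.csv .

      # Option 2: move the existing local copy aside, then fetch from HDFS
      mv ./books.csv ./books.csv.bak
      hadoop fs -get /user/hduser/books.csv .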
      
  • 2021-02-01 13:13

    You can try distcp with -update. The main advantage is that it updates the target only when the file has changed.

    hadoop distcp -update file://source hdfs://namenode/target

    hadoop distcp -update  file:///home/hduser/pigSample/labfiles/SampleData/books.csv  hdfs://10.184.37.158:9000/yesB
    
  • 2021-02-01 13:16

    The -f option did the trick.

    Example:

    bin>hdfs dfs -put -f D:\DEV\hadoopsampledata\mydata.json /input
    
  • 2021-02-01 13:18
    hadoop fs -copyFromLocal -f $LOCAL_MOUNT_SRC_PATH/yourfilename.txt your_hdfs_file-path
    

    So the -f option does the trick for you.

    It also works for -copyToLocal.
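
    For example, a minimal sketch with -copyToLocal (the HDFS and local paths here are placeholders, not from the original answer):

      # -f overwrites the local file if it already exists
      hadoop fs -copyToLocal -f /user/hduser/yourfilename.txt /tmp/yourfilename.txt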

  • 2021-02-01 13:29

    I used the command below and it helped:

    hadoop fs -put -f <<local path>> <<hdfs>>
    

    But note, from the put docs:

    Copy single src, or multiple srcs from local file system to the destination file system.

  • 2021-02-01 13:31

    The -f option worked for me.

    hdfs dfs -copyFromLocal -f [LOCALFILEPATH] [HDFSFILEPATH]
