HDFS: How do you list files recursively?

Asked by 梦毁少年i on 2021-02-12 11:26

How do you list all files recursively under a certain path in HDFS, through Java? I went through the API and noticed FileSystem.listFiles(Path, boolean), but it looks like that
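For what it's worth, FileSystem.listFiles(Path, boolean) does what the question asks when the boolean is true: it returns a RemoteIterator over all files under the path, descending into subdirectories. A minimal sketch, assuming a reachable cluster configured via core-site.xml and a hypothetical starting path /user/someone/myfiles:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

public class ListFilesRecursively {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS from the core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Second argument = true requests a recursive listing; the
        // iterator yields files only (directories are traversed, not
        // returned as entries).
        RemoteIterator<LocatedFileStatus> it =
                fs.listFiles(new Path("/user/someone/myfiles"), true);
        while (it.hasNext()) {
            LocatedFileStatus status = it.next();
            System.out.println(status.getPath());
        }
    }
}
```

Note that listFiles returns only files; if you also need directory entries in the output (as -lsr prints them), you have to walk the tree yourself with listStatus.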

3 Answers
  • 2021-02-12 12:08

    You can look at the source of org.apache.hadoop.fs.FsShell.ls(FileStatus, FileSystem, boolean, boolean) for your version of Hadoop - this is what is called when you run hadoop fs -lsr path from the command line:

    • 0.20.2 - line 593
    • 1.0.2 - line 590
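    If your Hadoop version predates listFiles(Path, boolean), the recursion FsShell performs can be sketched by hand on top of FileSystem.listStatus(Path). This is an illustrative sketch, not FsShell's exact code; the method names are from the Hadoop 1.x+ API (use the deprecated isDir() on older releases):

    ```java
    import java.io.IOException;

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RecursiveLs {
        // Print every file under 'path', descending into subdirectories -
        // roughly what 'hadoop fs -lsr' does internally.
        static void lsr(FileSystem fs, Path path) throws IOException {
            for (FileStatus status : fs.listStatus(path)) {
                if (status.isDirectory()) {
                    lsr(fs, status.getPath());
                } else {
                    System.out.println(status.getPath());
                }
            }
        }
    }
    ```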
  • 2021-02-12 12:22

    Use the -R option with the ls command to list files and directories recursively.

    hadoop fs -ls -R Path/Of/File
    

    Possible options for the ls command are:

    -d : Directories are listed as plain files.

    -h : Formats the sizes of files in a human-readable fashion rather than as a number of bytes.

    -R : Recursively list the contents of directories.

  • 2021-02-12 12:26
    hadoop-user@hadoop-desk ~/hadoop
    $ bin/hadoop fs -lsr /user/someone_else/myfiles
    -rw-r--r--   1 hadoop-user supergroup          0 2013-11-26 02:09 /user/someone_else/myfiles/file1.txt
    -rw-r--r--   1 hadoop-user supergroup          0 2013-11-26 02:09 /user/someone_else/myfiles/file2.txt
    drwxr-xr-x   - hadoop-user supergroup          0 2013-11-26 02:09 /user/someone_else/myfiles/subdir
    -rw-r--r--   1 hadoop-user supergroup          0 2013-11-26 02:09 /user/someone_else/myfiles/subdir/anotherFile.txt