Viewing the number of blocks for a file in Hadoop

暖寄归人 · 2020-12-15 05:53

How can I see how many blocks a file has been split into in a Hadoop file system?

4 Answers
  • 2020-12-15 06:30

    We can use the Hadoop file system check command (fsck) to list the blocks of a specific file.

    Below is the command:

    hadoop fsck [path] [options]
    

    To view the blocks of a specific file:

    hadoop fsck /path/to/file -files -blocks
    
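    If you only want the count, you can filter the fsck summary. This is a sketch that assumes the "Total blocks (validated)" summary line printed by HDFS fsck; the path is a placeholder:

    hadoop fsck /path/to/file -files -blocks | grep 'Total blocks'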
  • 2020-12-15 06:30

    hadoop fsck filetopath

    I used the above command in CDH 5 and got the error below:

    hadoop-hdfs/bin/hdfs: line 262: exec: : not found

    The command below worked instead:

    hdfs fsck filetopath

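    For example, a full invocation with the block-listing options looks like this (the path here is just a placeholder):

    hdfs fsck /user/alice/data.csv -files -blocks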
  • 2020-12-15 06:34

    It is better to use hdfs here instead of hadoop, since the hadoop fsck form is deprecated.

    Here is the command using hdfs. To find the details of a file named 'test.txt' in the root directory, you would write:

    hdfs fsck /test.txt -files -blocks -locations

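    If you also want to see which rack each replica sits on, fsck additionally accepts a -racks flag (as documented for HDFS fsck):

    hdfs fsck /test.txt -files -blocks -locations -racks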
  • 2020-12-15 06:44

    This should work:

    hadoop fs -stat "%o" /path/to/file
    
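    Note that %o prints the HDFS block size in bytes, not the block count. As a sketch (the path is a placeholder), you can combine it with %b, the file length in bytes, to compute the number of blocks yourself:

    # File length (%b) and block size (%o) via hadoop fs -stat
    size=$(hadoop fs -stat "%b" /path/to/file)
    blocksize=$(hadoop fs -stat "%o" /path/to/file)
    # Ceiling division gives the number of full and partial blocks
    echo $(( (size + blocksize - 1) / blocksize ))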