“head” command for aws s3 to view file contents

不知归路 2021-02-06 22:44

On Linux, we generally use the head/tail commands to preview the contents of a file. They help in viewing part of a file (to inspect its format, for instance) rather than opening the entire file. Is there an equivalent way to view only part of a file stored on AWS S3?

8 Answers
  • 2021-02-06 23:39

    One thing you could do is cp the object to stdout and then pipe it to head:

    aws s3 cp s3://path/to/my/object - | head
    

    You may get a broken pipe error at the end (head exits and closes the pipe early), but it works.
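
    If you want to avoid streaming the whole object just to look at the start of it, you can also ask S3 for only the first bytes with a range request. A minimal sketch using the s3api subcommand (my-bucket and path/to/object are placeholders):

    # Fetch only the first 1 KiB via an HTTP Range request, then preview it;
    # the JSON metadata the CLI prints is discarded.
    aws s3api get-object --bucket my-bucket --key path/to/object \
        --range "bytes=0-1023" /tmp/s3head.$$ > /dev/null
    head /tmp/s3head.$$
    rm -f /tmp/s3head.$$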

  • If you are using s3cmd, you can use s3cmd get to write the object to stdout and pipe it to head as follows:

    s3cmd get s3://bucket/file - | head
    

    If you want to view the head of a gzip file, pipe the output through gzip -d - and then to head:

    s3cmd get s3://bucket/file - | gzip -d - | head
    

    If you get bored with this piping business, add the following function to your ~/.bashrc:

    # Preview the first lines of an S3 object; zcat -f also handles gzipped files.
    # Usage: s3head [head options] s3://bucket/file
    function s3head {
        s3_path=${@:$#}           # last argument: the S3 path
        params=${@:1:$#-1}        # everything before it: options forwarded to head
        s3cmd get "$s3_path" - | zcat -f | head $params
    }
    

    Now source the ~/.bashrc file.
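
    That is, in your current shell:

    source ~/.bashrc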

    Simply running s3head s3://bucket/file will give you the first 10 lines of your file.

    This even supports the other head command options.

    For example, if you want more lines, just specify -n followed by the number of lines, as follows:

    # Prints the first 14 lines of s3://bucket/file
    s3head -n 14 s3://bucket/file
    

    Here are some other utility scripts for s3: https://github.com/aswathkk/dotfiles/blob/master/util_scripts/s3utils.sh
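
    If you also need a tail counterpart, the same pattern applies, although the whole object still has to be streamed, since tail can only print once it reaches the end. A sketch along the same lines (the name s3tail is just an illustration):

    function s3tail {
        s3_path=${@:$#}           # last argument: the S3 path
        params=${@:1:$#-1}        # everything before it: options forwarded to tail
        # Unlike s3head, this streams the entire object before tail prints anything.
        s3cmd get "$s3_path" - | zcat -f | tail $params
    }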
