Concatenating multiple text files into a single file in Bash

眼角桃花 2020-11-29 15:31

What is the quickest and most pragmatic way to combine all *.txt files in a directory into one large text file?

Currently I'm using Windows with Cygwin, so I have access to both sets of tools.

12 Answers
  • 2020-11-29 15:51

    all of that is nasty....

    ls | grep '\.txt$' | while read -r file; do cat "$file" >> ./output.txt; done
    

    easy stuff.
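
    Parsing the output of ls is fragile with unusual filenames; a plain glob loop does the same job with nothing but the shell (a sketch; the name combined.out is illustrative and is assumed not to match *.txt itself):

    ```shell
    # Let the shell's own glob supply the file list -- no ls, no grep.
    # Quoting "$file" keeps names with spaces intact.
    for file in *.txt; do
        cat "$file" >> combined.out
    done
    ```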

  • 2020-11-29 15:59

    The most upvoted answers will fail if the file list is too long.

    A more portable solution is to use fd:

    fd -e txt -d 1 -X awk 1 > combined.txt
    

    -d 1 limits the search to the current directory. If you omit this option, fd will recursively find all .txt files under the current directory.
    -X (also known as --exec-batch) executes a command (awk 1 in this case) once, with all of the search results as arguments.
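
    If fd is not installed, the same batched, one-invocation approach can be sketched with find and xargs (assuming -print0/-0 support, as in GNU and BSD tools; the output name combined.txt is excluded from the match because the redirection creates it before find scans):

    ```shell
    # -maxdepth 1 mirrors fd's -d 1; -print0 | xargs -0 handles odd filenames.
    # awk 1 prints every input line, so files without a trailing newline
    # still end cleanly before the next file starts.
    find . -maxdepth 1 -name '*.txt' ! -name combined.txt -print0 \
      | xargs -0 awk 1 > combined.txt
    ```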

  • 2020-11-29 16:01

    How about this approach?

    find . -type f -name '*.txt' ! -name output.txt -exec cat {} + >> output.txt
    
  • 2020-11-29 16:03

    You can use Windows shell copy to concatenate files.

    C:\> copy *.txt outputfile
    

    From the help:

    To append files, specify a single file for destination, but multiple files for source (using wildcards or file1+file2+file3 format).

  • 2020-11-29 16:04

    Just remember: for all of the solutions given so far, the shell decides the order in which the files are concatenated. For Bash, that's alphabetical order, since glob expansion is sorted. If the order is important, you should either name the files appropriately (01file.txt, 02file.txt, etc.) or specify each file in the order you want it concatenated:

    $ cat file1 file2 file3 file4 file5 file6 > out.txt
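
    When the pieces follow a numeric naming scheme, brace expansion spares typing each name while preserving the intended order (a sketch; assumes Bash and illustrative names part1.txt through part3.txt):

    ```shell
    # Brace expansion generates the names in numeric order before cat runs,
    # unlike a glob, which would sort part10.txt before part2.txt.
    cat part{1..3}.txt > out.txt
    ```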
    
  • 2020-11-29 16:04

    The Windows shell command type can do this:

    type *.txt >outputfile
    

    The type command also writes the file names to stderr, which is not captured by the > redirect operator (but will show up on the console).
