Bash script does not continue to read the next line of file

南笙 2021-02-06 17:27

I have a shell script that saves the output of each command it executes to a CSV file. It reads the commands it has to execute from another shell script, which is in this format:

5 Answers
  •  北恋
     2021-02-06 17:58

    Unless you are planning to read something from standard input after the loop, you don't need to preserve and restore the original standard input (though it is good to see you know how).
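
    For reference, the usual save-and-restore idiom looks roughly like this (just a sketch, not needed here; commands.txt is a hypothetical input file):

    exec 6<&0             # save the original standard input on descriptor 6
    exec < commands.txt   # make the file the new standard input
    while read line
    do
        echo "read: $line"
    done
    exec 0<&6 6<&-        # restore standard input and close the spare descriptor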

    Similarly, I don't see a reason for dinking with IFS at all. There is certainly no need to restore the value of IFS before exit - this is a real shell you are using, not a DOS BAT file.
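
    If you ever do need a non-default IFS, the tidier habit is to scope it to the read itself instead of saving and restoring it, as in this sketch:

    # Scope IFS to the read command only; nothing to restore afterwards.
    while IFS=: read -r user _ uid _
    do
        echo "$user has uid $uid"
    done < /etc/passwd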

    When you do:

    read var1 var2 var3
    

    the shell assigns the first field to $var1, the second to $var2, and the rest of the line to $var3. In the case where there's just one variable - your script, for example - the whole line goes into the variable, just as you want it to.
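
    For example (illustrative values only):

    # The last variable gets the remainder of the line.
    echo "alpha beta gamma delta" | {
        read var1 var2 var3
        echo "var1=$var1"    # alpha
        echo "var2=$var2"    # beta
        echo "var3=$var3"    # gamma delta
    }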

    Inside the process line function, you probably don't want to throw away error output from the executed command. You probably do want to think about checking the exit status of the command. The echo with error redirection is ... unusual, and overkill. If you're sufficiently sure that the commands can't fail, then go ahead and ignore the error. Is the command 'chatty'? If so, throw away the chat by all means. If not, maybe you don't need to throw away standard output, either.
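
    A sketch of what checking the status might look like inside the function (leaving standard error alone so failures stay visible):

    eval "$@" > /dev/null
    status=$?
    if [ "$status" -ne 0 ]
    then
        echo "command failed with status $status: $*" >&2
    fi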

    The script as a whole should probably diagnose when it is given multiple files to process since it ignores the extraneous ones.
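
    If you keep the one-file design, a sketch of such a diagnostic could be:

    # Refuse extra file arguments instead of silently ignoring them.
    if [ $# -gt 1 ]
    then
        echo "$0: expected at most one file, got $#" >&2
        exit 1
    fi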

    You could simplify your file handling by using just:

    cat "$@" |
    while read line
    do
        processline "$line"
    done
    

    The cat command automatically reports errors (and continues after them) and processes all the input files, or reads standard input if there are no arguments left. The double quotes around the variable mean that it is passed as a single unit, not split into separate words.
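
    A small illustration of why the quotes matter (hypothetical command line):

    line="ls -l /tmp"
    processLine "$line"   # the function sees one argument: the whole line
    processLine $line     # the function sees three arguments: ls, -l, /tmp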

    The use of date and bc is interesting - I'd not seen that before.

    All in all, I'd be looking at something like:

    #!/bin/bash
    # Time execution of commands read from a file, line by line.
    # Log commands and times to CSV logfile "file.csv"

    processLine(){
        START=$(date +%s.%N)                # start timestamp, seconds.nanoseconds
        eval "$@" < /dev/null > /dev/null   # redirect stdin so the command cannot
                                            # consume the loop's remaining lines
        STATUS=$?                           # exit status of the command
        END=$(date +%s.%N)                  # end timestamp
        DIFF=$(echo "$END - $START" | bc)   # elapsed time, via bc
        echo "$line, $START, $END, $DIFF, $STATUS" >> file.csv
        echo "${DIFF}s: $STATUS: $line"     # $line is set by the calling loop
    }

    cat "$@" |
    while read line
    do
        processLine "$line"
    done
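
    Usage would look something like this, where timeit.sh and commands.txt are placeholder names; each run appends a "command, start, end, elapsed, status" line to file.csv:

    printf '%s\n' 'sleep 1' 'ls /etc' > commands.txt
    bash timeit.sh commands.txt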
    
