Bash: Terminate on Timeout/File Overflow while Executing Command

Backend · Unresolved · 3 answers · 1172 views
既然无缘 2021-01-28 02:57

I'm writing a mock-grading script in bash. It's supposed to execute a C program which will give some output (which I redirect to a file). I'm trying to (1) make it timeout af…

3 Answers
  • 2021-01-28 03:38

    This starts yourcommand in the background, piping its output through dd into youroutputfile with a cap of 10000000 bytes: once the cap is reached, dd terminates, and yourcommand receives SIGPIPE on its next write:

    yourcommand | dd of=youroutputfile bs=1 count=10000000 &
    

    This waits 5 seconds, then kills yourcommand if it has not already terminated:

    sleep 5
    kill %yourcommand
    
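    The dd cap above can be sketched end-to-end. This is a minimal, hypothetical demo: `yes` stands in for the graded program, and `capped.out` is a placeholder filename.

    ```shell
    # yes produces unlimited output; dd copies at most 1000 bytes into
    # capped.out and exits, after which yes is terminated by SIGPIPE on
    # its next write. (bs=1 is slow; for a large cap, prefer something
    # like bs=1000 count=10000 instead.)
    yes "some output line" | dd of=capped.out bs=1 count=1000 2>/dev/null
    echo "captured $(wc -c < capped.out) bytes"
    ```

    The file ends up at exactly the cap, regardless of how much the writer tried to produce.
    
    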
  • 2021-01-28 03:44

    There's a GNU coreutils command, timeout, to do timeouts.

    Investigate ulimit -f 32 to set the maximum file size to 16 KiB (ulimit -f counts in 512-byte blocks).

    Objection:

    ulimit is [not] suitable because I have to create other files as well. I need to limit only one of them.

    Counter: Unless the program must create both a big file and a little file and you need to limit only the little one, you can use a sub-shell to good effect:

    (
    ulimit -f 32
    timeout 10m -- command arg >file
    )
    

    The limit on file size is restricted to the commands in the sub-shell (which is marked by the pair of parentheses).
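    A minimal sketch of that scoping, assuming a Linux shell (filenames are placeholders): the limit set inside the parentheses truncates writes there, while commands outside are unaffected.

    ```shell
    # Inside the sub-shell, files are limited to 1 block = 512 bytes.
    # A write past the limit stops at 512 bytes and the writer gets
    # SIGXFSZ (fatal by default); the stderr noise is discarded.
    (
      ulimit -f 1
      yes x | head -c 2048 > small.out   # tries to write 2 KiB
    ) 2>/dev/null || true

    # Back outside the sub-shell: no limit applies here.
    echo "small.out is $(wc -c < small.out) bytes"
    ```
    
    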

  • 2021-01-28 03:47

    You can use the timeout command, e.g.

    timeout -s 9 5s ./c_program > file
    

    To check the file size, you can stat the file and branch on the result:

    limit=1234   # bytes
    size=$(stat -c "%s" file)
    if [ "$size" -gt "$limit" ]; then
      exit 1
    fi
    

    See also other answers for approaches that don't rely on these GNU tools.
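    Putting the two pieces together, a grading wrapper might look like the sketch below. The `sh -c 'echo …'` command stands in for the real ./c_program, and `out.txt` and `limit` are placeholder names.

    ```shell
    limit=1000000   # maximum allowed output, in bytes

    # Run the program with a 5-second wall-clock limit; SIGKILL on expiry.
    timeout -s KILL 5s sh -c 'echo "hello from program"' > out.txt
    status=$?

    size=$(wc -c < out.txt)
    if [ "$status" -eq 137 ]; then            # 128+9: killed on timeout
      echo "result: timeout"
    elif [ "$size" -gt "$limit" ]; then
      echo "result: output too large"
    else
      echo "result: ok"
    fi
    ```

    With -s KILL, a timed-out run exits with status 137 (128 + signal 9); with the default SIGTERM, timeout itself exits 124 instead.
    
    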
