Question:
I have 30 instances of a process running on a server and want to log open files for each process for analysis.
I ran the following command:
ps auwwx | grep PROG_NAME | awk '{print $2}' | xargs lsof -p | less
It complains: "lsof: status error on : No such file or directory"
However, if I run lsof -p <pid> for a single PID, it gives me the list of open files for that process. How can I get a list of all open files for all 30 instances of the process on a FreeBSD machine?
Moreover, I do not want the shared libraries to be listed. If I use -d "^txt", it also hides some other db files that I do want shown. Is there any other way to filter out the .so files?
Answer 1:
The lsof -p option takes a comma-separated list of PIDs. The way you're using xargs will pass the PIDs as separate arguments, causing some of them to be interpreted as filenames.
Try lsof -p $(your grep | tr '\012' ,)
That's going to have a trailing comma; I'm not sure whether lsof will care, but you could sed it off if necessary.
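For example, a sketch of the assembled pipeline, assuming PROG_NAME is the pattern from the question; the extra grep -v grep keeps the grep process itself out of the PID list, and sed strips the trailing comma:
lsof -p "$(ps auwwx | grep PROG_NAME | grep -v grep | awk '{print $2}' | tr '\012' ',' | sed 's/,$//')"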
Answer 2:
You can use xargs -L1 lsof -p to run lsof once per PID.
Even better: use lsof -c to list all open files from commands matching a specific pattern:
lsof -c bas # list all processes with commands starting with 'bas'
lsof -c '/ash$/x' # list all commands ending with 'ash' (regexp syntax)
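For the shared-library part of the question, a simple post-filter is one option; a minimal sketch, assuming every library path contains .so:
lsof -c PROG_NAME | grep -v '\.so'   # drop lines whose NAME column looks like a shared library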
Source: https://stackoverflow.com/questions/12468629/lsof-should-give-all-open-files-for-a-set-of-pids