Question
I am running find . -exec stat -f '%N,%a' {} +
in a directory with > 1 million files. It just hangs and burns CPU; I left it running for over 5 minutes with no output. I don't want it to bring down my machine.
What other ways are there to read the files in the directory without loading the whole listing into memory?
I can't change the directory structure; I need these > 1 million files to stay in a single flat directory.
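One way to get name/permission pairs without materializing the whole directory (a sketch, not from the original question): Python's os.scandir streams entries from the kernel in batches rather than building a full list the way a shell glob or a single sorted listing would. The helper name iter_files and the output format below are illustrative assumptions, chosen to mirror stat's %N,%a fields.

```python
import os

def iter_files(path="."):
    # os.scandir yields entries lazily, so memory use stays flat
    # even in a directory with millions of files.
    with os.scandir(path) as it:
        for entry in it:
            st = entry.stat(follow_symlinks=False)
            # Octal permission bits (e.g. "644"), matching stat's %a field.
            yield entry.name, oct(st.st_mode & 0o777)[2:]

if __name__ == "__main__":
    # Print as it goes; output starts immediately instead of after a full scan.
    for name, perms in iter_files("."):
        print(f"{name},{perms}")
```

Because the iterator is lazy, the first lines appear almost immediately, which also makes it easy to confirm the scan is progressing.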
Source: https://stackoverflow.com/questions/56204321/how-to-efficiently-read-file-names-when-there-are-1-million-files-in-directory