I am running a simulation study and need to process and save the results from several text files. The data are organized so that there are subdirectories, and within each of them sit the text files I need to read and process.
You can use Perl's glob() function to get a list of files and send it to R using, e.g., RSPerl's interface. Or, staying within R:

filenames <- list.files("path/to/files", recursive=TRUE)

This will give you all the files residing under one folder and the subfolders under it.
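For glob-style matching without leaving R, base R's Sys.glob() also works, and list.files() can filter by pattern and return full paths directly. A minimal sketch, assuming the results are .txt files under a made-up path/to/files directory:

# Glob-style matching in base R (note: each * matches one path level)
txt_via_glob <- Sys.glob("path/to/files/*/*.txt")

# Recursive listing with full paths, restricted to .txt files
txt_files <- list.files("path/to/files", pattern = "\\.txt$",
                        recursive = TRUE, full.names = TRUE)

# Read every file into a list of data frames
# (read.table's defaults assume whitespace-delimited text; adjust as needed)
results <- lapply(txt_files, read.table, header = TRUE)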
If you need to run the same analysis on each of the files, then you can access them in one shot using list.files(recursive = TRUE). This assumes that you have already set your working directory to the data folder. The recursive = TRUE argument lists all files within subdirectories as well.
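A minimal sketch of that workflow; the directory name and the processing step are placeholders:

setwd("path/to/DataFolder")              # assumed location of the data
files <- list.files(recursive = TRUE)    # paths relative to the working dir

for (f in files) {
  dat <- read.table(f, header = TRUE)    # assumes whitespace-delimited text
  # ... analyse dat and save the result for this file ...
}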
I'm not near a computer with R right now, but read the help for the file-related functions: the dir() function lists files and directories, and it has a recursive argument. list.files() is an alias for dir(). The file.info() function will tell you (among other things) whether a path is a directory, and file.path() will combine path parts. The basename() and dirname() functions might also be useful. Note that all of these functions are vectorized.
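For example (the paths here are made up), a quick illustration of those helpers:

p <- file.path("path", "to", "files", "run1.txt")  # "path/to/files/run1.txt"
dirname(p)     # "path/to/files"
basename(p)    # "run1.txt"

# file.info is vectorized: one row per path; isdir is NA for missing paths
info <- file.info(c("path/to/files", p))
info$isdir     # TRUE for the directory, FALSE for the file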
EDIT: Now at a computer, so here's an example:
# Make a function to process each file
processFile <- function(f) {
df <- read.csv(f)
# ...and do stuff...
file.info(f)$size # dummy result
}
# Find all .csv files
files <- dir("/foo/bar/", recursive=TRUE, full.names=TRUE, pattern="\\.csv$")
# Apply the function to all files.
result <- sapply(files, processFile)
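Since the question also asks about saving the results, one possible follow-up (the output file name is illustrative) is to collect them into a data frame keyed by file name and write it out:

# sapply() names its result by the input paths, so files and result line up
summary_df <- data.frame(file = basename(files), size = unname(result))
write.csv(summary_df, "results_summary.csv", row.names = FALSE)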