How can I read multiple files from multiple directories into R for processing?

广开言路 2021-02-09 04:10

I am running a simulation study and need to process and save the results from several text files. I have the data organized such that there are subdirectories, and within each subdirectory are the text files I need to process.

4 Answers
  • 2021-02-09 04:42

    You can use Perl's glob() function to get a list of files and pass the list to R using, e.g., the RSPerl interface.

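    If you'd rather stay entirely in R, base R's Sys.glob() does the same kind of wildcard expansion; a minimal sketch (the directory layout and the .txt extension are assumptions):

    # Expand a wildcard pattern without leaving R.
    # Assumed layout: one level of subdirectories under "Data Folder",
    # each containing .txt result files.
    files <- Sys.glob("Data Folder/*/*.txt")
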
  • 2021-02-09 04:43

    filenames <- list.files("path/to/files", recursive=TRUE)

    This will give you all the files residing under a folder and the subfolders beneath it.

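    Unless you also ask for full names, those paths are relative to the folder you listed; a minimal sketch for reading everything in (read.csv here is an assumption about the file format):

    # List every file under the folder, with full paths so they can be read directly
    filenames <- list.files("path/to/files", recursive = TRUE, full.names = TRUE)

    # Read each file into a list of data frames (assumes comma-separated files)
    data_list <- lapply(filenames, read.csv)
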
  • 2021-02-09 04:44

    If you need to run the same analysis on each of the files, you can list them all in one shot using list.files(recursive = TRUE). This assumes you have already set your working directory to the data folder; recursive = TRUE lists files within subdirectories as well.

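    A sketch of that workflow (the path, file pattern, and analyzeFile() are placeholders for your own setup):

    # Assumed: the working directory is set to the data folder
    setwd("path/to/Data Folder")

    # Pick up every .txt result file in every subdirectory
    files <- list.files(recursive = TRUE, pattern = "\\.txt$")

    # Placeholder per-file analysis: replace with the real processing
    analyzeFile <- function(f) {
      dat <- read.table(f, header = TRUE)
      data.frame(file = f, rows = nrow(dat))
    }

    # Run the same analysis on each file and combine the summaries
    results <- do.call(rbind, lapply(files, analyzeFile))
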
  • 2021-02-09 05:07

    I'm not near a computer with R right now, but read the help for file-related functions:

    The dir function will list the files and directories. It has a recursive argument. list.files is an alias for dir. The file.info function will tell you (among other things) if a path is a directory and file.path will combine path parts.

    The basename and dirname functions might also be useful.

    Note that all these functions are vectorized.

    EDIT: Now at a computer, so here's an example:

    # Make a function to process each file
    processFile <- function(f) {
      df <- read.csv(f)
      # ...and do stuff...
      file.info(f)$size # dummy result
    }
    
    # Find all .csv files
    files <- dir("/foo/bar/", recursive=TRUE, full.names=TRUE, pattern="\\.csv$")
    
    # Apply the function to all files.
    result <- sapply(files, processFile)
    
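    Since basename() and dirname() are vectorized, you can also summarise where each result came from in one call, using the files and result objects from the example above:

    # One row per file: its subdirectory, its name, and the dummy result (file size)
    data.frame(dir = dirname(files), file = basename(files), size = unname(result))
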