Can you call Directory.GetFiles() with multiple filters?

逝去的感伤 2020-11-22 05:25

I am trying to use the Directory.GetFiles() method to retrieve a list of files of multiple types, such as mp3's and jpg's.

26 Answers
  • 2020-11-22 05:46

    Another way to use LINQ, but without having to return everything and then filter it in memory:

    var files = Directory.GetFiles("C:\\path", "*.mp3", SearchOption.AllDirectories)
                         .Union(Directory.GetFiles("C:\\path", "*.jpg", SearchOption.AllDirectories));
    

    It's actually two calls to GetFiles(), but I think it's consistent with the spirit of the question, and it returns the results in one enumerable. Note that Union also removes duplicates; Concat would skip that work, since the two patterns can't match the same file.

  • 2020-11-22 05:47

    How about this:

    // Requires using System.Linq;
    private static string[] GetFiles(string sourceFolder, string filters, System.IO.SearchOption searchOption)
    {
        // Split the pipe-delimited filter list and run one GetFiles call per pattern
        return filters.Split('|')
                      .SelectMany(filter => System.IO.Directory.GetFiles(sourceFolder, filter, searchOption))
                      .ToArray();
    }
    

    I found it here (in the comments): http://msdn.microsoft.com/en-us/library/wz42302f.aspx
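
    For example, a hypothetical call, assuming the helper above is in scope:

        var files = GetFiles(@"C:\MyFolder", "*.mp3|*.jpg", SearchOption.AllDirectories);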

  • 2020-11-22 05:48

    Here is a simple way of getting filtered files. Note that checking IndexOf against a joined string such as ".csv,.txt" gives false positives (a ".cs" file would match inside ".csv"), so a set lookup on the exact extension is safer:

    var allowedFileExtensions = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { ".csv", ".txt" };

    var files = Directory.EnumerateFiles(@"C:\MyFolder", "*.*", SearchOption.TopDirectoryOnly)
                         .Where(s => allowedFileExtensions.Contains(Path.GetExtension(s)))
                         .ToArray();
    
  • 2020-11-22 05:51

    I wonder why there are so many "solutions" posted.

    If my rookie understanding of how GetFiles works is right, there are only two options, and any of the solutions above can be reduced to these:

    1. GetFiles, then filter: fast, but a memory killer, because of the storage overhead until the filters are applied.

    2. Filter while calling GetFiles: slower the more filters are set, but low memory usage, as no overhead is stored.
      This is explained in one of the above posts with an impressive benchmark: each filter option causes a separate GetFiles operation, so the same part of the hard drive gets read several times.

    In my opinion option 1) is better, but using SearchOption.AllDirectories on a folder like C:\ would use a huge amount of memory.
    Therefore I would just make a recursive sub-method that goes through all subfolders using option 1) (see the sketch below).

    This should cause only one GetFiles operation per folder and therefore be fast (option 1), but use only a small amount of memory, as the filters are applied after each subfolder is read, so its overhead is discarded right away.

    Please correct me if I am wrong. As I said, I am quite new to programming, but I want to gain a deeper understanding of things to eventually become good at this :)
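
    A minimal sketch of that idea, assuming C# with LINQ available (the name GetFilesByExtension is made up for illustration):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        static IEnumerable<string> GetFilesByExtension(string folder, HashSet<string> extensions)
        {
            // One GetFiles call per folder; the filter is applied before recursing,
            // so only the current folder's listing is held in memory at a time.
            foreach (var file in Directory.GetFiles(folder)
                                          .Where(f => extensions.Contains(Path.GetExtension(f))))
                yield return file;

            foreach (var sub in Directory.GetDirectories(folder))
                foreach (var file in GetFilesByExtension(sub, extensions))
                    yield return file;
        }

    Called with a case-insensitive extension set, e.g.:

        var exts = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { ".mp3", ".jpg" };
        var files = GetFilesByExtension(@"C:\path", exts).ToList();

    Note that on a folder like C:\ an inaccessible subdirectory will throw UnauthorizedAccessException, so a real implementation would need to catch and skip those.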

  • 2020-11-22 05:51

    Or, in C++/CLI, you can convert each native extension string to a System::String^ and call GetFiles once per pattern:

    #include <iostream>
    #include <string>
    #include <vector>

    using namespace System;
    using namespace System::IO;

    std::vector<std::string> extensions = { "*.mp4", "*.avi", "*.flv" };
    String^ path = "C:\\Users\\Eric\\Videos";
    for (size_t i = 0; i < extensions.size(); ++i)
    {
        // Marshal each native pattern into a managed string, then query per pattern
        String^ ext = gcnew String(extensions[i].c_str());
        array<String^>^ files = Directory::GetFiles(path, ext);
        Console::WriteLine(ext);
        std::cout << " " << files->Length << std::endl;
    }
    
  • 2020-11-22 05:52

    What about

    string[] filesPNG  = Directory.GetFiles(path, "*.png");
    string[] filesJPG  = Directory.GetFiles(path, "*.jpg");
    string[] filesJPEG = Directory.GetFiles(path, "*.jpeg");

    // Preallocate the combined list so the AddRange calls never have to resize it
    int totalArraySizeAll = filesPNG.Length + filesJPG.Length + filesJPEG.Length;
    List<string> filesAll = new List<string>(totalArraySizeAll);
    filesAll.AddRange(filesPNG);
    filesAll.AddRange(filesJPG);
    filesAll.AddRange(filesJPEG);
    