How to write super-fast file-streaming code in C#?

Asked by 遥遥无期, 2020-11-28 19:11 · 9 answers · 1859 views

I have to split a huge file into many smaller files. Each of the destination files is defined by an offset and a length in bytes. I'm using the following code:

9 Answers
  • 2020-11-28 20:01

    (For future reference.)

    Quite possibly the fastest way to do this would be to use memory mapped files (so primarily copying memory, and the OS handling the file reads/writes via its paging/memory management).

    Memory Mapped files are supported in managed code in .NET 4.0.

    But as noted, you need to profile, and expect to switch to native code for maximum performance.
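    As a minimal sketch of the memory-mapped approach (file names here are illustrative, and .NET 4.0 or later is assumed), one offset/length slice can be copied through a `MemoryMappedFile` view stream, letting the OS pager do the actual reading:

    ```csharp
    using System;
    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MmfSplit
    {
        // Copy one (offset, length) slice of sourcePath into destPath
        // through a memory-mapped view, so the OS handles the page-in.
        static void CopySlice(string sourcePath, string destPath, long offset, long length)
        {
            using (var mmf = MemoryMappedFile.CreateFromFile(sourcePath, FileMode.Open))
            using (var view = mmf.CreateViewStream(offset, length, MemoryMappedFileAccess.Read))
            using (var dest = File.Create(destPath))
            {
                view.CopyTo(dest);
            }
        }

        static void Main()
        {
            // Build a small test file, then split off the middle 4 bytes.
            File.WriteAllBytes("source.bin", new byte[] { 1, 2, 3, 4, 5, 6, 7, 8 });
            CopySlice("source.bin", "slice.bin", 2, 4);
            Console.WriteLine(string.Join(",", File.ReadAllBytes("slice.bin"))); // prints 3,4,5,6
        }
    }
    ```

    Each destination file gets its own view at the right offset, so many slices can be copied in parallel without coordinating a shared read position.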

  • 2020-11-28 20:06

    Have you considered using the CCR? Since you are writing to separate files, you can do everything in parallel (read and write), and the CCR makes it very easy to do this.

    private const string file_path = @"c:\source.bin"; // path of the file to split
    private const long split_size = 64 * 1024;         // chunk size in bytes

    static void Main(string[] args)
    {
        Dispatcher dp = new Dispatcher();
        DispatcherQueue dq = new DispatcherQueue("DQ", dp);

        Port<long> offsetPort = new Port<long>();

        // Each offset posted to the port is handled by Split on a CCR worker thread.
        Arbiter.Activate(dq, Arbiter.Receive<long>(true, offsetPort,
            new Handler<long>(Split)));

        long size;
        using (FileStream fs = File.Open(file_path, FileMode.Open))
        {
            size = fs.Length;
        }

        for (long i = 0; i < size; i += split_size)
        {
            offsetPort.Post(i);
        }
    }

    private static void Split(long offset)
    {
        byte[] buff;
        using (FileStream reader = new FileStream(file_path, FileMode.Open,
            FileAccess.Read, FileShare.Read))
        {
            reader.Seek(offset, SeekOrigin.Begin);

            // The last chunk may be shorter than split_size.
            long toRead = Math.Min(split_size, reader.Length - offset);

            buff = new byte[toRead];
            int read = 0;
            while (read < toRead) // Read may return fewer bytes than requested
            {
                read += reader.Read(buff, read, (int)(toRead - read));
            }
        }
        File.WriteAllBytes("c:\\out" + offset + ".txt", buff);
    }
    

    This code posts offsets to a CCR port, which causes a thread to be created to execute the code in the Split method. This means the file is opened multiple times, but it removes the need for synchronization. You can make it more memory-efficient, but you'll have to sacrifice some speed.

  • 2020-11-28 20:12

    The first thing I would recommend is to take measurements. Where are you losing your time? Is it in the read, or the write?

    Over 100,000 accesses (sum the times): how much time is spent allocating the buffer array? How much time is spent opening the file for reading (is it the same file every time)? How much time is spent in the read and write operations themselves?

    If you aren't doing any kind of transformation on the data, do you need a BinaryWriter, or can you write with a plain FileStream? (Try it: do you get identical output? Does it save time?)
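    A rough sketch of that kind of measurement (file names and the 64 KB chunk size are illustrative): reuse a single buffer, write each chunk through a plain FileStream, and time the read and write phases separately with a Stopwatch:

    ```csharp
    using System;
    using System.Diagnostics;
    using System.IO;

    class SplitTimed
    {
        static void Main()
        {
            // Illustrative input: split source.bin into 64 KB chunks.
            const int chunkSize = 64 * 1024;
            File.WriteAllBytes("source.bin", new byte[3 * chunkSize + 123]);

            var buffer = new byte[chunkSize]; // allocated once, not per chunk
            var readTime = new Stopwatch();
            var writeTime = new Stopwatch();

            using (var reader = File.OpenRead("source.bin"))
            {
                int part = 0;
                int n;
                readTime.Start();
                while ((n = reader.Read(buffer, 0, buffer.Length)) > 0)
                {
                    readTime.Stop();
                    writeTime.Start();
                    using (var writer = File.Create($"part{part++}.bin"))
                        writer.Write(buffer, 0, n); // plain FileStream write, no BinaryWriter
                    writeTime.Stop();
                    readTime.Start();
                }
                readTime.Stop();

                Console.WriteLine(
                    $"read: {readTime.ElapsedMilliseconds} ms, " +
                    $"write: {writeTime.ElapsedMilliseconds} ms, parts: {part}");
            }
        }
    }
    ```

    With the two totals separated, it becomes obvious whether the bottleneck is the reads, the writes, or the per-chunk setup (allocation and file opens), and you can attack the right one.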
