Heap fragmentation when using byte arrays

Asked by 闹比i, 2021-02-05 10:54

I have a C# 4.0 application (single producer/single consumer) which transfers huge amounts of data in chunks. Although there's no new memory allocation, I run out of memory after a while.

3 Answers
  • 2021-02-05 11:30

    You have probably run into the large object heap (LOH) problem: objects larger than 85,000 bytes are allocated on the LOH, which is not compacted, and the resulting fragmentation can lead to strange out-of-memory situations. Although performance has apparently improved in .NET 4, it is far from perfect. The solution is basically to use your own buffer pool that holds a few statically allocated chunks of memory and reuses them, as in the sketch below.
    There are a whole bunch of questions about this on SO.
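
    A minimal sketch of such a pool, assuming fixed-size chunks (the class name, sizes, and Rent/Return API are illustrative, not from this answer):

    using System.Collections.Concurrent;

    // Illustrative fixed-size buffer pool: allocate a handful of large buffers
    // once at startup, then rent and return them instead of allocating a new
    // byte[] per chunk, so nothing new ever lands on the LOH.
    public sealed class ByteBufferPool
    {
        private readonly ConcurrentBag<byte[]> _buffers = new ConcurrentBag<byte[]>();
        private readonly int _bufferSize;

        public ByteBufferPool(int bufferCount, int bufferSize)
        {
            _bufferSize = bufferSize;
            for (int i = 0; i < bufferCount; i++)
                _buffers.Add(new byte[bufferSize]);   // allocated once, reused forever
        }

        public byte[] Rent()
        {
            byte[] buffer;
            // Fall back to a fresh allocation only if the pool runs dry.
            return _buffers.TryTake(out buffer) ? buffer : new byte[_bufferSize];
        }

        public void Return(byte[] buffer)
        {
            if (buffer != null && buffer.Length == _bufferSize)
                _buffers.Add(buffer);
        }
    }

    In the producer/consumer setup from the question, the producer would Rent() a buffer for each chunk and the consumer would Return() it once the chunk has been processed.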

    Update: Microsoft provides a buffer manager as part of the WCF stack. There is also one on CodeProject.
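
    For reference, the WCF buffer manager mentioned above lives in System.ServiceModel.Channels (it requires a reference to System.ServiceModel.dll); a hedged usage sketch, with illustrative sizes:

    using System.ServiceModel.Channels;

    // Pool at most 16 MB of buffers in total, with individual buffers up to 1 MB.
    BufferManager manager = BufferManager.CreateBufferManager(16 * 1024 * 1024, 1024 * 1024);

    byte[] chunk = manager.TakeBuffer(1024 * 1024);  // may hand back a pooled buffer
    try
    {
        // ... fill the chunk and pass it to the consumer ...
    }
    finally
    {
        manager.ReturnBuffer(chunk);  // return it so it can be reused
    }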

  • 2021-02-05 11:31

    Although the GC does not compact the large object heap automatically, you can still compact it programmatically (available since .NET 4.5.1). The following snippet shows how:

    using System.Runtime;  // for GCSettings and GCLargeObjectHeapCompactionMode

    // Request LOH compaction on the next blocking full collection, then trigger it.
    GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
    GC.Collect();
    
  • 2021-02-05 11:39

    How large are your byte[] arrays? Do they fall into the small object heap or the large object heap? If you are experiencing memory fragmentation, I would guess they fall into the LOH.

    You should therefore reuse the same byte arrays (use a pool) or use smaller chunks; see the sketch below. The LOH is not compacted by default, so it can become quite fragmented, and sadly there is no way around this (apart from knowing the limitation and avoiding it).
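
    As a sketch of the smaller-chunks approach, the copy loop below keeps its single, reused buffer well under the 85,000-byte threshold, so it never touches the LOH (the 64 KB size and the class name are illustrative):

    using System.IO;

    static class ChunkedCopy
    {
        // 64 KB: comfortably below the 85,000-byte LOH threshold.
        private const int ChunkSize = 64 * 1024;

        // Copies a stream with one small buffer that is allocated once and
        // reused for every chunk, so no allocation reaches the LOH.
        public static void Copy(Stream source, Stream destination)
        {
            byte[] buffer = new byte[ChunkSize];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                destination.Write(buffer, 0, read);
        }
    }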
