.NET: Why can't I get more than 11 GB of allocated memory in an x64 process?

Asked by 隐瞒了意图╮ on 2020-12-13 16:18

I thought that the maximum user space for a 64-bit process was 8 TB, but I did a little test and the maximum I could get is 10–11 GB.

Note: I don't need that much memory…

4 Answers
  • 2020-12-13 16:28

    It's your machine.

    I have an x64 machine with 8 GB of RAM and a 12 GB page file, and I ran your program and it topped out at 16.23 GB.

    EPILOG: Then my Win7 install gradually slid into a coma as critical processes were apparently memory starved.

    EDIT: If you want to understand how Windows allocates (i.e. reserves and commits) memory, read Pushing the Limits of Windows: Physical Memory and Pushing the Limits of Windows: Virtual Memory.

    Since .NET relies on Windows to manage the memory it uses to build the GC heap, the mechanics of how Windows does this are reflected in how memory is allocated in .NET at a low level.
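
    (The question's test program isn't shown above; the following is a reconstructed sketch, not the original code, of an allocation loop that keeps committing managed memory in 1 MB chunks until the runtime gives up.)

    using System;
    using System.Collections.Generic;

    class AllocationTest
    {
        static void Main()
        {
            // Hold on to every chunk so the GC cannot reclaim anything,
            // and count how much we manage to commit before failing.
            var chunks = new List<byte[]>();
            long total = 0;
            try
            {
                while (true)
                {
                    chunks.Add(new byte[1024 * 1024]); // commit 1 MB at a time
                    total += 1024 * 1024;
                }
            }
            catch (OutOfMemoryException)
            {
                chunks = null; // release the chunks so printing itself doesn't fail
                Console.WriteLine("Gave up after {0:F2} GB", total / (1024.0 * 1024 * 1024));
            }
        }
    }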

  • 2020-12-13 16:31

    The 8 TB is an upper bound, not a guarantee. You can potentially allocate up to 8 TB, but you need the matching RAM/page file to back it.

  • 2020-12-13 16:34

    I'm guessing it's because you're using a List<T>, which I believe has an internal limit.

    See what you can get if you try something like creating your own old-school linked list:

    public class ListItem<T>
    {
        // Link back to the previously created node; the chain grows by
        // prepending a new node that points at the old head.
        public ListItem<T> Parent;
        public T Value;

        public ListItem(ListItem<T> parent, T item)
        {
            this.Parent = parent;
            this.Value = item;
        }
    }
    

    I have written almost that exact code before (only my item was an int) and run it on a machine with 32 processors and 128 GB of RAM. It always crapped out at the same size no matter what, and the size was always related to Int32.MaxValue. Hope that helps.
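
    A minimal usage sketch (illustrative, assuming the ListItem<T> type above) just keeps prepending nodes; because each node is a separate small object on the GC heap, the chain is not capped by the single backing array behind a List<T>:

    using System;

    class ChainDemo
    {
        static void Main()
        {
            ListItem<int> head = null;
            long count = 0;
            try
            {
                // Each iteration allocates one small node, so the structure is
                // not subject to the Int32.MaxValue element count or the 2 GB
                // single-object limit that a List<T>'s internal array runs into.
                while (true)
                {
                    head = new ListItem<int>(head, 42);
                    count++;
                }
            }
            catch (OutOfMemoryException)
            {
                head = null; // drop the chain so we can still report the count
                Console.WriteLine("Built {0:N0} nodes before running out of memory.", count);
            }
        }
    }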

  • 2020-12-13 16:43

    Try allocating one chunk (as opposed to a list of 1 MB chunks):

    Dim p As IntPtr = System.Runtime.InteropServices.Marshal.AllocHGlobal(New System.IntPtr(24L * 1024 * 1024 * 1024))
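
    (For reference, a C# sketch of the same single-chunk approach; the 24 GB size is just illustrative, and unmanaged memory should be freed afterwards.)

    using System;
    using System.Runtime.InteropServices;

    class OneBigChunk
    {
        static void Main()
        {
            // Ask for a single 24 GB block of unmanaged memory. The commit
            // still has to be backed by RAM + page file, so this can fail
            // (or thrash badly) on a machine without enough backing store.
            long bytes = 24L * 1024 * 1024 * 1024;
            IntPtr p = Marshal.AllocHGlobal(new IntPtr(bytes));
            try
            {
                Console.WriteLine("Allocated {0} GB at 0x{1:X}", bytes >> 30, p.ToInt64());
            }
            finally
            {
                Marshal.FreeHGlobal(p);
            }
        }
    }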
    

    Edit: Given your comment that you only have 4 GB of physical RAM, you really have no business allocating more than ~8 GB, and even that is pushing it.

    Edit:

    "To accept an answer I would like to know how the real maximum allocated memory is calculated if 8 TB is only theoretical."

    The maximum amount of memory you can allocate is roughly the commit limit minus what is already in use: (page file size + physical RAM size) - (everything already committed by the rest of the system, including whatever cannot or will not be paged out because it is needed to keep your system going: the kernel, drivers, the .NET runtime, and so on).

    Of course the page file can grow...

    Sooner or later paging to and from disk becomes too much, your system slows to a crawl, and it becomes unusable.
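
    As an illustration of that calculation, here is a small sketch (not part of the original answer) that reads the commit limit, i.e. physical RAM plus page file, and how much of it is still free, via the Win32 GlobalMemoryStatusEx API:

    using System;
    using System.Runtime.InteropServices;

    class CommitLimitDemo
    {
        [StructLayout(LayoutKind.Sequential)]
        struct MEMORYSTATUSEX
        {
            public uint dwLength;
            public uint dwMemoryLoad;
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;   // commit limit: physical RAM + page file
            public ulong ullAvailPageFile;   // commit space still available
            public ulong ullTotalVirtual;
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX lpBuffer);

        static void Main()
        {
            var status = new MEMORYSTATUSEX();
            status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
            if (!GlobalMemoryStatusEx(ref status))
                throw new System.ComponentModel.Win32Exception();

            Console.WriteLine("Physical RAM : {0:F1} GB", status.ullTotalPhys / (1024.0 * 1024 * 1024));
            Console.WriteLine("Commit limit : {0:F1} GB", status.ullTotalPageFile / (1024.0 * 1024 * 1024));
            Console.WriteLine("Commit free  : {0:F1} GB", status.ullAvailPageFile / (1024.0 * 1024 * 1024));
        }
    }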

    Read Mark Russinovich's blog:

    • Pushing the Limits of Windows: Physical Memory
    • Pushing the Limits of Windows: Virtual Memory
    • Pushing the Limits of Windows: Paged and Nonpaged Pool