I am getting an error on this command:

Dictionary<UInt64, int> myIntDict = new Dictionary<UInt64, int>(89478458);

The error is:
System.OutOfMemoryException was unhandled HResult=-2147024882
Message=Array dimensions exceeded supported range.
Source=mscorlib
StackTrace:
at System.Collections.Generic.Dictionary`2.Initialize(Int32 capacity)
at System.Collections.Generic.Dictionary`2..ctor(Int32 capacity, IEqualityComparer`1 comparer)
With capacity 89478457 there is no error. Here is the source of Initialize in Dictionary.cs:
private void Initialize(int capacity)
{
    int size = HashHelpers.GetPrime(capacity);  // rounds the requested capacity up to a prime
    ...
    entries = new Entry[size];                  // one Entry struct per slot, allocated inline
    ...
}
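For context, in the reference-source Dictionary.cs the Entry struct looks roughly like the sketch below (field names taken from the reference source; the exact layout can vary between framework versions). With TKey = UInt64 and TValue = int the fields sum to 20 bytes, which is padded to 24 because of the 8-byte key:

private struct Entry
{
    public int hashCode;   // lower 31 bits of the key's hash code, -1 if unused
    public int next;       // index of the next entry in the same bucket chain, -1 if last
    public TKey key;       // UInt64 here: 8 bytes
    public TValue value;   // int here: 4 bytes
}
// 4 + 4 + 8 + 4 = 20 bytes, padded to 24 by the 8-byte alignment of UInt64.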
When I reproduce this, the error happens on the array creation. Entry is a struct, in this case with size 24 bytes. Taking max Int32 (0x80000000 - 1 = 2147483647) and dividing by 24 gives 89478485, and this number lies between the primes 89478457 and 89478503.
Does this mean that an array of structs cannot be bigger than Int32.MaxValue / sizeof(struct)?
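For illustration, the same limit can be reproduced directly on a plain array of 24-byte structs, without going through Dictionary. A minimal sketch, assuming a 64-bit process with gcAllowVeryLargeObjects not enabled (Entry24 is a stand-in struct, not the real Dictionary Entry):

using System;

struct Entry24            // stand-in for Dictionary's Entry: 8 + 8 + 4 + 4 = 24 bytes
{
    public ulong Key;
    public long Value;
    public int HashCode;
    public int Next;
}

class Repro
{
    static void Main()
    {
        // 89478457 * 24 = 2,147,482,968 bytes: just under the 2 GB object limit, succeeds
        var ok = new Entry24[89478457];
        Console.WriteLine(ok.Length);

        // 89478503 * 24 = 2,147,484,072 bytes: over 2 GB, throws OutOfMemoryException
        var tooBig = new Entry24[89478503];
        Console.WriteLine(tooBig.Length);
    }
}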
EDIT:
Yes. I actually go over 2 GB. This happens when the dictionary creates its internal array of the struct Entry, where the (key, value) pairs are stored. In my case sizeof(Entry) is 24 bytes, and as a value type it is allocated inline.
The solution is to use the gcAllowVeryLargeObjects flag (thank you Evk). In .NET Core the flag is the environment variable COMPlus_gcAllowVeryLargeObjects (thank you svick).
And yes, Paparazzi is right: I have to think about how not to waste memory. Thank you all.
There is a known limitation of the .NET runtime: the maximum object size allowed on the heap is 2 GB, even on the 64-bit version of the runtime. But starting from .NET 4.5 there is a configuration option which allows you to relax this limit (still only on the 64-bit runtime) and create larger arrays. An example configuration to enable that is:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
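With that in app.config (or, on .NET Core, the environment variable COMPlus_gcAllowVeryLargeObjects=1 set before the process starts, as noted in the question's edit), the original constructor call should succeed in a 64-bit process. A rough sketch:

using System;
using System.Collections.Generic;

class LargeDict
{
    static void Main()
    {
        // Requires 64-bit and gcAllowVeryLargeObjects; the internal Entry[] may
        // now exceed 2 GB in size (element counts per dimension are still capped).
        var myIntDict = new Dictionary<ulong, int>(89478458);
        for (ulong i = 0; i < 100; i++)   // small smoke test
            myIntDict[i] = (int)i;
        Console.WriteLine(myIntDict.Count);
    }
}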
On the surface a Dictionary does not make sense here.
You can only have int unique values. Do you really have that many duplicates?
UInt32 goes up to 4,294,967,295, so why are you wasting 4 bytes?

89,478,458 rows. Currently a row is 12 bytes, so you hit 1 GB at about 83,333,333 rows.
Since an object needs contiguous memory, 1 GB is more of a practical limit.
If the value is really a struct of 24 bytes, then 1 GB is about 31,250,000 rows.

That is just a really big collection. You can split it up into more than one collection, or use a class as the value, as then it is just a reference, which I think is 4 bytes.
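If splitting it up is acceptable, one approach (a hypothetical sketch, not something given in the answers) is to shard the keys across several smaller dictionaries so that no single internal Entry[] gets near the 2 GB object limit:

using System;
using System.Collections.Generic;

// Hypothetical sharded map: spreads entries over N inner dictionaries so each
// inner Entry[] stays well below the 2 GB single-object limit.
class ShardedDictionary
{
    private readonly Dictionary<ulong, int>[] shards;

    public ShardedDictionary(int shardCount, int capacityPerShard)
    {
        shards = new Dictionary<ulong, int>[shardCount];
        for (int i = 0; i < shardCount; i++)
            shards[i] = new Dictionary<ulong, int>(capacityPerShard);
    }

    private Dictionary<ulong, int> ShardFor(ulong key)
    {
        return shards[(int)(key % (ulong)shards.Length)];
    }

    public int this[ulong key]
    {
        get { return ShardFor(key)[key]; }
        set { ShardFor(key)[key] = value; }
    }

    public bool TryGetValue(ulong key, out int value)
    {
        return ShardFor(key).TryGetValue(key, out value);
    }
}

// Usage: 4 shards of ~22.4M entries each instead of one 89.5M-entry dictionary.
// var dict = new ShardedDictionary(4, 22369615);
// dict[123456789UL] = 42;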
Source: https://stackoverflow.com/questions/37530383/error-when-dictionary-count-is-bigger-as-89478457