When I use std::vector, I get an "Out of Memory" error. To avoid it, I call max_size() to check first, and only then reserve or push_back. If max_size() is bigger than the value I pass to reserve, shouldn't the allocation succeed?
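Roughly, the code looks like this (a sketch only; the element type and count are placeholders):

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::vector<char> v;
    std::size_t wanted = 2000000000;     // amount I want to store, just an example

    if (wanted <= v.max_size()) {        // max_size() says this should be fine...
        v.reserve(wanted);               // ...but this still fails with "Out of Memory"
    }
    std::printf("capacity after reserve: %zu\n", v.capacity());
}
```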
The problem is that vector tries to allocate a contiguous block of memory, which might not be available at that time, even though the total available memory may be much larger.
I would suggest using std::deque instead, as it does not require a single contiguous block of memory.
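For illustration, a minimal sketch (the element type and count are arbitrary):

```cpp
#include <cstddef>
#include <deque>

int main() {
    // A deque grows in many smaller chunks instead of one huge
    // contiguous buffer, so it can keep growing even when the
    // address space is too fragmented for a single large block.
    std::deque<int> d;
    for (std::size_t i = 0; i < 100000000; ++i) {  // size is only illustrative
        d.push_back(static_cast<int>(i));
    }
}
```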
vector::capacity() gives the maximum number of elements that can be stored in the vector without a re-allocation, one which can potentially fail with std::bad_alloc.
vector::max_size() has a different meaning, roughly similar to (INT_MAX / sizeof(element)).
For more information on Windows memory management, see the MSDN article.
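A small sketch of the difference (the values printed depend on the implementation, and the reserve size is arbitrary):

```cpp
#include <iostream>
#include <new>
#include <vector>

int main() {
    std::vector<int> v;
    std::cout << "max_size(): " << v.max_size() << '\n';  // design limit of the class
    std::cout << "capacity(): " << v.capacity() << '\n';  // elements it can hold without re-allocating

    try {
        v.reserve(1000000000);  // the re-allocation is what can actually fail
    } catch (const std::bad_alloc&) {
        std::cout << "reserve threw std::bad_alloc\n";
    }
    std::cout << "capacity() after reserve: " << v.capacity() << '\n';
}
```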
max_size() returns the maximum number of elements a vector can possibly hold, i.e. the absolute limit once you take into account things like the addressing limits of the integral types it uses and the address space limits of the operating system.
This doesn't mean you can actually make a vector hold that many elements; it just means you can never store more. Also, just because you have 4 gigs of RAM doesn't mean you can actually create a single contiguous buffer that occupies 4 gigs of RAM, or anywhere close. There are other factors to consider, like memory fragmentation: because of it, the largest contiguous block you can actually get might be only one gig, even though far more memory is free in total.
If you really need this many elements in a container, a contiguous sequence is probably not a good choice. For data sets that large, you may need something that can be paged in bits and pieces like std::deque.
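The contrast can be shown directly (a sketch; the element count is only illustrative and the outcome depends on how much memory and fragmentation your process has):

```cpp
#include <cstddef>
#include <deque>
#include <iostream>
#include <new>
#include <vector>

int main() {
    const std::size_t n = 500000000;  // element count is only illustrative

    // One contiguous block: this can throw even when enough total RAM
    // exists, because no single free region of that size is available.
    try {
        std::vector<int> v;
        v.reserve(n);
        std::cout << "vector reserve succeeded\n";
    } catch (const std::bad_alloc&) {
        std::cout << "vector reserve failed: no contiguous block that large\n";
    }

    // Many small chunks: a deque holding the same number of elements
    // only ever needs small free regions, so it may still succeed.
    try {
        std::deque<int> d(n, 0);
        std::cout << "deque of the same size succeeded\n";
    } catch (const std::bad_alloc&) {
        std::cout << "deque also ran out of memory\n";
    }
}
```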
max_size() tells you the design limit of the class, but memory shortage can limit the real size to something smaller. There's not generally any way to find what the lower limit might be though (e.g., it might change from one moment to another, depending on how much memory is used by other programs).
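If you really need a ballpark figure, you can probe it empirically; this is only a sketch, the probe_current_limit name is made up, and the number is valid only at the instant it is measured:

```cpp
#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

// Binary-search for the largest reserve() that succeeds right now.
// The result is only a snapshot: it changes as soon as other code (or
// another program) allocates or frees memory, and on systems that
// overcommit memory a successful reserve() does not even guarantee the
// pages can really be used later.
std::size_t probe_current_limit() {
    std::vector<char> probe;
    std::size_t lo = 0, hi = probe.max_size();
    while (lo < hi) {
        std::size_t mid = lo + (hi - lo + 1) / 2;
        try {
            std::vector<char> t;
            t.reserve(mid);   // throws std::bad_alloc if mid bytes are not available
            lo = mid;         // mid fits at the moment
        } catch (const std::bad_alloc&) {
            hi = mid - 1;     // mid is too large at the moment
        }
    }
    return lo;
}

int main() {
    std::cout << "design limit  (max_size()): " << std::vector<char>().max_size() << '\n';
    std::cout << "current limit (probed):     " << probe_current_limit() << '\n';
}
```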