std::string provides a max_size() member function to determine the maximum number of characters it can contain. However, working out the maximum length you can actually use in practice is less straightforward.
The call to max_size() is delegated to the allocator used by the container. In theory, a very smart allocator could compute its max_size at runtime, e.g. depending on the RAM available.
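As a sketch of what that could look like (the allocator name `ram_aware_allocator` and the sysconf-based heuristic are my own illustration, and `_SC_AVPHYS_PAGES` is a Linux/glibc extension, not standard C++ or POSIX):

```cpp
#include <cstddef>
#include <new>
#include <unistd.h>  // sysconf; _SC_AVPHYS_PAGES is a Linux/glibc extension

// Hypothetical allocator whose max_size() is computed at runtime from
// the physical memory the OS currently reports as available.
template <typename T>
struct ram_aware_allocator {
    using value_type = T;

    ram_aware_allocator() = default;
    template <typename U>
    ram_aware_allocator(const ram_aware_allocator<U>&) {}

    T* allocate(std::size_t n) {
        if (n > max_size()) throw std::bad_alloc{};
        return static_cast<T*>(::operator new(n * sizeof(T)));
    }
    void deallocate(T* p, std::size_t) { ::operator delete(p); }

    // Runtime upper bound: currently available physical pages * page size.
    std::size_t max_size() const {
        long pages = sysconf(_SC_AVPHYS_PAGES);
        long page_size = sysconf(_SC_PAGESIZE);
        if (pages <= 0 || page_size <= 0)
            return std::size_t(-1) / sizeof(T);  // fall back to the usual bound
        return (std::size_t(pages) * std::size_t(page_size)) / sizeof(T);
    }
};

template <typename T, typename U>
bool operator==(const ram_aware_allocator<T>&, const ram_aware_allocator<U>&) { return true; }
template <typename T, typename U>
bool operator!=(const ram_aware_allocator<T>&, const ram_aware_allocator<U>&) { return false; }
```

A `std::basic_string<char, std::char_traits<char>, ram_aware_allocator<char>>` would then report a max_size() derived from available RAM rather than a fixed compile-time constant.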
One reason is that the max_size function isn't very useful, and the committee doesn't think it is worth the trouble to try to fix it. So it is left the way it is, because it is part of the documented interface.
See library defect report #197:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2012/n3440.html#197
max_size() isn't useful for very many things, and the existing wording is sufficiently clear for the few cases that max_size() can be used for. None of the attempts to change the existing wording were an improvement.
This should also work:
enum : std::string::size_type {
    npos = std::string::size_type(-1),
    max_size = npos - 1
};
std::string::max_size() calls std::allocator::max_size() under the hood.
According to the standard, 20.9.6.1.10:
size_type max_size() const noexcept;
Returns: The largest value N for which the call allocate(N,0) might succeed.
(See also: allocator::max_size)
Theoretically, an allocator implementation could work out the maximum size of a chunk of memory it could allocate via a syscall. That would help determine the largest possible size for a string inside a specific process.