Memory consumption by STL containers

醉酒成梦 2020-12-06 06:06

I am working on an application in which I am planning to use a couple of STL containers. The application will take certain steps if memory consumption reaches a threshold. For

2 Answers
  • 2020-12-06 07:06

    A std::vector<element> typically takes 3 machine words in total + sizeof(element) * capacity() of memory. In typical implementations, the overhead consists of three pointers (or, equivalently, a pointer plus a size and a capacity): one to the start of the storage, one to one past the last element, and one to the end of the allocated capacity. The elements themselves are stored in contiguous memory. Because the buffer grows geometrically, capacity() can be up to roughly twice size().
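
    A rough sketch of that estimate in code (assuming the three-pointer layout described above and ignoring any per-allocation overhead added by the heap):

        #include <cstddef>
        #include <vector>

        // Rough estimate only: assumes the typical three-pointer representation.
        template <typename T>
        std::size_t estimated_vector_bytes(const std::vector<T>& v) {
            return sizeof(v)                     // the three machine words of the vector object
                 + v.capacity() * sizeof(T);     // the contiguous heap buffer for the elements
        }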

    A std::map<element, int> typically takes about 2 machine words in total + 3 machine words per element + [ sizeof(element) + sizeof(int) ] * num_elements of memory. The elements are stored as nodes of a balanced binary tree, and in typical implementations the per-node overhead consists of pointers to the node's parent and its two children (most implementations also carry a small color flag).
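
    The same rule of thumb as a sketch (the exact node layout is implementation-defined; three pointers per node is assumed here, matching the description above):

        #include <cstddef>
        #include <map>
        #include <utility>

        // Rough estimate only: real nodes usually also carry a color flag and padding.
        template <typename Key, typename Value>
        std::size_t estimated_map_bytes(const std::map<Key, Value>& m) {
            const std::size_t node_overhead = 3 * sizeof(void*);   // parent, left, right pointers
            return sizeof(m)                                       // the map object itself
                 + m.size() * (node_overhead + sizeof(std::pair<const Key, Value>));
        }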

    With these rules of thumb, all you need to know in order to estimate total memory consumption is the average number of characters per string and the total number of strings.
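
    For example, if the container were a std::vector<std::string> (an assumption here, since the question text is cut off), a back-of-the-envelope total could look like this, assuming every string is long enough to spill out of the small-string buffer:

        #include <cstddef>
        #include <string>

        // Hypothetical helper: num_strings and avg_chars come from your own data.
        std::size_t estimated_total_bytes(std::size_t num_strings, std::size_t avg_chars) {
            const std::size_t vector_overhead = 3 * sizeof(void*);    // the vector object
            const std::size_t per_string = sizeof(std::string)        // the string object
                                         + avg_chars + 1;             // heap buffer + terminator
            return vector_overhead + num_strings * per_string;
        }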

  • 2020-12-06 07:10

    For std::vector and std::string, capacity, rather than size, would be a better approximation. For node based containers (std::set, etc.), you'd want to multiply the number of nodes (roughly the number of elements) by the size of each node. This is only accurate, however, if the allocator doesn't use an optimized pool allocator for the nodes.

    If you really want to know how much memory is being used, however, a better strategy would be to replace the global operator new and operator delete, and keep track of the actual allocations. Even more accurate would be to replace malloc and free. Formally, this is not allowed, but in practice, I've never encountered an implementation where it doesn't work. On the other hand, if you replace malloc and free, you have to implement the actual memory management yourself. If you replace operator new and operator delete, you can use malloc and free, which makes it fairly trivial.
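
    A minimal sketch of the operator new / operator delete approach (only the plain scalar forms are shown; a complete version would also replace the array, sized, and aligned overloads):

        #include <atomic>
        #include <cstddef>
        #include <cstdlib>
        #include <new>

        // Running total of bytes currently handed out through ::operator new.
        static std::atomic<std::size_t> g_allocated{0};

        // Each block is prefixed with a small header storing its size, so that
        // operator delete can subtract it again. The header is one maximal
        // alignment unit wide, which keeps the user pointer suitably aligned.
        constexpr std::size_t header_size = alignof(std::max_align_t);

        void* operator new(std::size_t size) {
            void* raw = std::malloc(size + header_size);
            if (!raw) throw std::bad_alloc();
            *static_cast<std::size_t*>(raw) = size;
            g_allocated.fetch_add(size, std::memory_order_relaxed);
            return static_cast<char*>(raw) + header_size;
        }

        void operator delete(void* p) noexcept {
            if (!p) return;
            void* raw = static_cast<char*>(p) - header_size;
            g_allocated.fetch_sub(*static_cast<std::size_t*>(raw), std::memory_order_relaxed);
            std::free(raw);
        }

        // The current total, to compare against the application's threshold.
        std::size_t bytes_in_use() {
            return g_allocated.load(std::memory_order_relaxed);
        }

    Calling bytes_in_use() then gives the application a single number it can check against its threshold whenever it likes.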

    Note too that each allocation has some fixed overhead. 100000 allocations of 10 bytes each will consume significantly more memory than 10 allocations of 100000 bytes each.
