I have a weird problem with std::vector in C++.
I created a vector, inserted 10,000 integer values into it, and checked the memory utilization: about 600 kB. But after erasing all the elements, the reported memory usage does not go down.
Presumably you checked memory with some system utility. Even if the vector's space is freed on the heap, that does not mean the heap space itself is returned to the OS and reflected in the values shown by the system utility.
Try using the erase-remove idiom. Note that erasing elements only reduces the vector's size(), not its capacity(): the elements are destroyed, but the allocation is kept around for reuse. So the memory is still held by the vector, just "unavailable" through its interface.
http://en.wikipedia.org/wiki/Erase-remove_idiom
Try clearing the vector and then letting the vector itself go out of scope (or deleting it, if it was heap-allocated) — see if that makes any difference. Have you tried running the program under valgrind to find out whether there are any memory leaks?
The C++ vector reserves more memory than it needs for its elements, to speed up adding new elements, and it does not free that reserved memory after the elements have been deleted.
You can make the reserved capacity match the actual number of elements by swapping the vector with a temporary copy of itself: std::vector<int>(v).swap(v). (Note that swapping a vector with itself, v.swap(v), has no effect — the temporary copy is what matters.)
The following code gives the answer:
#include <cstdio>
#include <vector>

struct Foo
{
    ~Foo()
    {
        printf("~Foo()\n");
    }
};

int main()
{
    std::vector<Foo> v;
    v.push_back(Foo());
    // This causes all Foo instances to be released as well as all
    // space allocated by the vector to be released.
    v = std::vector<Foo>();
    // As you can see, all printf("~Foo()\n") calls happen before this
    // line is printed.
    printf("~~~~~~~~~~~~~~\n");
    return 0;
}
The only way to really get rid of unused memory in a std::vector<> before C++11 is to swap it with an empty vector: std::vector<int>().swap(myvec). In C++11 you have the member function shrink_to_fit(), which is often implemented in terms of the same swap idiom. Note that shrink_to_fit() is a non-binding request: the implementation is allowed to ignore it.