I'm making a game and I have a vector of bullets flying around. When a bullet is finished, I do bullets.erase(bullets.begin() + i); and the bullet disappears. However, it doesn't seem to free the memory.
It may not actually release the memory, but the next time you need to add a bullet the vector does not need to allocate more space: it will re-use the storage that the erased bullet occupied.
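Here is a minimal sketch (using int as a stand-in for your bullet type) of what I mean: erase() lowers size() but typically leaves capacity() alone, so the next push_back can reuse the slot without a reallocation:

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> bullets(10000, 42);   // stand-in for the bullet objects

    std::cout << "before erase: size=" << bullets.size()
              << " capacity=" << bullets.capacity() << '\n';

    bullets.erase(bullets.begin() + 5000); // remove one element from the middle

    std::cout << "after erase:  size=" << bullets.size()
              << " capacity=" << bullets.capacity() << '\n';
    // size drops to 9999, but capacity typically stays at (at least) 10000,
    // so the next push_back can reuse that slot without reallocating.
}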
Note: if you are erasing from the middle of a container relatively often, then vector may not be the right container. This is because if you remove element n, then all the elements in [n+1, end) must be moved down one place in memory.
When the bullet is finished, I do bullets.erase(bullets.begin() + i);
Don't do that. If multiple bullets are finished per frame, you get horrible performance, because the bullets that aren't finished will get copied over and over, which really isn't necessary. Here is what I would do:
#include <algorithm>
#include <functional>
#include <vector>

class Bullet
{
    // ...
public:
    bool is_finished() const;
};

int main()
{
    std::vector<Bullet> bullets;
    // ...

    // Erase-remove idiom: remove_if packs the surviving bullets at the
    // front of the vector, then erase trims off the dead tail in one go.
    bullets.erase(
        std::remove_if(
            bullets.begin(),
            bullets.end(),
            std::mem_fun_ref(&Bullet::is_finished)
        ),
        bullets.end()
    );
}
This approach moves each surviving bullet at most once.
That is how vector's memory allocation model normally behaves in order to provide an amortized constant-time push_back operation. Basically, it assumes you may want to fill the erased slots with new elements, so it does not free the memory; by doing this it avoids constant allocations and deallocations. To work around this, you can use the swap trick to free the unused vector memory: copy the vector into a temporary, unnamed vector and swap the two, so that when the temporary (which now owns the original, oversized buffer) goes out of scope, its destructor frees that memory. Something like: vector<int>(c).swap(c)
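Here is a quick sketch of the swap trick in action (plain ints instead of bullets, and assuming a typical implementation that keeps the large buffer after erase):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> c(10000, 1);
    c.erase(c.begin() + 100, c.end());   // keep only 100 elements

    std::cout << "capacity before swap trick: " << c.capacity() << '\n';

    // Copy-construct an unnamed temporary (allocated with roughly size()
    // capacity) and swap buffers with c; the temporary takes ownership of
    // the old, oversized buffer and frees it at the end of the statement.
    std::vector<int>(c).swap(c);

    std::cout << "capacity after swap trick:  " << c.capacity() << '\n';
}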
First, std::vector's erase method is not very efficient: it has to move every item after the deleted one. If the order of the bullets in the vector does not matter, swapping the bullet to be deleted with the last bullet and then removing the last element is faster (constant instead of linear complexity).
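As a rough sketch of that idea (the helper name unordered_erase is just for illustration):

#include <cstddef>
#include <utility>
#include <vector>

// Removes element i without preserving the order of the remaining elements:
// swap the doomed element with the last one, then pop the back.
// This is O(1) instead of the O(n) shift that erase() performs.
template <typename T>
void unordered_erase(std::vector<T>& v, std::size_t i)
{
    std::swap(v[i], v.back());
    v.pop_back();
}

After the call, the element that used to be at the back sits at index i, so if you are looping over the vector and erasing, do not advance the index on that iteration.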
Second, what is the real problem - that after deleting the 10,000 items, memory is not freed up? Are we talking about free memory reported by the operating system, or free space on the heap? It is possible (and very likely) that some other object was allocated right after the vector's data, so that block cannot simply be returned to the operating system; it can, however, be reused for other newly created objects.
I suggest you take a look at these two idioms and pick the one that fits you best (both are sketched below the list):
Shrink to fit
Clear & minimize
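Roughly, the two idioms look like this (the free-function wrappers are only for illustration; since C++11 there is also the member function shrink_to_fit(), which is a non-binding request):

#include <vector>

// "Shrink to fit": keep the elements, drop the excess capacity.
template <typename T>
void shrink_to_fit(std::vector<T>& v)
{
    std::vector<T>(v).swap(v);   // tight copy, then swap buffers
}

// "Clear & minimize": drop the elements and the capacity.
template <typename T>
void clear_and_minimize(std::vector<T>& v)
{
    std::vector<T>().swap(v);    // swap with an empty temporary
}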
The std::vector class automatically manages its internal memory. It will expand to hold as many items as you put into it, but in general it will not shrink on its own as you remove items (although it will of course release the memory when it is destroyed).
The std::vector has two relevant concepts of "size". The first is the "reserved" size, which is how much memory it has allocated from the system for storing vector elements. The second is the "used" size, which is how many elements are logically in the vector. Clearly, the reserved size must be at least as large as the used size. You can discover the used size with the size() method (which I'm sure you already know), and the reserved size with the capacity() method.
Usually, when the used and reserved sizes are equal and you try to insert a new element, the vector allocates a new internal buffer that is a constant factor larger than the previous one (typically twice as big, though the exact growth factor is implementation-defined) and copies all of the existing elements into it. This is transparent to you, except that it invalidates any iterators you are holding. As I noted before, AFAIK most STL implementations will never shrink the reserved size in response to an erasure.
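If you want to see this in action, a small sketch like the following prints the capacity each time a reallocation happens (the exact sequence of capacities is implementation-defined):

#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    std::size_t last_capacity = 0;

    for (int i = 0; i < 1000; ++i)
    {
        v.push_back(i);
        if (v.capacity() != last_capacity)   // a reallocation just happened
        {
            std::cout << "size " << v.size()
                      << " -> capacity " << v.capacity() << '\n';
            last_capacity = v.capacity();
        }
    }
}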
Unfortunately, while you can force a vector to increase its reserved size using the reserve() method, this does not work for decreasing the reserved capacity. As far as I can tell, your best bet for effecting a reduction in capacity is to do the following:
std::vector<Bullet>(myVector).swap(myVector);
What this will do is create a temporary vector which is a copy of the original vector (but with the minimum necessary capacity), and then swap the internal buffers of the two vectors. This will cause your original vector to have the same data but a potentially smaller reserved size.
Now, because creating that temporary copy is a relatively expensive operation (it takes a lot more processor time than ordinary reads, insertions, or deletions), you don't want to do it every time you erase an element. This is also the reason the vector doubles its reserved size rather than growing it by one when you exceed the existing capacity. What I would recommend, therefore, is: after you have erased a relatively large number of elements, and you know you will not be adding that many back any time soon, perform the swap 'trick' above to reduce the capacity.
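One way to package that advice is a small helper you call after a big cleanup (once per level load, say) rather than every frame; maybe_shrink and the 2x threshold here are made up for illustration:

#include <vector>

// Hypothetical heuristic: only pay for the copy-and-swap when a large
// fraction of the buffer is unused. The 2x threshold is arbitrary.
template <typename T>
void maybe_shrink(std::vector<T>& v)
{
    if (v.capacity() > 2 * v.size())
        std::vector<T>(v).swap(v);
}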
Finally, you may also want to consider using something other than a std::vector for this. Erasing elements from the middle of a vector, which it seems you are doing frequently, is a slow operation compared to many other data structures (the vector has to copy all of the subsequent elements back one slot to fill the hole). Which data structure is best for your purposes depends on what else you are doing with the data.