I am aware that the size of a pointer is fixed (not the size of the data it points to). Now given that, supposing I have a vector of data in global scope and I declare a pointer
The memory consumption of the vector is close to what you guessed. You're right that the size of the vector will be independent of the layout of data.
As for actually calculating the total size of the vector, there are two parts that contribute:
1. The actual vector object itself. A vector has some internal structure; for example, it holds a pointer to its internal array, and all of its fields are aligned to certain boundaries. The total size of the vector object (including that alignment) can be found with sizeof(vector<data*>). On my machine this gives 24.
2. The internal array. Its size is not based on the size() of the vector, but on its capacity(). The total is my_data->capacity() * sizeof(data*). There is no padding between array elements, since sizeof(data*) already includes any alignment padding, so nothing extra needs to be accounted for here.
The total memory consumption is then just the sum of #1 and #2.
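Putting those two parts together, a minimal sketch (assuming a placeholder data type, a global std::vector<data*>, and a pointer my_data to it, roughly matching the question's setup) might look like this:

#include <cstddef>
#include <iostream>
#include <vector>

struct data { int x; };                  // placeholder for the question's type
std::vector<data*> storage;              // the global vector of pointers
std::vector<data*>* my_data = &storage;  // the pointer the question declares

int main() {
    // Part #1: the vector object itself (its internal pointers, with alignment).
    std::size_t object_size = sizeof(std::vector<data*>);

    // Part #2: the dynamically allocated array of element pointers.
    std::size_t array_size = my_data->capacity() * sizeof(data*);

    std::cout << "vector object:  " << object_size << " bytes\n"
              << "internal array: " << array_size << " bytes\n"
              << "total:          " << (object_size + array_size) << " bytes\n";
}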
If I understand your question right, you want to know how much memory your vector consumes. That will be the fixed size of a pointer (4 or 8 bytes, depending on whether you build for 32-bit or 64-bit) times the number of pointers stored in it, plus a few extra bytes for the vector's own bookkeeping (size, capacity, and so on). So in most cases you won't even notice the memory it uses.
However, you should definitely not do it the way your code example does, because a std::vector may change the location where it stores your data. For example, if you push_back new objects, it will eventually have to allocate a new block and copy the data into it, since it guarantees that its storage is contiguous. It then frees the memory it used before. Your old pointers will then point into memory your program no longer uses, causing segmentation faults if you try to dereference them.
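To make the danger concrete, here is a small self-contained sketch (purely illustrative, not the question's actual code) showing how a pointer into a vector is invalidated once push_back triggers a reallocation:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> values;
    values.push_back(1);

    int* first = &values[0];   // pointer into the vector's internal array

    // Force the vector to grow; once size exceeds capacity it reallocates,
    // moves its elements to the new block, and frees the old one.
    for (int i = 0; i < 1000; ++i)
        values.push_back(i);

    std::cout << "old address: " << static_cast<void*>(first) << "\n"
              << "new address: " << static_cast<void*>(&values[0]) << "\n";
    // The two addresses usually differ after a reallocation; 'first' now
    // dangles, and dereferencing it would be undefined behavior.
}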
Due to "virtual memory" it is not so simple to say how much "RAM" will be used, but we can talk about how much virtual memory will be consumed. And the answer is roughly as you expected, plus a bit more:
If you wanted to express that in C++, you might do something like this:
sizeof(vector<data*>) + my_data->capacity() * sizeof(data*);
Note that this just gives you a rough estimate, and it ignores more complex pieces like whether additional "RAM" is needed to actually map the memory your application is using, the behavior of the standard allocator on your system, and so on. But as far as C++ and virtual memory are concerned, I think it's a reasonable approximation.
Re: I am hoping that the additional memory consumption would be just the size of vector times the fixed size of a pointer (say, 8 bytes) regardless of the complexity of the data; since the data it points to exists in global scope (i.e., no new data is being allocated; hope I've explained my thoughts clearly).
You are basically right. For any reasonable implementation of std::vector, the underlying storage for a std::vector<T> is just a compact array of T. In this case, your T is the pointer type data * and not data.
There may be some extra storage reserved for efficient expansion, otherwise every push_back operation would have to grow the array. (Take a look at the reserve and capacity functions of std::vector.)
And of course there is some small overhead for allocating the vector object itself.
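A short sketch (using a placeholder data type, since the question's real type is not shown) of how size() and capacity() diverge as the vector grows:

#include <iostream>
#include <vector>

struct data { int x; };   // placeholder; the real type lives elsewhere

int main() {
    std::vector<data*> v;
    std::cout << "size " << v.size() << ", capacity " << v.capacity() << "\n";

    v.reserve(100);   // ask for room for 100 pointers up front
    std::cout << "size " << v.size() << ", capacity " << v.capacity() << "\n";

    for (int i = 0; i < 150; ++i)
        v.push_back(nullptr);
    // capacity() is now at least 150 and typically larger, because the
    // vector grows geometrically to keep push_back amortized O(1).
    std::cout << "size " << v.size() << ", capacity " << v.capacity() << "\n";
}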
my_data will occupy sizeof(std::vector<data*>*) bytes, and no more than sizeof(void*).
*my_data will occupy sizeof(std::vector<data*>), which can be as little as 3 * sizeof(data**), i.e. no more than 3 * sizeof(void*).
*my_data will manage sizeof(data*) * my_data->capacity() bytes of dynamic memory.
I have grave doubts that that information will be useful in practice.
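If you want to see those three numbers on your own implementation, a minimal sketch (with placeholder definitions for data and my_data, since the question's code is not shown) would be:

#include <iostream>
#include <vector>

struct data { int x; };                  // placeholder for the question's type
std::vector<data*> storage;              // placeholder global vector
std::vector<data*>* my_data = &storage;  // the pointer the question declares

int main() {
    std::cout << "sizeof(my_data)  = " << sizeof(my_data) << "\n"   // the pointer itself
              << "sizeof(*my_data) = " << sizeof(*my_data) << "\n"  // the vector object
              << "managed memory   = " << sizeof(data*) * my_data->capacity() << "\n";  // the internal array
}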