I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?
I can't think of any case where dynamically allocating a C-style array makes sense. (I've been working in C++ for over 25 years, and I've yet to use new[].) Usually, if I know the size up front, I'll use something like:

std::vector<int> data( n );

to get an already sized vector, rather than using push_back.

Of course, if n is very small and is known at compile time, I'll use std::array (if I have access to C++11), or even a C-style array, and just create the object on the stack, with no dynamic allocation. (Such cases seem to be rare in the code I work on; small fixed-size arrays tend to be members of classes, which is where I do occasionally use a C-style array.)
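As a minimal sketch of the two cases just described (the function name and the size 4 are illustrative assumptions, not from the answer):

#include <array>
#include <cstddef>
#include <vector>

void example(std::size_t n)
{
    // Size known only at runtime: a single allocation, already sized.
    std::vector<int> data(n);
    data[0] = 42;                           // no push_back needed

    // Size small and known at compile time: lives on the stack,
    // no dynamic allocation at all.
    std::array<int, 4> small = { 1, 2, 3, 4 };
    small[0] = 42;
}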
It is always better to use std::vector/std::array, at least until you can conclusively prove (through profiling) that the T* a = new T[100]; solution is considerably faster in your specific situation. This is unlikely to happen: vector/array is an extremely thin layer around a plain old array. There is some overhead to bounds checking with vector::at, but you can circumvent that by using operator[].
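To illustrate the difference (a small sketch; the out-of-range index is deliberate, for demonstration only):

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<int> v(100);
    v[5] = 1;          // operator[]: no bounds check, fastest access
    v.at(5) = 2;       // at(): bounds-checked access

    try {
        v.at(100) = 3; // index out of range: throws std::out_of_range
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << '\n';
    }
    // v[100] = 3;     // same bad index with operator[]: undefined behavior
}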
If n is known at compile time, then you should choose std::array, as:

std::array<int, n> data; //n is a compile-time constant

and if n is not known at compile time, OR the array might grow at runtime, then go for std::vector:

std::vector<int> data(n); //n may be known only at runtime

Or in some cases you may also prefer std::deque, which is faster than std::vector in some scenarios (a small sketch follows the links). See these:
C++ benchmark – std::vector VS std::list VS std::deque
Using Vector and Deque by Herb Sutter
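One typical case where std::deque wins is frequent insertion at the front, which is constant time for a deque but linear time for a vector. A minimal sketch:

#include <deque>

int main()
{
    std::deque<int> d;
    d.push_back(1);   // O(1) at the back, just like vector
    d.push_front(0);  // O(1) at the front; vector has no push_front,
                      // it would need insert(begin(), ...) at O(n) cost
}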
Hope that helps.
If you know the size in advance (especially at compile time), and don't need the dynamic re-sizing abilities of std::vector, then using something simpler is fine.

However, that something should preferably be std::array if you have C++11, or something like boost::scoped_array otherwise.
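For the pre-C++11 case, a minimal sketch of boost::scoped_array (the function name and size parameter are illustrative assumptions):

#include <boost/scoped_array.hpp>
#include <cstddef>

void example(std::size_t n)
{
    boost::scoped_array<int> data(new int[n]); // takes ownership of the array
    data[0] = 42;                              // indexed like a raw array
}   // delete[] is called automatically when data goes out of scope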
I doubt there'll be much efficiency gain unless it significantly reduces code size or something, but it's more expressive, which is worthwhile anyway.
You should try to avoid C-style arrays in C++ whenever possible. The STL provides containers which usually suffice for every need. Just imagine reallocation for an array, or deleting elements out of its middle. The container shields you from handling this, while you would have to take care of it yourself, and if you haven't done this a hundred times it is quite error-prone.
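For example, removing elements from the middle is a one-liner with std::vector, while a raw new[] array would force you to shift the tail and manage the memory by hand. A small sketch:

#include <algorithm>
#include <vector>

int main()
{
    std::vector<int> v = { 1, 2, 3, 4, 5 };
    v.erase(v.begin() + 2);  // removes the 3; the vector shifts the tail for us

    // Removing all even values: the erase-remove idiom.
    v.erase(std::remove_if(v.begin(), v.end(),
                           [](int x) { return x % 2 == 0; }),
            v.end());
}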
An exception is, of course, if you are addressing low-level issues which might not be able to cope with STL containers.

There has already been some discussion about this topic. See here on SO.
std::vector can be constructed with a size_type parameter that instantiates the vector with the specified number of elements and does a single dynamic allocation (same as your array), and you can also use reserve to decrease the number of re-allocations over the usage time.
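A small sketch of both techniques (the size 100 is just an illustrative assumption):

#include <vector>

int main()
{
    std::vector<int> a(100);   // one allocation, 100 value-initialized ints

    std::vector<int> b;
    b.reserve(100);            // one allocation, size stays 0
    for (int i = 0; i < 100; ++i)
        b.push_back(i);        // no re-allocations up to the reserved capacity
}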