Question
I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?
To give a crude example, suppose I had to store an array of n integers, where n <= 16. I could implement it using
int* data = new int[n]; //assuming n is set beforehand
or using a vector:
std::vector<int> data;
Is it absolutely always a better idea to use a std::vector, or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?
Answer 1:
It is always better to use std::vector/std::array, at least until you can conclusively prove (through profiling) that the T* a = new T[100]; solution is considerably faster in your specific situation. This is unlikely to happen: vector/array is an extremely thin layer around a plain old array. There is some overhead to bounds checking with vector::at, but you can circumvent that by using operator[].
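As a rough sketch of what that comparison looks like (the size 100 is taken from the answer's example; the rest is illustrative):

#include <vector>

void example() {
    // Raw version: you own the memory and must remember to delete[] it.
    int* a = new int[100];
    a[0] = 1;
    delete[] a;

    // Vector equivalent: one allocation, automatic cleanup on scope exit.
    std::vector<int> v(100);
    v[0] = 1;       // unchecked access, same cost as the raw array
    v.at(0) = 1;    // checked access, throws std::out_of_range if out of bounds
}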
Answer 2:
I can't think of any case where dynamically allocating a C-style vector makes sense. (I've been working in C++ for over 25 years, and I've yet to use new[].) Usually, if I know the size up front, I'll use something like:
std::vector<int> data( n );
to get an already sized vector, rather than using push_back.
Of course, if n is very small and is known at compile time, I'll use std::array (if I have access to C++11), or even a C-style array, and just create the object on the stack, with no dynamic allocation. (Such cases seem to be rare in the code I work on; small fixed-size arrays tend to be members of classes, which is where I do occasionally use a C-style array.)
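A minimal sketch of that stack-allocated case, using the n <= 16 bound from the question:

#include <array>

void example() {
    std::array<int, 16> data{};  // fixed-size array on the stack, no dynamic allocation
    data[0] = 42;                // works like a plain array, but size() and iterators are available
}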
Answer 3:
If you know the size in advance (especially at compile time), and don't need the dynamic re-sizing abilities of std::vector, then using something simpler is fine.
However, that something should preferably be std::array if you have C++11, or something like boost::scoped_array otherwise.
I doubt there'll be much efficiency gain unless it significantly reduces code size or something, but it's more expressive, which is worthwhile anyway.
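For reference, a minimal sketch of the boost::scoped_array alternative (assuming Boost is available and n is only known at runtime, with n >= 1):

#include <boost/scoped_array.hpp>
#include <cstddef>

void example(std::size_t n) {
    boost::scoped_array<int> data(new int[n]); // takes ownership of the allocation
    data[0] = 42;                              // indexed access like a raw pointer
}                                              // delete[] happens automatically here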
Answer 4:
You should try to avoid C-style arrays in C++ whenever possible. The STL provides containers which usually suffice for every need. Just imagine reallocation for an array, or deleting elements out of its middle. The container shields you from handling this, while you would have to take care of it yourself, and if you haven't done this a hundred times it is quite error-prone.
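To make the middle-deletion point concrete, here is a small sketch (indices and values are arbitrary and assume at least three elements) of what the container does for you versus the manual bookkeeping a raw array requires:

#include <vector>
#include <algorithm>
#include <cstddef>

void container_way(std::vector<int>& v) {
    v.erase(v.begin() + 2);          // shifts the tail left and updates the size for you
}

void manual_way(int* a, std::size_t& n) {
    std::copy(a + 3, a + n, a + 2);  // shift everything after index 2 left by one
    --n;                             // and you must track the new logical size yourself
}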
An exception is, of course, if you are addressing low-level issues which might not be able to cope with STL containers.
There has already been some discussion about this topic. See here on SO.
Answer 5:
Is it absolutely always a better idea to use a std::vector or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?
Call me a simpleton, but 99.9999...% of the time I would just use a standard container. The default choice should be std::vector, but std::deque<> could also be a reasonable option sometimes. If the size is known at compile time, opt for std::array<>, which is a lightweight, safe wrapper around C-style arrays that introduces zero overhead.
Standard containers expose member functions to specify the initial reserved amount of memory, so you won't have trouble with reallocations, and you won't have to remember to delete[] your array. I honestly do not see why one should use manual memory management.
Efficiency shouldn't be an issue, since you have throwing and non-throwing member functions to access the contained elements, so you have a choice whether to favor safety or performance.
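For instance, a minimal sketch of reserving memory up front, where n is whatever bound you know in advance:

#include <vector>
#include <cstddef>

void fill(std::size_t n) {
    std::vector<int> data;
    data.reserve(n);                          // single allocation up front, size stays 0
    for (std::size_t i = 0; i < n; ++i)
        data.push_back(static_cast<int>(i));  // no reallocation while filling
}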
Answer 6:
std::vector can be constructed with a size_type parameter that instantiates the vector with the specified number of elements and performs a single dynamic allocation (same as your array), and you can also use reserve to decrease the number of reallocations over the usage time.
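A short sketch of that sized constructor, assuming n is known at runtime and n >= 1:

#include <vector>
#include <cstddef>

void example(std::size_t n) {
    std::vector<int> data(n);  // n zero-initialized ints in a single dynamic allocation
    data[0] = 42;              // the elements already exist, so indexing is valid
}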
Answer 7:
If n is known at compile time, then you should choose std::array as:
std::array<int, n> data; //n is compile-time constant
and if n is not known at compile time, OR the array might grow at runtime, then go for std::vector:
std::vector<int> data(n); //n may be known at runtime
Or in some cases, you may also prefer std::deque, which is faster than std::vector in some scenarios (a short deque sketch follows the links below). See these:
C++ benchmark – std::vector VS std::list VS std::deque
Using Vector and Deque by Herb Sutter
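For completeness, a minimal sketch of the std::deque option (values are arbitrary):

#include <deque>

void example() {
    std::deque<int> d;
    d.push_back(1);   // cheap growth at the back, like vector
    d.push_front(0);  // cheap growth at the front too, which vector lacks
    int x = d[1];     // random access is still supported
    (void)x;
}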
Hope that helps.
Source: https://stackoverflow.com/questions/15294129/overhead-to-using-stdvector