I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?

To give a crude example…
I can't think of any case where dynamically allocating a C style array makes sense. (I've been working in C++ for over 25 years, and I've yet to use new[].) Usually, if I know the size up front, I'll use something like:

    std::vector<int> data( n );

to get an already sized vector, rather than using push_back.
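
A minimal sketch of the difference, assuming int elements and a runtime size n (the function name makeData is just for illustration):

    #include <vector>
    #include <cstddef>

    std::vector<int> makeData( std::size_t n )
    {
        std::vector<int> data( n );             //  sized once; elements value-initialized
        for ( std::size_t i = 0; i != n; ++ i ) {
            data[ i ] = static_cast<int>( i );  //  assign in place; no reallocation
        }
        return data;
    }

The push_back alternative would default-construct the vector, call reserve( n ), and then push_back each element; both work, but the sized constructor states the intent up front.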
Of course, if n is very small and is known at compile time, I'll use std::array (if I have access to C++11), or even a C style array, and just create the object on the stack, with no dynamic allocation. (Such cases seem to be rare in the code I work on; small fixed size arrays tend to be members of classes, which is where I do occasionally use a C style array.)
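
To make that last point concrete, here is a sketch of what such a member might look like (the class Filter and its contents are hypothetical, just to show the pattern):

    #include <array>

    class Filter
    {
        std::array<double, 4> coeffs_;   //  small fixed size array as a class member
    public:
        Filter() : coeffs_{ { 0.25, 0.25, 0.25, 0.25 } } {}
        double apply( double x ) const
        {
            double sum = 0.0;
            for ( double c : coeffs_ ) {
                sum += c * x;            //  trivial use, just to show element access
            }
            return sum;
        }
    };

The array lives inside the object itself; whether the Filter sits on the stack or is itself a member of something else, there's still no dynamic allocation for the array.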