Consider a simple program:
int main() {
    int* ptr = nullptr;
    delete ptr;
}
With GCC (7.2), a call instruction to operator delete is emitted regardless, i.e. the compiler does not remove it even though the pointer is known to be null.
I think the compiler has no knowledge about "delete", in particular that "delete nullptr" is a NOOP. You may write the check explicitly, so the compiler does not need to apply any knowledge about delete.
WARNING: I do not recommend this as a general implementation. The following example is only meant to show how you could "convince" a limited compiler to remove the code anyway, in that very special and limited program:
int main() {
    int* ptr = nullptr;
    if (ptr != nullptr) {
        delete ptr;
    }
}
If I remember right, there is a way to replace "delete" with your own function, and in that case an optimization by the compiler could go wrong.
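As an illustration, here is a minimal sketch of such a replacement; the logging body and the main() below are my own assumptions, not taken from the question:

#include <cstdio>
#include <cstdlib>

// Hypothetical replacement of the global operator delete.
// If user code can observe calls like this, the compiler cannot
// treat every "delete ptr" as freely removable.
void operator delete(void* p) noexcept {
    std::printf("operator delete(%p)\n", p);
    std::free(p);
}

int main() {
    int* ptr = nullptr;
    delete ptr; // whether the replaced function is called for a null
                // pointer is unspecified by the standard
}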
@RichardHodges: Why should it be a de-optimization when one gives the compiler a hint to remove a call?
delete nullptr is in general a NOOP (no operation). However, since it is possible to replace or override delete, there is no guarantee for all cases. So it is up to the compiler to know, and to decide, whether to use the knowledge that delete nullptr could always be removed. There are good arguments for both choices.
However, the compiler is always allowed to remove dead code, e.g. "if (false) {...}" or "if (nullptr != nullptr) {...}".
So a compiler will remove the dead code, and with the explicit check the program then looks like:

int main() {
    int* ptr = nullptr;
    // dead code, removed:
    // if (ptr != nullptr) {
    //     delete ptr;
    // }
}
Please tell me: where is the de-optimization here? I call my proposal a defensive style of coding, not a de-optimization.
If someone argues that a non-null pointer will now be checked against nullptr twice, I refer to my reply below.

@Peter Cordes: I agree that guarding with an if is not a general optimization rule. However, general optimization was NOT the question of the opener. The question was why some compilers do not eliminate the delete in a very short, nonsensical program. I showed a way to make the compiler eliminate it anyway.
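To make the "checked twice" concern concrete, here is a minimal sketch (my own illustration, the names are hypothetical) of what the guarded pattern amounts to:

// For a non-null pointer, the hand-written guard tests ptr once, and
// because the delete-expression must itself tolerate nullptr, its
// generated code may contain a second test.
void release(int*& ptr) {
    if (ptr != nullptr) { // first, explicit check
        delete ptr;       // a second null check can be hidden in here
    }
    ptr = nullptr;        // defensive: avoid a dangling pointer
}

int main() {
    int* p = new int(42);
    release(p);
}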
If a situation like the one in that short program happens, probably something else is wrong. In general I would try to avoid new/delete (malloc/free), as the calls are rather expensive. If possible I prefer to use the stack (automatic storage).
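As a small illustration of that preference (my own example, not from the question):

#include <iostream>

int main() {
    // Heap: one call to operator new and one to operator delete,
    // both comparatively expensive and easy to get wrong.
    int* heap_value = new int(42);
    std::cout << *heap_value << '\n';
    delete heap_value;

    // Stack (automatic storage): no allocation calls at all; the
    // value simply disappears at the end of its scope.
    int stack_value = 42;
    std::cout << stack_value << '\n';
}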
Looking at the real case that has been documented in the meantime (https://godbolt.org/g/7zGUvo), I would say class X is designed wrong, causing poor performance and excessive memory use.
Instead of

class X {
    int* i_;
public:
    ...

I would design

class X {
    int i;
    bool valid;
public:
    ...
Or, even earlier, I would question the sense of sorting empty/invalid items at all. In the end I would like to get rid of "valid", too.