I was reading this paper on undefined behaviour and one of the example "optimisations" looks highly dubious:
```c
if (arg2 == 0) ereport(ERROR,
```
Most functions are assumed to eventually return. Some compilers provide extensions to inform the compiler that a function will never return; `__attribute__((noreturn))` does this for gcc.
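For illustration, a minimal sketch of how that annotation is used (the function names here are invented for the example):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical fatal-error handler: the attribute tells gcc that
 * this function never returns, so any code after a call to it is
 * unreachable as far as the optimizer is concerned. */
__attribute__ ((noreturn))
static void fatal_error(const char *msg)
{
    fprintf(stderr, "fatal: %s\n", msg);
    exit(EXIT_FAILURE);          /* terminates; control never returns */
}

int safe_div(int a, int b)
{
    if (b == 0)
        fatal_error("division by zero");
    return a / b;                /* gcc knows b != 0 if we get here */
}
```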
Is the compiler free to assume that a function will always return?
It is not legal in C or C++ for a compiler to optimize on that basis, unless it somehow specifically knows that `ereport` returns (for example by inlining it and inspecting the code).

`ereport` depends on at least one `#define` and on the values passed in, so I can't be sure, but it certainly looks to be designed to conditionally not return (and it calls an extern function `errstart` that, as far as the compiler knows, may or may not return). So if the compiler really is assuming that it always returns, then either the compiler is wrong, or the implementation of `ereport` is wrong, or I've completely misunderstood it.
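To make "conditionally not return" concrete, here is a rough sketch of the shape such a macro could take -- an illustrative approximation, not PostgreSQL's actual definition:

```c
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

#define MY_ERROR 20   /* hypothetical severity level */

/* Hypothetical stand-ins for PostgreSQL's errstart/errfinish. */
static bool my_errstart(int level)
{
    return level > 0;            /* decide at run time whether to report */
}

static void my_errfinish(int level)
{
    fprintf(stderr, "error reported at level %d\n", level);
    if (level >= MY_ERROR)
        exit(EXIT_FAILURE);      /* for ERROR and above: never returns */
}

/* Sketch of a conditionally non-returning macro: whether control
 * comes back depends on the run-time severity, which the compiler
 * cannot know just from looking at the call site. */
#define my_ereport(level) \
    do { \
        if (my_errstart(level)) \
            my_errfinish(level); \
    } while (0)

int main(void)
{
    my_ereport(MY_ERROR);        /* at this level, never returns */
    return 0;                    /* unreachable */
}
```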
The paper says:

> However, the programmer failed to inform the compiler that the call to `ereport(ERROR, ...)` does not return.
I don't believe that the programmer has any such obligation, unless perhaps some non-standard extension is in effect when compiling this particular code, one that enables an optimization documented to break valid code under certain conditions.
Unfortunately it is rather difficult to prove the code transformation incorrect by citing the standard, since I can't quote anything to show that there isn't, tucked away somewhere in pages 700-900, a little clause that says "oh, by the way, all functions must return". I haven't actually read every line of the standard, but such a clause would be absurd: functions need to be allowed to call `abort()` or `exit()` or `longjmp()`, and in C++ they can also throw exceptions. And they need to be allowed to do this conditionally -- the attribute `noreturn` means that the function never returns, not that it might not return, and its absence proves nothing about whether the function returns or not. My experience of both standards is that they aren't (that) absurd.
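For a concrete illustration of a function that conditionally does not return, here is a small sketch using `longjmp()` (all names invented for the example):

```c
#include <setjmp.h>
#include <stdio.h>

static jmp_buf recovery_point;   /* hypothetical error-recovery context */

/* Returns normally for low severities, but transfers control back
 * to the setjmp() site for severe ones -- so whether it "returns"
 * is decided at run time. */
static void report(int severity, const char *msg)
{
    fprintf(stderr, "%s\n", msg);
    if (severity >= 2)
        longjmp(recovery_point, 1);   /* does not return */
}

int main(void)
{
    if (setjmp(recovery_point) != 0) {
        puts("recovered from severe error");
        return 1;
    }
    report(1, "warning: carrying on");   /* returns */
    report(2, "error: bailing out");     /* does not return */
    puts("never reached");
    return 0;
}
```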
Optimizations are not allowed to break valid programs; they're constrained by the "as-if" rule that observable behaviour is preserved. If `ereport` doesn't return, then the "optimization" changes the observable behaviour of the program (from doing whatever `ereport` does instead of returning, to having undefined behaviour due to the division by zero). Hence it is forbidden.
There's more information on this particular issue here: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=616180

It mentions a GCC bug report, http://gcc.gnu.org/bugzilla/show_bug.cgi?id=29968, that was (rightly IMO) rejected; but if `ereport` doesn't return, then the PostgreSQL issue is not the same as that rejected GCC bug report.
The Debian bug description includes the following:

> The gcc guys are full of it. The issue that is relevant here is the C standard's definition of sequence points, and in particular the requirement that visible side effects of a later statement cannot happen before the execution of an earlier function call. The last time I pestered them about this, I got some lame claim that a SIGFPE wasn't a side effect within the definitions of the spec. At that point useful discussion stopped, because it's impossible to negotiate with someone who's willing to claim that.
In point of fact, if a later statement has UB, then the standard explicitly states that the whole program has UB. Ben has the citation in his answer. It is not the case (as this person seems to think) that all visible side effects must occur up to the last sequence point before the UB. UB permits inventing a time machine (and, more prosaically, it permits out-of-order execution that assumes everything executed has defined behaviour). The gcc guys are not full of it if that's all they say.
A SIGFPE would be a visible side effect if the compiler chooses to guarantee and document (as an extension to the standard) that it occurs, but if it's just the result of UB then it is not. Compare for example the `-fwrapv` option to GCC, which changes integer overflow from UB (what the standard says) to wrap-around (which the compiler guarantees, only if you specify the option). On MIPS, gcc has an option `-mcheck-zero-division`, which looks like it does define behaviour on division by zero, but I've never used it.
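To make the difference concrete, here is a small example whose meaning `-fwrapv` changes; compile with `gcc -O2` versus `gcc -O2 -fwrapv` (the observed output may vary by compiler version):

```c
#include <limits.h>
#include <stdio.h>

/* Without -fwrapv, signed overflow is UB, so gcc may assume
 * x + 1 > x always holds and fold this function to "return 1".
 * With -fwrapv, INT_MAX + 1 wraps to INT_MIN and the comparison
 * is evaluated for real, yielding 0 for x == INT_MAX. */
int increases(int x)
{
    return x + 1 > x;
}

int main(void)
{
    printf("%d\n", increases(INT_MAX));
    return 0;
}
```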
It's possible that the authors of the paper noticed the wrongness of that complaint against GCC, and that the thought of a PostgreSQL author being wrong in this way influenced them when they put the snigger quotes into:

> We found seven similar issues in PostgreSQL, which were noted as “GCC bugs” in source code comments

But a function not returning is very different from a function returning after some side effects. If it doesn't return, the statement that would have UB is never executed within the definition of the C (or C++) abstract machine in the standard. Unreached statements aren't executed: I hope this isn't contentious. So if the "gcc guys" were to claim that UB from unreached statements renders the whole program undefined, then they'd be full of it. I don't know that they have claimed that, and at the end of the Debian report there's a suggestion that the issue might have gone away by GCC 4.4. If so, then perhaps PostgreSQL had indeed encountered an eventually-acknowledged bug, not (as the author of the paper you link to thinks) a valid optimization, or (as the person who says the gcc guys are full of it thinks) a misinterpretation of the standard by GCC's authors.
In embedded systems, functions that never return are commonplace, and the compiler must not "optimize" them on the assumption that they return.

For example, a common design is to have a forever loop in `main()` (a.k.a. the background loop), while all real functionality takes place in ISRs (Interrupt Service Routines); a minimal sketch of this pattern follows below.

Another example is RTOS tasks. In our embedded system project, we have tasks that sit in an infinite loop: pend on a message queue, process the message, repeat. They will do this for the life of the project.

Some embedded systems have safe-shutdown loops, where they place the machine into a safe state, lock out all user input, and wait for power-off or reset. And some embedded systems can shut the system down entirely; cutting the power certainly prevents the function from returning.

So there are good reasons why not every function needs to return, or can be required to return. If every function in your cell phone had to return, you couldn't use the phone.
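Here is that background-loop sketch (the ISR name, the flag, and the hardware wiring are all hypothetical; real vector-table and peripheral setup are omitted):

```c
#include <stdbool.h>

/* Hypothetical flag set by an interrupt service routine. */
static volatile bool tick_pending = false;

/* Hypothetical ISR, wired to a hardware timer by startup code
 * that is not shown here. */
void timer_isr(void)
{
    tick_pending = true;
}

int main(void)
{
    /* The background loop: main() never returns, by design. */
    for (;;) {
        if (tick_pending) {
            tick_pending = false;
            /* ... do periodic work here ... */
        }
    }
}
```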
It seems to me that unless the compiler can prove that `ereport()` doesn't call `exit()` or `abort()` or some other mechanism for program termination, then this optimization is invalid. The language standard mentions several mechanisms for termination, and even defines 'normal' program termination via returning from `main()` in terms of the `exit()` function.

Not to mention that program termination isn't even necessary to avoid reaching the division: `for (;;) {}` is perfectly valid C.
No. In the newest C standard, C11, there is even a new keyword to specify that a function does not return: `_Noreturn`.
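A minimal C11 sketch (the function name is invented for the example):

```c
#include <stdio.h>
#include <stdlib.h>

/* C11: _Noreturn tells the compiler this function never returns --
 * the standard equivalent of gcc's __attribute__((noreturn)). */
_Noreturn void die(const char *msg)
{
    fprintf(stderr, "%s\n", msg);
    exit(EXIT_FAILURE);
}
```

C11 also provides the header `<stdnoreturn.h>`, which defines a `noreturn` macro expanding to `_Noreturn`.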
The paper does not say that the `if (arg2 == 0)` check is removed. It says that the division is moved before the check. Quoting the paper:

> ... GCC moves the division before the zero check `arg2 == 0`, causing division by zero.

The result is the same, but the reasoning is different.

If the compiler believes `ereport` will return, then it "knows" that the division will be performed in all cases. Furthermore, the if-statement doesn't affect the arguments of the division, and obviously the division doesn't affect the if-statement. And while the call to `ereport` might have observable side effects, the division does not (if we ignore any divide-by-zero exception).

Therefore, the compiler believes the as-if rule gives it the freedom to reorder these statements with respect to each other -- it can move the division before the test, because the observable behavior should be identical (for all of the cases that yield defined behavior).
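In other words, the transformation effectively applied looks something like this (a hand-written sketch of the reordering, not gcc's actual output; `do_report` stands in for the ereport machinery):

```c
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for ereport(ERROR, ...); pretend the compiler cannot
 * see that it never returns. */
static void do_report(void)
{
    fprintf(stderr, "division by zero\n");
    exit(EXIT_FAILURE);
}

/* What the programmer wrote: */
int divide_original(int arg1, int arg2)
{
    if (arg2 == 0)
        do_report();             /* in fact never returns */
    return arg1 / arg2;
}

/* What the compiler effectively produces: believing do_report()
 * returns, it "knows" the division happens on every path, so it
 * hoists the division above the check -- and the divide-by-zero
 * trap fires before do_report() is ever called. */
int divide_transformed(int arg1, int arg2)
{
    int result = arg1 / arg2;    /* traps when arg2 == 0 */
    if (arg2 == 0)
        do_report();
    return result;
}
```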
One way to look at it is that undefined behavior includes time travel. ;-)
I'd argue that undefined behavior (e.g., dividing by 0) should be considered observable behavior. That would prevent this reordering, because the observable behavior of the division must happen after the observable behavior of the call to `ereport`. But I don't write standards or compilers.