Question
When I want to debug C or C++ programs, I've been taught to use -O0 to turn optimization off, and -ggdb to insert debugging symbols into the executable that are tailored to the GNU gdb debugger, which I use (or you can use -glldb for LLVM/clang's lldb debugger, or just -g for general debugging symbols, but apparently that won't be as good as -ggdb...). However, I recently stumbled upon someone saying to use -Og (instead of -O0), and it caught me off guard. Sure enough, it's in man gcc:
-Og
    Optimize debugging experience. -Og enables optimizations that do not interfere with debugging. It should be the optimization level of choice for the standard edit-compile-debug cycle, offering a reasonable level of optimization while maintaining fast compilation and a good debugging experience.
So, what's the difference? Here's the -O0 description from man gcc:
-O0
    Reduce compilation time and make debugging produce the expected results. This is the default.
man gcc clearly says -Og "should be the optimization level of choice for the standard edit-compile-debug cycle", though.
This makes it sound like -O0 is truly "no optimizations", whereas -Og is "some optimizations on, but only those which don't interfere with debugging." Is this correct? So, which should I use, and why?
Related:
- related, but NOT a duplicate! (read it closely, it's not at all a duplicate): What is the difference between -O0, -O1 and -g
- my answer on debugging --copt= settings to use with Bazel: gdb: No symbol "i" in current context
Answer 1:
@kaylum just provided some great insight in their comment under my question! And the key part I really care about the most is this:
[-Og] is a better choice than -O0 for producing debuggable code because some compiler passes that collect debug information are disabled at -O0.
https://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html#Optimize-Options
So, from now on I'm using -Og (NOT -O0) in addition to -ggdb.
UPDATE 13 Aug. 2020:
Heck with this! Never mind. I'm sticking with -O0.
With -Og I get <optimized out> and Can't take address of "var" which isn't an lvalue. errors all over the place! I can't print my variables or examine their memory anymore! Ex:
(gdb) print &angle
Can't take address of "angle" which isn't an lvalue.
(gdb) print angle_fixed_p
$6 = <optimized out>
With -O0, however, everything works fine!
(gdb) print angle
$7 = -1.34869879e+20
(gdb) print &angle
$8 = (float *) 0x7ffffffefbbc
(gdb) x angle
0x8000000000000000: Cannot access memory at address 0x8000000000000000
(gdb) x &angle
0x7ffffffefbbc: 0xe0e9f642
So, back to using -O0 instead of -Og it is!
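If you want to see the difference for yourself, a toy case like the one below tends to reproduce it. Whether a particular local actually shows up as <optimized out> depends on your gcc version and the surrounding code, so treat this as a sketch rather than a guaranteed repro; the file and variable names are my own.

// optimized_out_demo.c - locals that are dead after their last use are prime
// candidates for showing up as <optimized out> under -Og, while -O0 keeps
// every local at a stable stack address you can print and take the address of.
//
//   gcc -Og -ggdb optimized_out_demo.c -o demo_Og   # break on the printf, try `print scale` and `print &scale`
//   gcc -O0 -ggdb optimized_out_demo.c -o demo_O0   # the same gdb commands should just work here
#include <stdio.h>

int main(void)
{
    float angle = 45.0f;
    float scale = angle * 2.0f;   // may live only in a register, or be folded away, at -Og
    float result = scale + 1.0f;  // after this line `scale` is dead
    printf("result = %f\n", result);
    return 0;
}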
Related:
- [they also recommend -O0, and I concur] What does <value optimized out> mean in gdb?
Source: https://stackoverflow.com/questions/63386189/whats-the-difference-between-a-compilers-o0-option-and-og-option