https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78685
            Bug ID: 78685
           Summary: -Og generates too many "<optimized out>"s
           Product: gcc
           Version: 6.2.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: debug
          Assignee: unassigned at gcc dot gnu.org
          Reporter: eggert at gnu dot org
  Target Milestone: ---
              Host: x86-64

Created attachment 40253
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=40253&action=edit
preprocessed C program illustrating -Og problem

Emacs developers are having trouble using -Og to debug Emacs, and so are
still using -O0; see, for example:

http://lists.gnu.org/archive/html/emacs-devel/2016-12/msg00199.html

I reproduced the problem and came up with a small program (attached) that
illustrates it. Here's a sample transcript on Fedora 24 x86-64, which uses
gcc (GCC) 6.2.1 20160916 (Red Hat 6.2.1-2):

$ gcc -Og -g3 Ogbug.i
$ gdb a.out
GNU gdb (GDB) Fedora 7.11.1-86.fc24
...
Reading symbols from a.out...done.
(gdb) b call_debugger
Breakpoint 1 at 0x4004d6: file Ogbug.i, line 6.
(gdb) r
Starting program: /home/eggert/junk/a.out

Breakpoint 1, call_debugger (x=3) at Ogbug.i:6
6         v = x;
(gdb) bt
#0  call_debugger (x=3) at Ogbug.i:6
#1  0x0000000000400519 in apply_lambda (fun=1, args=2, count=<optimized out>)
    at Ogbug.i:14
#2  0x0000000000400547 in main (argc=<optimized out>, argv=<optimized out>)
    at Ogbug.i:22

It's hard to debug a program with all those "<optimized out>"s getting in
the way.
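
For readers without access to attachment 40253, a program along the
following lines should reproduce the same shape of backtrace. This is a
hypothetical reconstruction inferred from the function names, arguments,
and line numbers in the transcript above, not the actual contents of the
attachment, so the details will not match exactly:

/* Hypothetical stand-in for Ogbug.i; the real attached file is a
   preprocessed C program and may differ.  */

int v;

void
call_debugger (int x)
{
  v = x;                        /* the transcript's breakpoint line */
}

int
apply_lambda (int fun, int args, int count)
{
  call_debugger (fun + args);   /* fun=1, args=2 gives x=3, as in frame #0 */
  return count;                 /* count shows up as <optimized out> */
}

int
main (int argc, char **argv)
{
  (void) argv;                  /* argv is unused, hence <optimized out> */
  return apply_lambda (1, 2, argc);
}

Compiling a program like this with gcc -Og -g3 and repeating the gdb
session above should yield a similar backtrace, with parameters that are
dead at the call site reported as <optimized out>; the exact set of
variables affected depends on the compiler version.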