[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-18 Thread cvs-commit at gcc dot gnu dot org

--- Additional Comments From cvs-commit at gcc dot gnu dot org  2005-03-19 03:07 ---
Subject: Bug 19769

CVSROOT:    /cvs/gcc
Module name:    gcc
Changes by: [EMAIL PROTECTED]   2005-03-19 03:06:52

Modified files:
gcc: ChangeLog dwarf2out.c 

Log message:
Fix problem that caused compiled java code to trigger an internal gdb 
error.
PR c++/19769
* dwarf2out.c (declare_in_namespace): Ignore decls with an abstract
origin.

Patches:
http://gcc.gnu.org/cgi-bin/cvsweb.cgi/gcc/gcc/ChangeLog.diff?cvsroot=gcc&r1=2.7907&r2=2.7908
http://gcc.gnu.org/cgi-bin/cvsweb.cgi/gcc/gcc/dwarf2out.c.diff?cvsroot=gcc&r1=1.571&r2=1.572



-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19769


[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-18 Thread dberlin at dberlin dot org

--- Additional Comments From dberlin at gcc dot gnu dot org  2005-03-19 03:50 ---
Subject: Re:  [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

On Sat, 2005-03-19 at 03:07 +, cvs-commit at gcc dot gnu dot org
wrote:
> --- Additional Comments From cvs-commit at gcc dot gnu dot org  2005-03-19 03:07 ---
> Subject: Bug 19769
>
> CVSROOT:  /cvs/gcc
> Module name:  gcc
> Changes by:   [EMAIL PROTECTED]   2005-03-19 03:06:52


Just FYI (maybe you are already aware and the log message just hasn't been
processed by bugzilla yet): this needs to go on the 4.0 branch as well, since
the bug exists there too.

Thanks so much for fixing this bug.






[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-17 Thread bothner at gcc dot gnu dot org

--- Additional Comments From bothner at gcc dot gnu dot org  2005-03-17 19:17 ---
(In reply to comment #18)
I tried Jim's patch, and it seems to work.
(I haven't done a full bootstrap, but I compiled jc1, cc1plus, and libjava, and
re-installed them.)
I then re-compiled Kawa, and I seem to be able to debug.



[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-17 Thread wilson at gcc dot gnu dot org

--- Additional Comments From wilson at gcc dot gnu dot org  2005-03-18 06:05 ---
I got four additional gdb testsuite failures with the patch.  I will have to
figure out what went wrong, and then rebuild and retest.



[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-15 Thread wilson at gcc dot gnu dot org

--- Additional Comments From wilson at gcc dot gnu dot org  2005-03-16 03:46 ---
I confirmed that I don't see the problem if I compile with -m32 on an x86-64
system.

This happens with 32-bit x86 code because update_equiv_regs deletes and
reinserts the instruction that loads the value of the variable i.  This causes
us to lose the block info for this insn, as block info is based on insn uid.
This does not happen for 64-bit code because CLASS_LIKELY_SPILLED_P is true for
the register destination.  For 32-bit code, the register has a preferred class
of GENERAL_REGS, and for 64-bit code, the register has a preferred class of
DIREG_CLASS.  So the 32-bit x86 case doesn't fail, but only because we lost
some debug info.

I can make the 32-bit case fail by adding the -fpic option, which prevents
update_equiv_regs from optimizing this instruction.

So
  ./xgcc -B./ -O -g -fpic -m32 tmp.cc
reproduces the problem for me on an x86_64 machine.  It should do likewise on a
32-bit machine, without the -m32 option.



[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-15 Thread wilson at gcc dot gnu dot org

--- Additional Comments From wilson at gcc dot gnu dot org  2005-03-16 03:54 ---
Part of the problem here is the way that declare_in_namespace works.
gen_variable_die calls declare_in_namespace, which checks to see if the
variable has a namespace, and if so emits it into that namespace.
declare_in_namespace then returns, and gen_variable_die finishes normally, which
means we end up with two DIEs for one variable: the local one in the current
context, and the one emitted into the namespace.  The variable in question here
has a namespace of ::.

When we process the abstract instance of f, we emit a local DIE for i into f,
and a namespace one into ::, the global context.  This one has line 3.  Then we
see the global redeclaration, and emit another DIE into the global context.
This one has line 6.  Then when we handle the function main, we see the inlined
version of f, and we get a decl of i with an abstract origin pointing at the
function f, and emit two more DIEs: a local one in the inlined version of f
that points at the abstract instance of f, and a global one that also points
at the abstract instance of f.  This last one is very wrong.  Note that there
are 5 DIEs, not 4 as I claimed earlier; I missed the one in 'main' when I was
counting them.

It seems very silly to try to emit a namespace version of a decl with an
abstract origin.  One possible solution is for declare_in_namespace to ignore
any decl with an abstract origin.  If I try this, I get only 4 DIEs, and gdb is
happy.  We still have two global DIEs, with different line numbers, but we can
live with that for now.

I haven't yet tried to trace the history of this, to determine what change
caused the problem to appear.  There might be something else wrong here.



[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-15 Thread dberlin at dberlin dot org

--- Additional Comments From dberlin at gcc dot gnu dot org  2005-03-16 04:25 ---
Subject: Re:  [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

On Wed, 2005-03-16 at 03:54 +, wilson at gcc dot gnu dot org wrote:
> --- Additional Comments From wilson at gcc dot gnu dot org  2005-03-16 03:54 ---
> Part of the problem here is the way that declare_in_namespace works.
> gen_variable_die calls declare_in_namespace, which checks to see if the
> variable has a namespace, and if so emits it into that namespace.
> declare_in_namespace then returns, and gen_variable_die finishes normally,
> which means we end up with DIEs for one variable.  The local one in the
> current context, and the one emitted into the namespace.  The variable in
> question here has a namespace of ::.
>
> When we process the abstract instance of f, we emit a local die for i into f,
> and a namespace one into :: the global context.  This one has line 3.  Then we
> see the global redeclaration, and emit another die into the global context.
> This one has line 6.  Then when we handle the function main, we see the
> inlined version of f, and we get a decl of i with an abstract origin pointing
> at the function f, and emit two more dies, one local one in the inlined
> version of f that points at the abstract instance of f, and one global one
> that also points at the abstract instance of f.  This last one is very wrong.
> Note that there are 5 dies not 4 as I claimed earlier, I missed the one in
> 'main' when I was counting them.
>
> It seems very silly to try to emit a namespace version of a decl with an
> abstract origin.  One possible solution is for declare_in_namespace to ignore
> any decl with an abstract origin.  If I try this, I get only 4 dies, and gdb
> is happy.  We still have two global dies, with different line numbers, but we
> can live with that for now.
I'm pretty sure declare_in_namespace was something either Jason wrote
with my help, or I wrote with Jason's help (I can't remember), and I'm
pretty sure this problem didn't occur to either of us.

I agree that outputting a namespace version of an inline DIE with an
abstract origin is indeed silly, since I don't believe it could possibly
be the one that actually belongs in that namespace (which would be
whatever the actual abstract origin points to).


> I haven't yet tried to trace the history of this, to determine what change
> caused the problem to appear.  There might be something else wrong here.





[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-12 Thread wilson at gcc dot gnu dot org

--- Additional Comments From wilson at gcc dot gnu dot org  2005-03-13 03:11 ---
Dan's example doesn't work because 'int' is a predefined type.  Use a unique
structure type and put the function call back in the inline function, and I get
a nice little example (22 lines) that can reproduce the problem.  If I compile
it with
./xgcc -B./ -O -g tmp.cc
and then run gdb on the a.out file, I get a gdb internal error.

Looking at the debug info, I now have 4 DIEs for the variable i.  There are the
same 3 as in Dan's example, plus a fourth one, with global scope, that has the
abstract origin pointing back at the declaration inside the inline function.  It
is this fourth one that confuses gdb.

This is using mainline, last updated Thursday March 10, on an x86-64 linux 
machine.



[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-12 Thread drow at false dot org

--- Additional Comments From drow at false dot org  2005-03-13 03:36 ---
Subject: Re:  [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

Hmm, I can't reproduce the error using mainline for i386-linux, and
several versions of GDB.  Could you attach readelf -wi output for the
file which does cause gdb to crash?




[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-09 Thread pinskia at gcc dot gnu dot org

--- Additional Comments From pinskia at gcc dot gnu dot org  2005-03-09 19:52 ---
I almost think this is a C++ front-end bug.

-- 
           What    |Removed |Added
   ----------------+--------+---------
          Component|debug   |c++




[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-09 Thread jbuck at gcc dot gnu dot org

--- Additional Comments From jbuck at gcc dot gnu dot org  2005-03-09 20:10 ---
Severity raised to critical since it breaks debugging of all Java programs
(gdb dies when loading libgcj).

-- 
           What    |Removed |Added
   ----------------+--------+---------
           Severity|normal  |critical




[Bug c++/19769] [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

2005-03-09 Thread dberlin at dberlin dot org

--- Additional Comments From dberlin at gcc dot gnu dot org  2005-03-09 20:28 ---
Subject: Re:  [4.0/4.1 Regression] GCC produces wrong dwarf2 output that breaks gdb

On Wed, 2005-03-09 at 20:10 +, jbuck at gcc dot gnu dot org wrote:
> --- Additional Comments From jbuck at gcc dot gnu dot org  2005-03-09 20:10 ---
> Severity raised to critical since it breaks debugging of all Java programs
> (gdb dies when loading libgcj).

Just FYI, there are two possibilities here:

1. Something is lying to the dwarf2 outputter about the origins of
various things: which are real instances, and which are not.
2. The dwarf2 outputter is ignoring this and making assumptions about
how things relate to each other based on the order they are output in.

Hopefully this bug is #1, and somewhere in the C++ FE.
The fact that the obvious C version doesn't cause a problem makes me
think this may be the case.
It's also theoretically easier to fix.

If it's #2, it will be a serious pain to fix, because the real fix
is to make the dwarf2 outputter not depend on the order you hand it
decls in order to get correct debug info.

--Dan



