rjmccall added inline comments.

================
Comment at: clang/lib/CodeGen/CGDebugInfo.cpp:2369
+              ? CGM.getObjCRuntime().ComputeBitfieldBitOffset(CGM, ID, Field)
+              : 0;
     } else {
----------------
JDevlieghere wrote:
> aprantl wrote:
> > JDevlieghere wrote:
> > > JDevlieghere wrote:
> > > > aprantl wrote:
> > > > > aprantl wrote:
> > > > > > It might help to attempt some git blame archeology.
> > > > > > Judging from the comment, it sounds like the debugger is supposed 
> > > > > > to query the runtime for the *byte* offset and then add the bit 
> > > > > > offset from DWARF? Could that make sense?
> > > > > If that is the case, we'd need to relax llvm-dwarfdump --verify to 
> > > > > accept this and make sure LLDB does the right thing instead.
> > > > Ah I see, yeah that sounds reasonable and explains the comment which I 
> > > > interpreted differently. Thanks! 
> > > btw it was added in rL167503. 
> > We should check whether emitting the offsets like this violates the DWARF 
> > spec. If yes, it may be better to emit the static offsets like you are 
> > doing here and then still have LLDB ignore everything but the bit-offsets 
> > from the dynamic byte offset.
> The standard says 
> 
> > The member entry corresponding to a data member that is defined in a 
> > structure,
> > union or class may have either a DW_AT_data_member_location attribute or a
> > DW_AT_data_bit_offset attribute.
> 
> which to me sounds like they should be mutually exclusive. I ran the lldb 
> test suite with my change and there were no new failures, which leads me to 
> believe that the comment from r167503 still holds and lldb ignores this 
> attribute, at least for Objective-C.
In Clang's current code-generation, the ivar offset variable stores the offset 
of the byte containing the first bit of the bit-field, which is not necessarily 
the same as the offset of the byte at which the bit-field's storage begins.  
This is why the debug info `%`s by the number of bits in a `char`: it's the 
correct bit-offset for computing where the bit-field begins relative to the 
value in the ivar offset variable.  See the behavior of 
`CGObjCRuntime::EmitValueForIvarAtOffset`, which does the same `%` when 
computing the layout.

It seems right to me that the `%` is done consistently within the compiler, 
rather than expecting LLDB to do it.  That's the more future-proof design; for 
example, it would allow the compiler to emit a single ObjC ivar record for the 
combined bit-field storage and then emit debug info for the bit-fields at their 
appropriate relative bit-offsets, which could then be `>= 8`.  This would 
arguably be a better code-generation scheme anyway, for a number of reasons — 
at the cost of breaking reflective access to the individual ivars, which I 
don't believe the runtime really supports anyway.

Anyway, the upshot is that I don't think this patch is correct.


Repository:
  rC Clang

https://reviews.llvm.org/D51990
_______________________________________________
cfe-commits mailing list
cfe-commits@lists.llvm.org
http://lists.llvm.org/cgi-bin/mailman/listinfo/cfe-commits