In x86, a software breakpoint is generally a single byte, the int3
instruction, which is defined for that purpose. It's placed at the start
of whatever instruction you want to catch. On other ISAs like ARM, which
have multiple instruction sizes, I'm assuming gdb either figures out the
size of the target instruction (Thumb vs. ARM, for instance), or has to
be explicitly put into the right mode to handle those instructions. For
instance, on x86 you can tell gdb to assume the target is running 16-bit
real mode code so that it disassembles things correctly, etc., but it can
also detect what mode to use based on a target binary you give it.

Somebody who's more familiar with how ARM does it could probably give you a
better answer.

Gabe

On Sat, Aug 10, 2019 at 6:56 AM Alec Roelke <ar...@virginia.edu> wrote:

> I'm trying to fix the remote connection to GDB in RISC-V (patch #20028
> <https://gem5-review.googlesource.com/c/public/gem5/+/20028>) and I'm
> running into an issue where RISC-V GDB is sending a breakpoint of length 2
> bytes, but gem5 is expecting one of length 4 (the length of an instruction,
> see line 835 of src/base/remote_gdb.cc).
>
> If I'm interpreting this correctly, the 'length' of a breakpoint should be
> the number of bytes in the instruction on which the program should be
> interrupted, which in RISC-V is 4 for normal instructions but can also be 2
> for compressed ones.  Is my interpretation correct?  If so, how do other
> ISAs (notably x86) handle breakpoints on variable instruction widths?
>
> Thanks,
> Alec Roelke
> _______________________________________________
> gem5-dev mailing list
> gem5-dev@gem5.org
> http://m5sim.org/mailman/listinfo/gem5-dev