Why does int 3 generate a SIGSEGV in 64-bit instead of stopping the debugger?

In 32-bit mode I often used int 3 in my programs to stop at a given location under the debugger (embedding the instruction directly in the source). In 64-bit mode this no longer seems to work: under gdb it produces an ordinary SIGSEGV and kills the program beyond recovery ("Program terminated with signal SIGSEGV, Segmentation fault. The program no longer exists."). Does 64-bit mode use a different mechanism, or do I need some kind of cache flush? In this case the int 3 is a dynamically generated opcode (0xCC) emitted by JIT-like code.
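
A minimal sketch of the scenario I mean, assuming Linux/x86-64 and gcc (the mmap'd page stands in for my real code generator; the exact buffer contents are just an illustration):

```c
/* Emit int3 (0xCC) into a dynamically allocated executable buffer
 * and call it, the way a JIT-like generator would.
 * Build: gcc -o int3test int3test.c */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    /* 0xCC = int3 (breakpoint trap), 0xC3 = ret */
    unsigned char code[] = { 0xCC, 0xC3 };

    /* Allocate a page that is both writable and executable. */
    void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    memcpy(buf, code, sizeof code);

    /* Calling this should raise SIGTRAP: gdb stops here as if at a
     * breakpoint, and continuing executes the ret. Outside a debugger
     * the default SIGTRAP action terminates the process. */
    ((void (*)(void))buf)();

    puts("resumed after int3");
    return 0;
}
```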

asked by Jolta, 13 February 2017 at 11:38