In 32-bit mode I used to embed int 3 in my programs a lot to stop at a given location under the debugger. Now, in 64-bit mode, it no longer seems to work: under gdb it produces an ordinary SIGSEGV and destroys the program beyond recovery ("Program terminated with signal SIGSEGV, Segmentation fault. The program no longer exists."). Does 64-bit mode use a different mechanism, or should I do some kind of cache flush? In this case the int 3 is a dynamically generated opcode (0xCC) in JIT-like code.