> In your "unsafe" example in Rust, you still don't get undefined behavior.
You get unexpected behaviour instead. For example, imagine a device that counts the number of bytes transferred over the network. If the counter overflows, you get the wrong number and issue an invalid bill to the customer. That is "defined" behaviour, but it is not what one expects.
I expect the counter either to work correctly according to normal math rules, or to indicate that it cannot perform the task due to hardware limitations.
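Here is a rough sketch of what I mean (the counter width and the values are made up for illustration):

```rust
fn main() {
    // Hypothetical byte counter that is already close to its limit.
    let counter: u32 = u32::MAX - 10;
    let incoming: u32 = 100;

    // Modular arithmetic: the counter silently wraps around, and the
    // customer is billed for 89 bytes instead of roughly 4.3 GB.
    let wrapped = counter.wrapping_add(incoming);
    println!("wrapped total: {wrapped}");

    // Checked arithmetic: the operation at least reports that the true
    // result does not fit in the type, so the failure can be handled.
    match counter.checked_add(incoming) {
        Some(total) => println!("total: {total}"),
        None => eprintln!("counter overflow: cannot produce an accurate bill"),
    }
}
```

The second form is what I would expect by default: either the mathematically correct result, or an explicit signal that it cannot be produced.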
One might blame the developer, but history shows that developers fail to take overflow into account. For example, experienced developers working on the Linux kernel have made several such mistakes, which led to security vulnerabilities.
In most cases developers need standard math, not operations modulo 2^32 or 2^64. When you count bytes, you need the exact amount, not the amount modulo 2^32. I don't understand why language designers provide modular operations instead of normal math.
Rust has chosen a weird path where the program behaves differently in debug and release mode. This is clearly wrong: I expect the same program to work the same way regardless of whether optimizations are enabled.
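A minimal sketch of the divergence (the variable and values are made up):

```rust
fn main() {
    let mut transferred: u32 = u32::MAX;
    // Debug build (`cargo run`): panics with "attempt to add with overflow".
    // Release build (`cargo run --release`, default settings): wraps to 0 silently.
    transferred += 1;
    println!("{transferred}");
}
```

(To be fair, the divergence can be removed by setting `overflow-checks = true` under `[profile.release]` in Cargo.toml, which makes release builds panic on overflow as well, but the out-of-the-box behaviour still differs between the two modes.)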
https://huonw.github.io/blog/2016/04/myths-and-legends-about...