Out-of-bounds [] is "undefined behaviour".
You don't build reliable software on "undefined behaviours".
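A minimal sketch of the contrast (my own example, not from the thread): vector's operator[] does no bounds check, while at() has defined, checked behaviour.

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> v{1, 2, 3};

    // Undefined behaviour: operator[] performs no bounds check.
    // The program may crash, return garbage, or appear to "work".
    // int x = v[10];

    // Defined behaviour: at() checks the index and throws on failure.
    try {
        int y = v.at(10);
        std::cout << y << '\n';
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << '\n';
    }
}
```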
There are good reasons why we have undefined behaviors in C++.
Let's go through them.
And I promise you they're not for reliability.
If you define what the compiler should do for every user error, you're not allowing it to use low-level hardware facilities, which may differ between platforms. Java floating point comes to mind: it doesn't use the hardware FPU on some platforms, because the specification says exactly what should happen if the user does something stupid.
Having done lots of FPU-determinism-related code, I can assure you that strict IEEE 754 is very desirable by default, and could be relaxed via a pragma or other opt-in.
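To make the determinism point concrete, a minimal sketch (my example, nothing thread-specific): relaxed "fast math" modes are allowed to reassociate, and float addition is not associative, so results can differ between builds or platforms.

```cpp
#include <cstdio>

int main() {
    float a = 1e20f, b = -1e20f, c = 1.0f;

    // The two groupings give different answers because float addition
    // is not associative; a "fast math" build is allowed to rewrite the
    // first form into the second, which is why strict IEEE 754 matters
    // for determinism.
    float left  = (a + b) + c;  // 0 + 1 == 1
    float right = a + (b + c);  // c is absorbed into b, so 1e20 - 1e20 == 0

    std::printf("%g vs %g\n", left, right);  // prints "1 vs 0"
}
```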
Have you got other examples than floating point?
There are also compiler optimizations that cannot be used if you want consistent behavior for incorrect input or buggy code, or if you cannot make the reasonable assumption that an array will only be accessed within bounds.
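A classic illustration of the kind of optimization meant here (a sketch of typical compiler behaviour, not a guarantee; the function names are mine):

```cpp
#include <iostream>

// Because signed overflow is undefined, the compiler may assume it never
// happens; at -O2 this function is typically folded to "return true;".
bool always_positive_diff(int i) {
    return i + 1 > i;
}

// The compiler is also entitled to assume these accesses are in bounds,
// so it can reorder or vectorize them instead of emitting any checks.
int sum_first_three(const int* a) {
    return a[0] + a[1] + a[2];
}

int main() {
    int data[3] = {1, 2, 3};
    std::cout << always_positive_diff(41) << ' '
              << sum_first_three(data) << '\n';  // prints "1 6"
}
```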
Lots of people STRONGLY disagree on this.
I've been writing high-performance code for decades, and I want a safe default first, with optimizations added later.
Signed overflow being undefined, the lack of array bounds checking, and "(int32_t)v << 32" come to mind immediately (see the sketch below).
These people just want to look good in benchmarks. They shouldn't be trusted with planes. Not even with the internet.
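Spelled out, with the UB lines left commented so the sketch itself stays well-defined (my annotations, assuming nothing beyond the standard):

```cpp
#include <array>
#include <climits>
#include <cstdint>

int main() {
    // 1. Signed overflow is undefined behaviour, not wrap-around.
    int big = INT_MAX;
    // ++big;            // UB: the optimizer may assume this never happens

    // 2. No bounds checking on operator[] or raw arrays.
    std::array<int, 4> a{};
    // a[4] = 1;         // UB: one past the end, accepted silently

    // 3. Shifting a 32-bit value by 32 or more bits is undefined.
    std::int32_t v = 1;
    // auto x = v << 32; // UB: shift count >= the width of the type

    (void)big; (void)a; (void)v;
    return 0;
}
```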
Rust just does it much better: zero-cost abstractions, safety by default (apart from a few items), and the ability to write unsafe code when you can prove its correctness (or when you're feeling adventurous).
Caveat: I don't have 20+ years of Rust experience.
The C++ philosophy is to let programmers use reliable high-level algorithms (like those in <algorithm>) without a performance penalty, so they won't be tempted to come up with their own "better" low-level solutions. Many people do not agree with that approach, and that's why Java, C#, Go and Rust were invented.
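A minimal sketch of that philosophy (my example): the standard algorithms are meant to be both higher-level and at least as fast as hand-rolled loops.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v{5, 3, 1, 4, 2};

    // std::sort is typically as fast as a hand-written quicksort
    // and far less likely to contain bugs.
    std::sort(v.begin(), v.end());

    // Same idea for searching: no performance reason to roll your own.
    bool found = std::binary_search(v.begin(), v.end(), 4);

    for (int x : v) std::cout << x << ' ';
    std::cout << "\nfound 4: " << std::boolalpha << found << '\n';
}
```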
This is desirable indeed. Composability is good. Yet we have std::list::sort (see the sketch below).
And what's bundled in <algorithm> is only a tiny subset of what you'll need in a big program.
I don't really see how Rust does it in a worse way than C++.
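On the std::list::sort point: std::sort requires random-access iterators, which std::list doesn't provide, so the container ships its own member sort; a minimal sketch:

```cpp
#include <algorithm>
#include <list>
#include <vector>

int main() {
    std::vector<int> v{3, 1, 2};
    std::list<int>   l{3, 1, 2};

    std::sort(v.begin(), v.end());    // fine: random-access iterators

    // std::sort(l.begin(), l.end()); // does not compile: list iterators
                                      // are only bidirectional
    l.sort();                         // hence the member function
}
```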
For those who want to know more, there's a presentation from CppCon about undefined behavior in C++, and why the C++ ISO committee cannot remove it:
Of course they cannot remove it without hurting the performance of everything in the wild.
But they're not even trying. Yet they break things when it suits them (the std::function allocator support disappearing in C++17/C++20, for instance).
Btw, the video says the standard library should use ssize_t everywhere and not size_t. I disagree with this, but then you can't defend both the video and C++ on this topic.
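For context, the usual argument for signed sizes is the unsigned wrap-around trap; a minimal sketch (my example, with the dangerous loop commented out):

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;  // empty on purpose

    // v.size() - 1 wraps to a huge value because size_t is unsigned,
    // so this loop would read wildly out of bounds:
    // for (std::size_t i = 0; i <= v.size() - 1; ++i) std::cout << v[i];

    // With a signed size (the ssize_t argument), size() - 1 would be -1
    // and the loop body would simply never run.
    std::cout << "empty vector, size = " << v.size() << '\n';
}
```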
Edit/PS: I was young and reckless too, but now I'm convinced you can't trust programmers with C and C++ for important/reliable/safe code.