My criticism stems more from C++'s steadfast refusal to drop backwards compatibility, in any way, for anyone, ever -- while also adding new features. What this means is that new features can't provide the guarantees they can in other environments, leading to "ruined fresco" [1] syndrome.
Concrete example: std::move. Move constructors can copy, and `std::move` doesn't move. Naturally, it just casts your T to `std::remove_reference_t<T>&&`. Because why not. It also leaves your source object in an undefined but totally accessible state -- whose validity is up to the implementor's convention! I think std:: collections are totally usable after they've been moved (the standard calls the state "valid but unspecified" -- correct me if I'm wrong), but your own types may just explode or fail silently. Talk about a giant footgun.
This approach leads to poor imitations of features from other languages getting stacked on top of the rickety footbridge that is K&R C.
It's specifically the evolutionary design philosophy that I take issue with.
The language has become borderline impossible to reason about. Quickly, what's the difference between a glvalue, prvalue, xvalue, lvalue, and an rvalue?
And the compiler, in the name of backwards compatibility, sets out not to help you because adding a new warning might be a breaking change. I've got easily 15 years of experience with C++ - granted, not daily or anything. To figure out what's actually happening, you need to understand 30 years of caveats and edge cases.
> My criticism stems more from C++'s steadfast refusal to drop backwards compatibility, in any way, for anyone, ever -- while also adding new features.
Languages that break backwards compatibility tend to have very slow uptake of the new versions. Python 3.0 was released in 2008 and took at least a decade to become the main version. And the changes made to Python were minor compared to what would need to be done with C++.
> The language has become borderline impossible to reason about.
This I agree with, but mostly it doesn't affect casual users of the language. I drop into C++ every 5 years or so and I don't find it difficult to understand or be productive. I have no idea what the difference between glvalue, prvalue, xvalue, lvalue, and rvalue is, but it's mostly not a concern for me.
As someone who creates production code in assembly, C, C#, and Java (among others), but who doesn't have that much experience with C++:
C++ certainly seems like a fragmented language from the outside. Lots of features added over the years to address problems with safety and provide additional "zero overhead" abstractions. The style and idioms of code written in this language seem to have changed pretty significantly over its lifespan. So breaking backwards compatibility to throw out old standards and force programmers to utilize new ones seems to make sense. However, it raises a few questions.
1) Who decides which parts of the language to throw out and which to keep? How do they decide this? Would the goal be to keep the multi-paradigm concepts, or re-focus the language? Which of the "zero overhead" abstractions should be kept?
2) Has this already been tried before in essence? There are certainly a number of languages out there that seem to strive to be "a better C/C++". What benefit is there to attempting to create a C++ 2.0 instead of using one of them?
3) Do the benefits of breaking backwards compatibility really outweigh the loss of all of the accumulated libraries and all the software of the past 30+ years? Even with ideal management of the new language, would it be enough to bring people to a new version?
4) Do you continue adding to this new version as you did with the previous one? Surely that would eventually lead to the same fragmentation seen in the current version.
5) What happens to C++ 1.0 in this case? Do you continue to support and expand it? For how long? I suppose one could look at what happened with Python, but I'm not so sure it's that comparable.
If it wasn't backwards compatible then it would be a new language.
Compilers keep adding new warnings all the time. If someone's build is broken by `-Werror`, they should disable that warning if it isn't relevant to them.
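For what it's worth, both GCC and Clang let you demote a single diagnostic out of `-Werror` without touching the rest, so the fix stays local to the build (the warning name and `main.cpp` here are just placeholders):

```shell
# Keep -Werror for everything except one newly added warning.
# -Wno-error=<name> leaves the diagnostic enabled but non-fatal.
g++ -Wall -Wextra -Werror -Wno-error=deprecated-declarations -c main.cpp
```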
Certainly the C committee considers standardizing warnings to be a breaking change so that's not always true. [1]
Re: backwards compatibility, that's not really true. ABI compatibility is different than source-level compatibility. If a library or module is built to one language standard, so long as the ABI remains compatible, I think it's fair game to change syntax and semantics when compiling with a newer language release - especially when there are clear and obvious deficiencies in the existing one. Obviously, the committee and I disagree on this.
However, my point remains that if you value backwards compatibility above all else, and it's that backwards compatibility that actually prevents you from adding features in a complete and honest way, maybe don't add the feature. Like, if `std::move` is the best you can muster, don't add it! It's not a move! I don't know what it is, but it's definitely not what the label on the tin says.
Backwards compatibility is the reason why C++ became what it is today, and why it prevailed over other (similar/better?) languages designed at the time. Herb Sutter himself discusses this in the talk here: https://herbsutter.com/2020/07/30/c-on-sea-video-posted-brid...
> but your own types may just explode or fail silently.
I mean, that's the point of them being "your own types". If you couldn't do anything you want, including putting `assert(1 == 2)` in any method of your own type in C++... then people would be quick to design a Cwhatever language where you can, because it's a useful subspace of the design space of programming languages.
[1] https://www.npr.org/sections/thetwo-way/2012/09/20/161466361...