A lot of people think that names of existing library stuff are magical. Any better stuff has to have the same names as the old stuff, so we have to break ABI so new better stuff can keep all the same names.
But the names we have are not magical. New stuff can have new names (or the same names in a new namespace). Then the old names won't be magical anymore, they'll just be old. We've seen it already, with the old (single-mutex) lock_guard and the new (variadic) scoped_lock. Agonized hand-wringing, and then, in the end, just no big deal.
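To make that concrete, here's a minimal sketch of the two names coexisting. Both are real standard components (lock_guard since C++11, scoped_lock since C++17); only the function names are made up for illustration:

    #include <mutex>

    std::mutex a, b;

    void old_way() {
        // C++11 lock_guard: locks exactly one mutex for the enclosing scope.
        std::lock_guard<std::mutex> g(a);
        // ... work under a ...
    }

    void new_way() {
        // C++17 scoped_lock: variadic and deadlock-avoiding. It got a new
        // name instead of changing lock_guard, so nothing old broke.
        std::scoped_lock g(a, b);
        // ... work under a and b ...
    }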
We might also end up with a second build target for our favored OSes, on top of the 32-bit x86 and x64 ones we have now, to get an option for better performance and some new runtime choices. It will probably happen first in embedded ARM, with several choices competing, and then one of them could take hold in more mainstream environments. Again, no big deal.
Don't get involved in the hand-wringing and pearl-clutching, and you will hardly notice it.
It'd take an ABI break to implement, but wouldn't using inline namespaces under std mean backward compatibility is possible in the future? You could have std::cpp20::string, for example, which lives in an inline namespace std::cpp20. The older versions could be in std::cpp17, etc.
In fact, this could be done without an ABI break if new versions of symbols are placed in sub-namespaces from the get-go and the versions in std:: are deprecated, i.e. std::containers::map vs. std::map.
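A minimal sketch of the inline-namespace idea, using made-up names (lib, v1, v2, widget) rather than anything that exists in std:

    namespace lib {
        namespace v1 {                       // old version, still reachable by name
            struct widget { int field; };
        }
        inline namespace v2 {                // new default version
            struct widget { long field; };
        }
    }

    lib::widget     w_new;   // unqualified use resolves through the inline namespace to lib::v2::widget
    lib::v1::widget w_old;   // old layout remains available under its explicit name

    // Because the inline namespace is part of the mangled symbol name,
    // v1 and v2 entities are distinct at link time -- no silent ABI clash.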
> It'd take an ABI break to implement, but wouldn't using inline namespaces under std mean backward compatibility is possible in the future?
Indeed. The author of the article is very opinionated about what he feels should be changed in the standard, but passes off catastrophic descriptions of the consequences of not getting his way with the C++ standardization effort.
Objectively, the author defended a bunch of backward-incompatible changes that would force an ABI break and create major problems for C++ users, just to accommodate minor changes to the standard. It sounds like the C++ standardization committee made the right call, and at worst nothing of value was changed. The problem here is not the ABI but how minor changes were being pushed in a way that caused major breakage. For example, why is the author pushing profound changes to std::regex, such as changing the underlying data type, while still insisting on not presenting them as a new, separate component? It would not be the first time that path was taken: as the author himself notes, scoped_lock was added as a separate component precisely to avoid proposing changes to old components that would break the ABI.
Frankly, the author comes across as a bit petty, with an axe to grind, and less than honest about his complaints, as demonstrated by his choice to declare the standard library dead just because his pet proposals, all minor and inconsequential yet liable to cause major breakage and an extensive list of backward-compatibility problems, were not accepted.
That's sorta the price to pay for having multiple versions of symbols. I'd take having overloads where needed for older ABI versions over linker errors. Also, I imagine you wouldn't necessarily need overloads for all versions unless there were API differences or you explicitly needed to support multiple ABIs. Most of the time you could probably just use the name without qualifying the inline namespace. Conversions in the library could also help, especially if the compiler can see through them to optimize them away/do them at compile-time.
The main issue I can see is header files, actually: even if your library was built using the std::v2::blah types, anyone who includes a header referring to std::blah could potentially get a different version. Modules do solve this, though, as long as your module interface is built against the same version of the library.
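A hypothetical sketch of that header hazard, again with made-up names (lib, v1, v2, widget, consume, api.h); nothing here is real std API:

    // widget.h, new toolchain: v2 is the inline (default) version.
    namespace lib {
        namespace v1        { struct widget { int  x; }; }
        inline namespace v2 { struct widget { long x; }; }
    }

    // api.h, written before versioning, spells the name without a version:
    void consume(lib::widget w);   // in this TU that means consume(lib::v2::widget)

    // A translation unit compiled against older headers, where v1 was the
    // inline namespace, declares consume(lib::v1::widget) instead. Since the
    // inline namespace is part of the mangled name, the mismatch shows up as
    // an unresolved symbol at link time rather than as a silent layout bug.
    // A module interface pins this down: importers see whichever version the
    // interface itself was built against.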
I can think of some automated fixes to the ABI problem. I'm sure others can too and also why they'd be rejected.
Personally, I'm surprised people are so scared of ABI breaks when, as the author correctly points out, most code relying on a stable ABI that the language never guaranteed is probably faulty anyway. I guess the lesson here is never to buy into a dependency without getting a license to the source so you can recompile it in the future.
But I get it. There are way too many shared libraries out there that link against libc++.so, will never be recompiled, and whose source is unavailable. The fixes are probably going to be ugly and have some cost.
This may be a naive question from someone outside of the ecosystem, but what is the purpose of ABI compatibility for an open-source library? Why not recompile the standard lib as a part of your own binary? Is it shipped by the OS as a DLL for sharing purposes?
That's pretty much the issue; linking against libraries you can't/don't want to rebuild that may have been built with different versions of those symbols. The standard library is shipped as a DLL/shared object on pretty much every OS.