
How too many cooks spoiled the C++ broth


One of the paradoxes of C++ is that it's stood the test of time, yet everyone's trying to change it. It's heavily used in game development, the financial industry, high-performance computing, embedded computing and pretty much any space where performance is mission critical but correctness is still desired. C++ went through a renaissance with C++11, a huge jump from C++98, and switched to a more rapid release cadence. However, those who have not kept up with C++'s development over the years may be surprised at how tumultuous the evolution of the language has actually been.


The way C++'s development works is roughly as follows. There is a standards committee made up of many big players, including people who work at Apple, Google and generally anyone with a vested interest in the direction of C++'s future. There are working groups that focus on different aspects of C++, defect reports against the standard library and the core language, and unofficial draft standards that are slated to become the next official standard. Compilers will often have experimental support for the next standard and actively try to implement all of its features and technical specifications (known as TSes).


This sounds like a good thing, but it has resulted in awkward situations like std::regex being completely broken in GCC 4.9.1, and basic library functions like std::make_unique missing the C++11 cut and only appearing in C++14 (requiring a shim for C++11). Your compiler may behave differently depending on which version you use (which matters because people often stay on old compiler versions for compatibility), and even if you explicitly specify C++11, for example, the compiler may still "backport" behavior from future standards to avoid surprises. The standard grows in size at an exponential rate, and anyone jumping in will be hopelessly confused. Keeping up with the latest version is one thing, but keeping up with compiler quirks, defects and differences between versions is impossible unless you've been using C++ for years. Why not just stick to the latest version? Because if you get a C++ job (or work on an open source project), you can bet that it will stick to a specific standard version, while users try to compile the code with a different one.


There is a knock-on effect to this. For example, the C++ standard library wanted ranges and fmt, which were previously standalone libraries; however, the versions present in the standard library are heavily gimped. In other words, most people will just use the third-party versions anyway. This is also true of mainstay containers in the library, where the default performance characteristics are so abysmal that most people use replacements. As such, many people have made proposals to try to "fix" the language, including advocating for an STD 2.0 or an ABI break. But the result is a kitchen sink of different programming paradigms borrowed from other languages that makes C++ inconsistent and confusing for newcomers.


Recently, a number of standards committee members resigned and went off to make Carbon because they weren't happy with the direction C++ was going. Others have opted to switch to Rust instead. C++ is not going away anytime soon, but it seems to keep becoming more and more of a bloated language, when I feel like it should be cutting bloat.



How too many cooks spoiled the C++ broth was published on 2022-09-23


All content (including the website itself) licensed under MIT.
