Using a new C++ standard - The pain of being at the cutting edge

As you have probably heard, I wrote a book about C++20 called Programming with C++20. My goal was, and still is, to show correct C++20 code as early as possible. Given that even in 2022 only MSVC claims full C++20 support in both the compiler and the standard library, we need to be patient.

In today's post, I share my experience, which hopefully helps you on your own path to C++20. Despite the slow progress, this standard is incredible and will have a strong influence on how we write C++.

A programming book is all about code

I teach enough C++ classes to have seen the effect of code examples. Mistakes, errors, and typos confuse people, which is understandable. We are learning something new, and that something is code, so what's shown must be correct. An English teacher can't afford bad grammar either (good thing I'm not an English teacher).

That is why I compile all the code examples in Programming with C++20 (okay, maybe with 2-5 exceptions). Thanks to a LaTeX-based system, the very same code is included in the book. That avoids copy-and-paste errors.

All that sadly doesn't protect from logic errors.

Unit tests to ensure quality

Despite it being a book, I started with unit tests. For most examples in the book, I have tests that verify the behavior. Checking that the code compiles isn't enough. The tests verify the logic and ensure that the output does not change with a compiler upgrade.

Each example is compiled with Clang, GCC, and MSVC, on Windows, Linux, and macOS. GitHub Actions makes this possible.

My challenges

I started the project back in 2020, right after C++20 was finalized in Prague (which, by the way, was a lovely WG21 meeting). At that time, not a single compiler implemented all the language or library features. I needed a way to make progress without waiting for the final implementations. As I know today, the book wouldn't have been finished otherwise.

What can we do in such a situation? Well, some things can be detected with __has_include. For example, std::format can be mocked with libfmt. It is not 100% compatible, but close enough. I used __has_include to compile examples only with a compiler that supports that feature. All other compilers simply compiled an empty translation unit. That was good enough for me but might not be for a real project.
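A minimal sketch of that detection idea looks roughly like this. The macro HAVE_FORMAT and the alias fmtns are illustrative names, not the exact setup from the book's build:

```cpp
// Prefer the standard header if it exists; otherwise fall back to libfmt,
// which is close enough to std::format for most examples.
#if __has_include(<format>)
#  include <format>
#  define HAVE_FORMAT 1
namespace fmtns = std;
#elif __has_include(<fmt/format.h>)
#  include <fmt/format.h>
#  define HAVE_FORMAT 1
namespace fmtns = fmt;
#else
#  define HAVE_FORMAT 0  // no formatting support: the example compiles to nothing
#endif

#include <iostream>

int main()
{
#if HAVE_FORMAT
  std::cout << fmtns::format("Hello, C++{}!\n", 20);
#endif
}
```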

More complicated

Other items are more complicated, Concepts, for example. Concepts come as a library part and a language part. At the time I started, GCC had both: a Concepts implementation for the language extensions and the library, the header <concepts>. There is also a feature test macro in the standard to detect it. Sadly, that early implementation was buggy. MSVC, like GCC, had both, but the implementation seemed incomplete at the time. Clang came along with just the language feature. It was more stable, but the <concepts> header was missing.

Clang seemed to me to have the best implementation in terms of stability and completeness of the language feature. The missing library part made it terrible to test whether the code I presented in the book was correct.

The check for the header only helped partially. The check for the feature test macro was a disaster, as all the compilers claimed to implement the language feature, but with different levels of progress.
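Just to illustrate the two kinds of checks, here is a sketch with illustrative macro names; neither check says anything about how complete or correct the implementation behind it actually is:

```cpp
// Language part: __cpp_concepts is 201907L (or higher) for C++20; earlier,
// pre-standard implementations defined smaller values for the Concepts TS.
#if defined(__cpp_concepts) && (__cpp_concepts >= 201907L)
#  define HAVE_CONCEPTS_LANGUAGE 1
#else
#  define HAVE_CONCEPTS_LANGUAGE 0
#endif

// Library part: does the <concepts> header exist at all?
#if __has_include(<concepts>)
#  include <concepts>
#  define HAVE_CONCEPTS_LIBRARY 1
#else
#  define HAVE_CONCEPTS_LIBRARY 0
#endif

int main()
{
  return (HAVE_CONCEPTS_LANGUAGE && HAVE_CONCEPTS_LIBRARY) ? 0 : 1;
}
```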

Fail expected

My approach was to tackle this with CMake. I wrote CMake tests that expected the compilation to fail! With markers like HAVE_concepts and defines like IS_CLANG, I told these tests for which target and compiler combination a compile error was expected. That way, I hoped to see when compilers caught up and to see that my code was (hopefully) valid, and with that, lift the restricting defines. It worked :-)

Wait, there is more

The fun did not end there. C++20 brings coroutines. Clang had an early implementation because it was the proof-of-concept implementation during standardization. Clang is always very strict and precise. As a result, the coroutine header was marked as experimental, and thus I needed to include <experimental/coroutine>. All types were nested in the experimental namespace. Then GCC and MSVC caught up. Since C++20 had been released by that point, I think it was reasonable to put the coroutine header in the normal std folder and namespace. Do you see my pain?

Now I had to check with __has_include for two different header locations. And that is not all. I also had to lift the experimental stuff into namespace std to make the code compatible with the other two compilers.
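The resulting dance looks roughly like the sketch below. The exact using-declarations depend on which coroutine types an example needs, and injecting names into namespace std like this is purely a transition hack, nothing a real project should keep around:

```cpp
// Pick whichever coroutine header the standard library ships. For the
// experimental one, lift the types into std so the examples compile unchanged.
#if __has_include(<coroutine>)
#  include <coroutine>
#elif __has_include(<experimental/coroutine>)
#  include <experimental/coroutine>
namespace std {
  using std::experimental::coroutine_handle;
  using std::experimental::suspend_always;
  using std::experimental::suspend_never;
}
#else
#  error "No coroutine header available"
#endif

int main()
{
  std::suspend_always alwaysSuspend{};  // resolves in both configurations
  (void)alwaysSuspend;
}
```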

Sadly, there is still more

Our beloved lambdas got improved in C++20 again. The changes made lambdas even more painful to integrate into my tests, especially lambdas with a template-head. Some compilers could handle them; others, of course, produced a parsing error.
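For reference, this is the kind of lambda I mean; nothing special from today's point of view, but at the time only some compilers could parse it:

```cpp
#include <utility>

int main()
{
  // C++20: a lambda with a template-head.
  auto makePair = []<typename T, typename U>(T t, U u) {
    return std::pair<T, U>{t, u};
  };

  auto p = makePair(2, 3.14);

  return (p.first == 2) ? 0 : 1;
}
```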

What does -std=c++20 mean, actually?

Checking for the selected standard was a mess as well. For some compilers, even with -std=c++20, the resulting __cplusplus define still carried the C++17 value. Others correctly stated the value for C++20.
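The values themselves are fixed by the standards: 201703L for C++17 and 202002L for C++20 (MSVC additionally needs /Zc:__cplusplus to report a current value at all). A check along these lines therefore gave different answers depending on the compiler, even though every build used the C++20 switch:

```cpp
#if __cplusplus >= 202002L
#  define COMPILED_AS_CPP20 1
#else
#  define COMPILED_AS_CPP20 0  // the compiler still reports an older value
#endif

#include <iostream>

int main()
{
  // Print the raw value to see what the compiler actually claims.
  std::cout << "__cplusplus = " << __cplusplus
            << " (treated as C++20: " << COMPILED_AS_CPP20 << ")\n";
}
```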

Don't we have feature test macros to avoid your mess above?

For a while now, C++ has come with feature test macros (cppreference.com/w/cpp/feature_test). These macros allow us to test the availability or version of a certain feature. At least in theory.

Feature test macros, in general, are a good thing once all compilers have full support for a certain standard, say C++20. As long as they are still implementing features, the feature test macros are... useless at times.

Another reason for the trouble is that some features come without a feature test macro, or with one that is not fine-grained enough to check the different implementation stages of the compilers.

The reason is that feature test macros test for features, not for implemented papers. Lambdas in C++20, for example, can have a template-head, but they are now also usable in unevaluated contexts if they are captureless. There are the feature test macros __cpp_generic_lambdas and __cpp_lambdas. The latter has not changed since C++11. __cpp_generic_lambdas shows that we have template lambdas, but that's not enough.
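A sketch of what that granularity problem looks like in practice. The documented values are 201304L for C++14 generic lambdas and 201707L for the C++20 template-head syntax; the resulting macro name below is illustrative:

```cpp
// The macro covers the template-head part of the C++20 lambda changes...
#if defined(__cpp_generic_lambdas) && (__cpp_generic_lambdas >= 201707L)
#  define HAVE_TEMPLATE_HEAD_LAMBDAS 1
#else
#  define HAVE_TEMPLATE_HEAD_LAMBDAS 0
#endif

// ...but it says nothing about captureless lambdas in unevaluated contexts,
// so it cannot gate an example that relies on that part of the feature.

int main()
{
#if HAVE_TEMPLATE_HEAD_LAMBDAS
  auto identity = []<typename T>(T t) { return t; };
  return identity(0);
#else
  return 0;
#endif
}
```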

Another example is __cpp_lib_concepts. This define tells us that the concepts header is available. But there is no way to detect the progress of the implementation of that header. My trouble was Clang, which at one point came with an implementation but without std::invocable... Guess what? I wrote another CMake test which tried to compile a piece of code that used std::invocable. Depending on the result, the example was or wasn't compiled.
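The probe itself can be tiny. A hypothetical sketch of such a translation unit, which the build system tries to compile and then sets a define based on success or failure:

```cpp
// probe_invocable.cpp - compiles only if the library really provides
// std::invocable; the presence of <concepts> or __cpp_lib_concepts alone
// does not guarantee that.
#include <concepts>

void freeFunction();

static_assert(std::invocable<decltype(&freeFunction)>);

int main() {}
```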

Are you still curious about what else happened?

Well, there is operator<=>. Aside from the fact that this feature also requires both compiler and library support, operator<=> has another specialty. The existence of the <compare> header can be tested, and its contents are small enough to be implemented in one go. However, operator<=> also influences existing types, for example, std::string. Certain types like std::string come with the new comparison operator in C++20. That requires changing existing headers, like <string>. Yet __cpp_lib_three_way_comparison doesn't necessarily tell us the status of those other headers. I ended up faking operator<=> for string in libc++.
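The kind of stop-gap I mean looks roughly like this. It is a hedged sketch, not the exact code from the book's build; the guard macro STRING_NEEDS_SPACESHIP is illustrative, and adding overloads to namespace std is strictly a transition hack:

```cpp
#include <compare>
#include <string>

// Emulate operator<=> for std::string on top of the long-existing compare()
// member, for a standard library whose <string> does not provide it yet.
#ifdef STRING_NEEDS_SPACESHIP
namespace std {
inline strong_ordering operator<=>(const string& lhs, const string& rhs)
{
  const int result = lhs.compare(rhs);

  return (result < 0)   ? strong_ordering::less
         : (result > 0) ? strong_ordering::greater
                        : strong_ordering::equal;
}
}  // namespace std
#endif

int main()
{
  const std::string a{"abc"};
  const std::string b{"abd"};

  return ((a <=> b) < 0) ? 0 : 1;  // a orders before b
}
```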

Take away

Using a new standard that is still being implemented is painful. However, all the experience above is part of the transition. Once all compilers have proper C++20 support, all my pain will go away. Over time, I have already reduced and removed many of my special hacks and tricks.

Andreas