Conan just works (not only on the M1)

frogarian

As described in part 1 and in part 2, I was curious to see how much work it would be to use conan on the M1 to compile the current software stack I use.

This should have been the 3rd and final part, but as it turned out (spoiler), conan is awesome, yet not all packages are. To have a bit more text to publish than just "Conan just works", here is a little more info about my interest and tasks in this topic.

(Use conan frogarian in the terminal to generate that image)

Use the right tool for the right job

C++ package management is a complex topic. The question of how best to do it is often answered with: it depends. You are never wrong with that answer. :-)

So let’s have a quick look at the current task.

What is the job

Build a C++ SDK. The SDK consists of our own code and uses some external third-party dependencies. These are libraries, but also build tools like the Android NDK. Keeping these dependencies consistent across developer machines and build agents is important.

All that for the following build matrix.

Build machine    Target OS/Arch
Darwin x86_64    Darwin x86_64
Darwin x86_64    iOS x86_64
Darwin x86_64    iOS armv8
Darwin x86_64    Android x86
Darwin x86_64    Android x86_64
Darwin x86_64    Android armv7
Darwin x86_64    Android armv8
Linux x86_64     Linux x86_64
Linux x86_64     Android x86
Linux x86_64     Android x86_64
Linux x86_64     Android armv7
Linux x86_64     Android armv8

This adds up to quite a few builds. The reasons for some combinations are not obvious, so let me explain.

iOS

If you develop for iOS, then both the OS and the hardware are given. Mac hardware does not scale. You can build a Linux box with 64 CPUs if you want, but in the Mac universe you are limited to what is available as hardware, and build agents are usually Mac minis.

Android

Android is targeted twice, from Mac and from Linux, since both toolchains must work.

The Android x86 and armv7 builds are basically useless. You can and should run the x86_64 emulator, and armv7 is no longer relevant. But since we ship SDKs, not apps, and users still have settings for those archs, we need to support them. Users can be annoying ;-)

The rest

And then there are of course also the builds for Linux and OSX/Big Sur. There is a reason why I list them in two columns, as if they were cross-compiling. Because in a way they do: you need build tools, and you do builds. These are two different things. If you build your library in debug mode, the build tools used will most likely not share that mode.
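Conan makes this split explicit with two profiles per build: a host profile describing what you build for, and a build profile describing the machine that runs the tools. A sketch of what such a host profile could look like; the concrete values here are assumptions, not taken from this project:

```
# host profile "android-armv8" (what the binaries are FOR)
[settings]
os=Android
os.api_level=21
arch=armv8
compiler=clang
compiler.version=11
compiler.libcxx=c++_static
build_type=Debug
```

Invoked as conan install . --profile:host=android-armv8 --profile:build=default, the build tools follow the build profile, so a debug host build does not drag the tools into debug mode.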

The M1 future will add …

Build machine    Target OS/Arch
Darwin arm64     Darwin arm64
Darwin arm64     Darwin x86_64
Darwin arm64     iOS armv8 (device)
Darwin arm64     iOS armv8 (simulator)
Darwin arm64     iOS x86_64 (simulator)

and once possible, maybe also

Darwin arm64     Android armv8
Darwin arm64     Android x86_64

None of those details are settled yet, and some combinations might turn out to be optional. But if the future requires support for Intel Macs, cross-compiling for them on the M1 might be an option.

The reason to do this from the M1 to x86_64, and not vice versa, is that you can test x86_64 builds on the M1, but not arm builds on Intel.

And there might even be Windows

Build machine    Target OS/Arch
Windows x86_64   Windows x86_64

There is a chance that the software is useful enough to even be used on Windows, so this is conceptually prepared and also tested to work. Windows has made huge improvements in its C++ support over the last years, but it can still be extra annoying.

If Windows becomes a topic, even the Android builds will run on Windows, but I omit them from the list. The point is: flexibility is important.

Binary package management to the rescue

That is a lot of builds. And there is some third-party software involved, some with pretty interesting build systems. Building all the third-party dependencies takes about 10 times longer than building the actual software. Not a scenario where always building everything from source is an efficient option.

Now I know, some people do build everything from source all the time, maybe with a huge mono repo. But they often have a binary caching mechanism hidden under the hood of their custom build system, and can solve problems by throwing money at them. Or they have a lot of time, I don't know.

For me, with the need to be able to package whatever software is required, dedicated binary dependency management is the best option. After all, the job of developers and CI/CD pipelines is not to build software that does not change over and over again, but to provide feedback about the software that does change, as fast as possible.

Conan is the best option for this job

For this task there is a huge build matrix, and flexibility is required. Binary packages shall be available via a central repository, not just for library dependencies but also for build tools. And the whole thing shall 'just_work'.
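Conan covers both kinds of dependencies in one place. A minimal conanfile.txt sketch of that idea; the package names and versions here are illustrative assumptions, not the actual dependency list of this project:

```
[requires]
openssl/1.1.1k
zlib/1.2.11

[build_requires]
android-ndk/r21d
cmake/3.19.6
```

Entries under [requires] are resolved for the target (host) context, while [build_requires] entries like the NDK are resolved for the machine running the build, which is exactly the two-column split from the build matrix above.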

Tools and dependencies shall be integrated into the project so that they work and mirror the state of the release build on developer machines, but developers shall be able to make local changes, for example to test new versions.

Conan has some learning curve, but once you manage that, it is definitely the best option for a scenario like the one above. Going into the details of the why might become its own blog post one day.

In the context of the M1 story, the important part is how easy it was to add a totally new platform with conan.
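To give an idea of what "adding a totally new platform" amounts to in conan terms: essentially one more profile. A sketch of what an M1 profile might look like, with values assumed from conan's settings model rather than taken from the article:

```
# profile "macos-armv8" for Apple Silicon
[settings]
os=Macos
arch=armv8
compiler=apple-clang
compiler.version=12.0
compiler.libcxx=libc++
build_type=Release
```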

While conan is awesome, not all software is

Something (I find) funny for the end.

There is software out there with build integration where not even the best tool can help you.

One library in our build stack is an IoT protocol library that has its future in the past. Classic legacy stuff, with a very interesting usage of scons.

The build on the M1 with the macOS SDK says: hey, you are building for Arm, you must mean the iPhone, let me fix that for you. Then Xcode steps in and says: hey, you are building an iOS app for Big Sur, I will help you and make that a Mac Catalyst build.

And you end up with something that compiles, even has kind-of-working binaries, but is totally not what you want. It is almost magical that such a confused build can produce something that is in theory even executable :-)

Cato