Reading time: 5+ minutes
The purpose of this study is to answer the question “Which would be the better build system choice?”. The input conditions are as follows:
- Target OS for IDE and development: Windows.
- Compilation of cross-platform code (targets are mostly Linux / Unix-like systems).
- Using the GCC, G++, and/or Clang compilers.
- Predicted amount of code: several tens of diverse, complicated SW components written in C/C++. This means that a single module may have between several hundred and several tens of thousands of source files.
- Team split on several international locations.
- When a single component is updated, the build output should be updated without rebuilding all other already-built components (thus reducing compilation / SW update time).
- It is expected that for some versions of the project multiple targets and several resulting SW bundles might be necessary. In addition, calibration binaries or other specific small binary outputs might be required.
- Initially proposed – Qmake and CMake.
- As the team might be big and distributed, adding a new component, or porting one from another system, usually requires anywhere from “some” to a “significant amount of” time. Providing HowTos, rules, and instructions, as well as automated makefile generation, would reduce the integration time significantly. So automation should be possible.
- Possible need of distributed compilation. This is because many shared libraries are expected, together with multiple big and complex SW components. When the whole system consists of several hundred thousand files, building the complete bundle from scratch even on a 2×8-core machine with 64 GB of RAM (i.e. a powerful workstation) might take up to half an hour (the worst case scenario I have seen in practice). On the other hand, correctly set up conditional compilation with proper dependency resolution, plus a good continuous integration system like Jenkins, might solve potential slow-build issues.
- Automatic dependency resolution. Reason: this is important when using common libraries for code unification and elimination of duplicated libraries. In addition, sub-components often depend on core components and headers.
- Easy integration into common continuous integration systems like Jenkins.
- Potential future portability of the build to a Unix / Linux development OS and IDE.
- Apart from the listed needs, another goal is to use a build system that is actively developed and used, and that will clearly continue improving and growing.
- Further needs might be defined as well.
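As a rough illustration of how the incremental-rebuild and dependency requirements above could map onto a build description, here is a hedged CMake sketch (the component names and layout are hypothetical, not part of the actual project):

```cmake
# Hypothetical top-level CMakeLists.txt: each SW component is its own target,
# so editing one component rebuilds only that component and relinks its
# dependents, instead of rebuilding the whole bundle.
cmake_minimum_required(VERSION 3.5)
project(bundle C CXX)

add_subdirectory(core)         # produces a 'core' library target
add_subdirectory(component_a)  # links against 'core'
add_subdirectory(calibration)  # small standalone calibration binary

# component_a/CMakeLists.txt could then look like:
#   add_library(component_a SHARED src/a.c)
#   target_link_libraries(component_a PRIVATE core)
#   target_include_directories(component_a PUBLIC include)
```

With this shape, the dependency graph between targets is what drives both the incremental rebuilds and the automatic dependency resolution listed above.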
Checked Build systems
Apart from many others, the following were checked:
Nota bene! The check did not consist of reading only a few online comparisons; it also involved going to the projects’ websites and checking more than a few sources to confirm the final conclusions. In addition, I periodically use makefiles and have worked with CMake, Eclipse, Visual Studio, QtCreator, QMake, and others. As a result, when I read articles I truly understand them, from the point of view of experience.
For the potential need of distributed compilation
- DistCC – provides distributed compilation across workstations. Unfortunately, it is available only on Unix-like systems, which means that on Windows a Cygwin setup would be required. It is open source, under the GNU GPL.
- https://electric-cloud.com/products/electricaccelerator/ – proprietary, paid
- https://www.incredibuild.com/ – proprietary, paid
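For orientation, DistCC is typically used by pointing the compiler invocation at a list of build hosts. A hedged command sketch (the host names are hypothetical, and distcc must already be installed on every machine):

```shell
# Tell distcc which machines may receive compile jobs:
export DISTCC_HOSTS="localhost build-host-1 build-host-2"

# Run a parallel build, routing each compile through distcc:
make -j12 CC="distcc gcc" CXX="distcc g++"
```

Only the preprocessed compilation is farmed out; linking still happens on the local machine.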
Build systems key points
Gradle – the choice of Google for Android. Better than Maven, but it is based on the Groovy language and is mostly used for Java applications. It lacks sufficiently deep documentation.
Premake – slow evolution, lack of support and community; the latest alpha release is over a year old, i.e. there are potential issues with its further development.
Qbs – due to some drawbacks of Qmake, this was an experiment for a new Qt build system. The current plan is potentially to discontinue its support.
Qmake – together with Qbs, both are useful mostly for Qt-based projects. There are often dependencies on the Qt version in use. Complex dependencies might be a problem – there are multiple reports on the web about this. Thus Qmake is definitely not the solution.
FASTBuild – this is a potential competitor to CMake. However, not enough information was found about it, even though its own documentation is quite good. Simply googling yields far more articles and discussions on CMake than on FASTBuild. On YouTube there is only one 10-minute video on FASTBuild, while CMake has several great videos (an hour or more long, from C++ conferences) on how to do things right.
Meson and Ninja – claimed to be faster than Make; in practice not much faster for complex projects. Meson uses a custom Python-like syntax.
In contrast to Make, Ninja lacks features such as string manipulation, as Ninja build files are not meant to be written by hand. Instead, a “build generator” should be used to generate the Ninja build files; CMake and Meson are popular build management tools that can emit them. Ninja is used for Google Chrome and Android, and is mostly driven by Google. In addition, it is used by some of the LLVM developers. So for the moment it is not a typical embedded or special-purpose build system, but mostly a regular PC application build system.
Meson might be a possible solution. However this article explains why CMake wins against it.
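To make the “build generator” relationship concrete: Ninja is normally driven through CMake (or Meson) rather than written by hand. A hedged command sketch, assuming CMake and Ninja are both installed:

```shell
# CMake acts as the generator and emits build.ninja into ./build:
cmake -G Ninja -S . -B build

# Ninja then drives the actual compilation from the generated files:
ninja -C build
```

The same CMakeLists.txt can thus target Makefiles, Ninja, or IDE project files just by changing the generator.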
So – CMake wins. Here are most of the reasons:
- It is the oldest of the tools presented in this list – its initial release was in 2001.
- As it is the oldest, and as it continually improves – CMake is widely used. In addition:
- After the introduction of the server mode, it got native support in QtCreator, CLion, Android Studio (NDK), and even Microsoft Visual Studio. Native means that you do not have to generate any intermediate project files – the CMakeLists.txt is used directly by the IDE.
- On the community side, it is used by e.g. KDE, OpenCV, zlib, libpng, FreeType and, as of recently, Boost. The fact that these projects use CMake not only guarantees that you can easily use them, but also that you can include them in your build via add_subdirectory so that they become part of your project. This is especially useful if you are cross-compiling – for instance for a Raspberry Pi.
- There were conceptual issues in the past, but starting with version 3.5 since April 2016 most of them were solved.
- As CMake continues to be extremely widely used (there are claims that it continues to be the most widely used build system!) – more people might be familiar with it. This is important when searching for collaborators, partners, new employees, etc.
- There are books, and a lot of tweaks and examples on the internet, for build speed optimization. There are also guides on how to architect the build correctly to achieve:
- Correct dependency resolution
- Decreased build times
- A correct build hierarchy
- A syntax that makes later addition of components fast and easy
- Clean inclusion of components
- Importing and exporting header file listings
- A lot of other nice features.
- There is also great online documentation, and even videos, on CMake tweaks.
- CMake is supported by default in the widest number of IDEs, as a native / built-in / default-selectable build system.
- CMake and MinGW are practically meant to work together.
- CMake supports cross-compilation for different targets.
- CMake allows addition of external commands.
- CMake projects are completely transferable to other platforms.
- The CMake platform itself is open source and is actively supported by its owner, the company Kitware. Kitware is involved in other big open source initiatives for libraries and projects. This means they have the intention of firmly continuing on the same route – improving CMake constantly and keeping it open source.
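Two of the points above – pulling in a CMake-based third-party project via add_subdirectory, and attaching external commands – can be illustrated with a hedged sketch (paths and the post-build copy step are hypothetical; the `zlib` target name is what zlib's own CMakeLists.txt defines):

```cmake
# Include a vendored, CMake-based library so it builds as part of this project:
add_subdirectory(third_party/zlib)   # zlib ships its own CMakeLists.txt

add_executable(app main.cpp)
target_link_libraries(app PRIVATE zlib)

# add_custom_command attaches an arbitrary external command to the build,
# e.g. copying the finished binary into a bundle directory:
add_custom_command(TARGET app POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy
            $<TARGET_FILE:app> ${CMAKE_BINARY_DIR}/bundle/)
```

Because the third-party code is an ordinary target, the same cross-compilation toolchain applies to it automatically – which is what makes this pattern convenient for targets like a Raspberry Pi.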
Finally – even though many of these features are partially or even fully supported by other platforms – CMake remains one of the best possible choices. All these characteristics are not present together in any other platform. A few features are missing, e.g. parallel building across multiple workstations. But thanks to the constant development and improvements, it might be relatively easy to achieve what you want with it, to submit a report or feature request, or even to prepare one yourself. And for parallel builds on multiple machines, on Unix-like systems there is already the DistCC tool.
This study was made in January 2019. In the future the situation might change – or your prerequisites might differ. I hope this article was useful for you.
Why CMake is practically not useful for small components
I made my initial investigation of the build tools driven by a simple request – I was asked which build tool I would recommend for a complex system of approx. 16 components, each potentially consisting of tens or hundreds of files. During my work it turned out that we would be working with generated code components, and that those would consist of no more than a few source files and about 10 headers each.
For those of you who are not familiar with Simulink, TargetLink, Matlab, and other code generation tools – these can implement complex, graphically designed algorithms in single or multiple files.
Normally the code is MISRA-checked, so one should not worry about code quality or, e.g., the handling of default cases in a switch statement. Those are handled by the model developers, and the tools usually provide some reporting of problems in the model design.
So – I started working with CMake, and it was quite easy for me to devise a simple CMakeLists.txt file with a few directives that generated makefiles containing the compiler definitions and general build variables. However, my team leader was somewhat against CMake (without explaining why), and asked me either to write the makefiles myself or, better, to use the Eclipse CDT plugin’s internal makefile generation. I was quite fond of CMake, but found out the following:
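A minimal CMakeLists.txt of the kind described above might look like this (the project and file names are hypothetical, used only to illustrate the scale):

```cmake
cmake_minimum_required(VERSION 3.5)
project(small_component C)

# Compiler definitions and general build variables:
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -Wall -O2")
add_definitions(-DSMALL_COMPONENT_BUILD)

# Two short sources and one header, as in the sample component:
add_executable(small_component src/main.c src/util.c)
target_include_directories(small_component PRIVATE include)
```

Even a file this short makes CMake generate its full cache, scripts, and makefile machinery – which is exactly the overhead discussed below.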
- For a simple sample component with two short sources and one header (under 50 lines of code in total), CMake generates an additional subfolder with a total of 90 files in 17 folders.
- The generated makefiles had over 10 recipes each, most of them unnecessary for the simple goal of producing a Windows console executable.
- The additional level of abstraction between the CMakeLists.txt and the compilation – with complex, generated, parameterized makefiles – was not helping. It actually somewhat hid what exactly happens inside the generated make system.
In addition, while attending the Embedded World 2019 fair in Nürnberg, I spoke with several colleagues on the matter. One of them was a lead engineer at Antmicro – an international company providing, among other products, BSPs (Board Support Packages) and custom Linux builds. He was one of the main people devising SW packages there, so I asked for his opinion and shared my experience. He said: yes, you are right – CMake is a great tool, but it doesn’t help for small amounts of sources.
He has made several tens of Linux- and Yocto-based BSPs and custom Linux builds over the last few years. He added: for any library that comes packaged with CMake, it is impractical to abandon it, and for any complex library or SW package it is really a great tool. But for any small library or package, it is nonsense to base the build on CMake. He did the same as I did – whenever he had a small component, he wrote the makefiles himself.
In conclusion – as you can see – although this article explains why CMake is one of the greatest tools around for automating build system generation, for small components it is always better to write the makefiles yourself.
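For comparison, a hand-written makefile for a component of the size described above can be a sketch as short as this (file names are hypothetical):

```make
# Minimal hand-written Makefile for a two-source console executable.
CC     := gcc
CFLAGS := -Wall -O2 -Iinclude

small_component.exe: main.o util.o
	$(CC) $(CFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

clean:
	rm -f *.o small_component.exe
```

Roughly a dozen lines, fully transparent about what the compiler is doing – versus the 90 generated files in 17 folders mentioned above.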