Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
On 17-Aug-16 04:29, Florent Castelli wrote:

The Boost source download is cached outside of the build directory in a unique folder. So it’s effectively only done once for all platforms and then reused.

This is true for local machines and for custom build servers like your personal Jenkins. For Travis/AppVeyor you have to create the root folder with third parties from scratch for each build (at least for free public accounts).

Yes. If you're using a free shared service, that's not something you can count on. If you host your CI, you can do neat tricks like this, use ccache or other similar tech.

What if you don't need "tricks" and everything is plain and simple?

You’ll also have symbols and the sources available for debugging properly, and they’re not always available with a binary distribution.

Just to clarify: with Hunter you're creating binaries from sources, so everything you install with `cmake --build _builds --target install` (or `./b2 install` for Boost) is available.

So you build each dependency separately, install them, and then use them in your top-level project.

No, into a shared root folder.

That works, but you have the extra overhead of the instrumentation for the whole build. Even a no-op build will have to recurse into the build folder for each dependency and run again, which is slow. This is why I prefer Ninja to the Makefile generator: it has a global view of everything and you get a quick no-op build.

Not sure I understand that. What is the overhead?

Of course building from source is not an option for monsters like Qt or OpenCV. Ready-to-use binaries are critical for real-life applications. There is no way to test everything on Travis/AppVeyor without this feature.

Well, you don’t have to use Travis or AppVeyor.

They are the most powerful and easiest-to-set-up services I know. If you know any better solutions, please share.

Well, you don't have full control over the environment, so I wouldn't say they're the most powerful.

Please share your solution.
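For readers unfamiliar with the super-build pattern under discussion (build each dependency separately, install it, then use it from the top-level project), a minimal sketch looks roughly like this; the package name, URL and layout are illustrative, not taken from the thread:

```cmake
# Hypothetical super-build: every dependency is built and installed into a
# shared prefix; the top-level project then finds them via CMAKE_PREFIX_PATH.
cmake_minimum_required(VERSION 3.0)
project(SuperBuild NONE)

include(ExternalProject)

set(DEPS_INSTALL_PREFIX "${CMAKE_BINARY_DIR}/_install")

ExternalProject_Add(zlib_external
  URL "https://example.com/zlib-1.2.8.tar.gz"  # placeholder URL
  CMAKE_ARGS "-DCMAKE_INSTALL_PREFIX=${DEPS_INSTALL_PREFIX}"
)

ExternalProject_Add(my_app
  SOURCE_DIR "${CMAKE_SOURCE_DIR}/src"
  CMAKE_ARGS "-DCMAKE_PREFIX_PATH=${DEPS_INSTALL_PREFIX}"
  DEPENDS zlib_external  # dependencies install before my_app configures
)
```

The no-op overhead being debated comes from each ExternalProject step invoking its own build tool on every build; Ninja's global view helps within one project but not across ExternalProject boundaries.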
I have worked with Jenkins before and would not say it's easily customizable. A shareable folder, yes, that's good, but the other stuff is a complete pain. You can set up Travis in seconds, add AppVeyor, and you get Linux, OSX and Windows testing. How much time will it take to start and tune a Jenkins master and connect several slaves with different OSes on them? Then add a bunch of projects and tune them, then create dev branches for testing, and so on.

Convenient, for sure. It probably fits smaller structures very well. Bigger companies have the resources to host their own service most of the time, and requirements that force them to do so. Those are probably the ones that will have the manpower to handle a super-build type build system.

Why not have both? Hunter can share downloads/root/cache in a local folder for CIs like Jenkins or a custom setup, so you don't need to rebuild everything. At the same time, binaries can be downloaded from a server for "build-from-scratch" environments like Travis. Anyway, what about users? Do you think it's okay to spend several hours building from sources just to run a simple example of your product?

Spotify isn’t at the same scale as most projects hosted there and we have different requirements and resources. Admittedly, Spotify doesn’t use Qt anymore, so this isn’t a problem for us.

It's not about Qt, it's about scalability. Use 20 smaller libraries and you will have much the same issues.

As I said before, if I have build scripts for 20 small libraries and I want to update a build flag affecting the ABI, I don't have to do anything but change the flag once. In your case, you'll have to tweak the 20 build scripts, one for each library, to reflect it. The dependencies are only intermediate products you use for the final one. I don't want to deal with them constantly.

That's not true: I don't need to tweak 20 scripts, I just need to tweak one toolchain file.
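The "one toolchain file" argument can be made concrete with a small sketch; the specific flags here are examples, not the settings actually used by either side of this discussion:

```cmake
# toolchain.cmake (illustrative): all ABI-affecting settings live in one place.
# Every dependency and the final project are configured with
#   cmake -DCMAKE_TOOLCHAIN_FILE=/path/to/toolchain.cmake ...
# so flipping one flag here rebuilds the whole dependency set consistently.
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_POSITION_INDEPENDENT_CODE ON)
set(CMAKE_CXX_FLAGS_INIT "-fsanitize=address")  # example ABI-affecting flag
```

Because the toolchain file is passed to every configure step, no per-library build script has to mention the flag at all.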
Note that by integrating everything in the same project, you also have proper dependencies and you will only build what you need. You may save some time by doing that. And caching is important, when done in the right way!

With Hunter you're installing only what you need too: it respects options like FOO_WITH_OPENSSL or FOO_WITH_QT, it downloads only binaries for the toolchain you're currently working on, etc. I don't want to make the discussion too broad. You said that it's hard to manage binaries for a lot of configurations; I'm saying that it's possible and is very handy.

I'm not saying it's impossible. I'm saying the overhead of managing binaries is just a burden we shouldn't have to accept in the C/C++ world. If you can build everything from source all the time in a super-build, why wouldn't you do it?

Because it's not practical. I have such experience with Gentoo; I prefer to do something useful instead of watching "emerge world" progress. A super-build doesn't scale, what if
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
On 17-Aug-16 08:36, Elizabeth A. Fischer wrote:

> > I don't think CMake is the best place to do it,
> Can you provide any details? I personally think that CMake is a
> natural and the only place where it should be done.

The most important reason here is because there are combinatorially many versions of a package you COULD install, depending on what versions of its dependencies you link with, and CMake provides nothing to address that issue.

CMake provides an abstraction: "slots" that you need to fill:

if(FOO_WITH_TESTS) # need to have GTest installed
  find_package(GTest)
endif()

if(FOO_WITH_OPENSSL) # need to have OpenSSL installed
  find_package(OpenSSL)
endif()

And it should drive the package manager. At least I find that approach natural and convenient, and see no problems with it. You can have as many combinations of versions/options/dependencies as you need:

* https://docs.hunter.sh/en/latest/overview/customization/hunter-id.html
* https://docs.hunter.sh/en/latest/overview/customization/config-id.html

See here for an overview of how Spack addresses the combinatorial versioning issue (which no other auto-builder does, to the best of my knowledge): http://hpcugent.github.io/easybuild/files/SC14_BoF_Spack.pdf

That's what I was talking about. I think there is no need to introduce new funky syntax like "spack install mpileaks@1.1.2 %gcc@4.7.3 +debug". We already have CMAKE_CXX_COMPILER and CMAKE_BUILD_TYPE/CMAKE_CONFIGURATION_TYPES. The version can be set by CMake options too. Effectively you can do:

option(FOO_STABLE_BUILD "Stable build" ON)
option(FOO_EXPERIMENTAL_BUILD "Experimental build" OFF)

if(APPLE AND IOS AND FOO_STABLE_BUILD)
  hunter_config(BooPackage VERSION "1.0")
endif()

if(WIN32 AND FOO_EXPERIMENTAL_BUILD)
  hunter_config(BooPackage VERSION "2.0-beta" CMAKE_ARGS BOO_NEW_STUFF=YES)
endif()

Once you've built something, it's nice to be able to re-use it.
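For context, the usual pattern for "CMake driving the package manager" in Hunter looks roughly like the sketch below; the URL and SHA1 are placeholders for a real release listed at https://github.com/ruslo/hunter/releases, and the package choice is an example:

```cmake
# Hunter is initialized via the HunterGate module, pinned to a release
# archive; packages are then requested from CMake itself and consumed
# through the ordinary find_package() mechanism.
cmake_minimum_required(VERSION 3.0)
include("cmake/HunterGate.cmake")
HunterGate(
    URL "https://github.com/ruslo/hunter/archive/vX.Y.Z.tar.gz"  # placeholder
    SHA1 "<sha1-of-archive>"                                     # placeholder
)

project(Foo)

option(FOO_WITH_OPENSSL "Build with OpenSSL support" ON)

if(FOO_WITH_OPENSSL)
  hunter_add_package(OpenSSL)      # builds from source or fetches binaries
  find_package(OpenSSL REQUIRED)   # fills the "slot" as usual
endif()
```

The point of the pattern is that the same CMakeLists.txt works whether the dependency comes from Hunter, the system, or a hand-built install; only the hunter_add_package() call is Hunter-specific.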
If I have a top-level CMake project that automatically builds three dependencies, will other projects be able to make use of those dependencies I've built?

Yes, libraries should be installed into the shared root, not into a local folder:

* https://docs.hunter.sh/en/latest/overview/shareable.html

Or do they become private?

No. Though you can make them private by setting a CMake variable; it will then use a separate directory, just like virtualenv does.

If libraries cannot be shared between applications, you will get a LOT of library bloat, especially among the low-level libraries that get repeated numerous times. Admittedly, this may not be such an issue in some environments where people are really only focused on building one thing. If you make a project, you might see it as a "top-level" project. But someone else might want to build something bigger on top of your work. You can never assume that "this package is top-level and no one will ever depend on it."

No issue here, see the notes above.

Another obvious problem with using CMake for everything is that not all packages come with CMake builds; most do not, in fact. Even if we CAN re-write all the builds into CMake, that is a lot of extra effort. As Florent has discovered, upstream authors do not always see a CMake build in a favorable light, and these re-worked builds are not always as functional as the original. Moreover... writing a Spack recipe is an order of magnitude easier than writing a CMake build. Usually, it's just a matter of calling `configure` or `cmake` with the right options.

Again, converting to CMake is the best option, but not the only possible one. E.g.
OpenSSL, Boost, and autotools-based packages like X11 can be used as-is:

* https://github.com/ruslo/hunter/blob/b4c370e32798cc3da74c37e4156c3bfc77add379/cmake/projects/Boost/hunter.cmake#L21
* https://github.com/ruslo/hunter/blob/b4c370e32798cc3da74c37e4156c3bfc77add379/cmake/projects/OpenSSL/hunter.cmake#L17
* https://github.com/ruslo/hunter/blob/b4c370e32798cc3da74c37e4156c3bfc77add379/cmake/projects/x11/hunter.cmake#L20

Although we can maybe imagine a world in which everyone eventually abandons Autotools for CMake, it is still not realistic to expect that Python, Haskell or Java programs will ever come with CMake builds. This would be OK if each language existed in its own silo. But they don't. Python packages (built with setuptools) routinely depend on C-based packages (built with Autotools or CMake). By being agnostic to the build system, auto-builders (like Spack, MacPorts, Homebrew, etc.) are able to make packages work together, regardless of the build system chosen for each one.

That's exactly what Hunter does, but using CMake as the driver.

In the sense that CMake is a Turing-complete language, there's no fundamental reason you CAN'T write a meta-builder in CMake. But gosh... the language sure is arcane (but still better than Autotools by a long shot). I like to imagine that if CMake were starting off today, it would be written in Python.

Language is a
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
Well, I tried upstreaming the new build scripts to some projects and it didn’t go well. Some of the reasons I’ve heard of: > I installed CMake 2.8.6 five years ago and I don’t want to update yet > again! People relying on old versions is quite common and any attempt > to raise the min version will be frowned upon (see the discussion in > the LLVM mailing lists for example). Spack is really good at installing dependencies, and makes this a LOT easier. In your Spack recipe, you just tell it which version of CMake your package needs. If Spack hasn't already built that version, it will download and install it for you. Building packages by hand, and configuring their dependencies, needs to go the way of stone spears. > We prefer to use autotools and don’t want to have to learn CMake. > That’s fair. But also, no one likes to build an autotools backed > project for Android or iOS. I suppose it's fair. But a Google search of "convert CMake to Autotools" results in 9:1 stories of people abandoning Autotools for CMake. Except for the fact that it works well for users, I can't say enough evil things about Autotools. Part of the benefit of Autotools is it "just works" without requiring the user to install anything. This benefit is of little value once you move to an auto-builder like Spack. The days when you can get any interesting software to work without installing a zillion dependencies first are long gone. > I’ve never heard of Spack before. It looks better than other solutions > I’ve seen before. The great and unique thing about Spack is it can install a zillion versions of each package. For example... if Package B uses MPI, I can build B-1.7 two (or more) times --- once with OpenMPI and once with MPICH. And I can install them side-by-side. If you change any of the dependencies of a package, Spack will see that as a new and separate version. 
Most auto-builders let you build one software distro, with only one build of each package (or sometimes one build per numerical version of the package or compiler or something). Spack's versioning is a lot more powerful. > But you still have to manage all the options from your build script Not sure what you mean by this. True, there is some redundancy building code. First you put the options and dependencies in a package's CMake build. And then you put them into the Spack build again. Some things could be simplified if we assumed our CMake-based packages would only ever be built with Spack. But we still need to create CMake-based software that can be installed by hand. Hence the redundancy between the CMake build scripts and the Spack package. In practice, this has not been the end of the world. Another nice thing about Spack is there is no difference between your libraries and Third-party libraries. > and publish the binaries somewhere. In its original incarnation, Spack builds from source. It does not publish or install from binary distros (because the build you asked for, with all its dependency variants, is likely not a build that's ever been built before). There's currently work on a project to use Spack to produce binary RPMs, and maybe other forms of binary distribution. > Then you need to teach your build scripts to get the right version. Your build scripts know nothing about Spack. Spack is an auto-builder that sits ON TOP of your build scripts. > I won’t trade my builds from source for a set of prebuilt binaries anytime soon I think :) Spack builds from source, it is not prebuilt binaries. > > I don't think CMake is the best place to do it, > Can you provide any details? I personally think that CMake is a > natural and the only place where it should be done. 
The most important reason here is because there are combinatorially many versions of a package you COULD install, depending on what versions of its dependencies you link with, and CMake provides nothing to address that issue. See here for an overview of how Spack addresses the combinatorial versioning issue (which no other auto-builder does, to the best of my knowledge): http://hpcugent.github.io/easybuild/files/SC14_BoF_Spack.pdf Once you've built something, it's nice to be able to re-use it. If I have a top-level CMake project that automatically builds three dependencies, will other projects be able to make use of those dependencies I've built? Or do they become private? If libraries cannot be shared between applications, you will get a LOT of library bloat, especially among the low-level libraries that get repeated numerous times. Admittedly, this may not be such an issue in some environments where people are really only focused on building one thing. If you make a project, you might see it as a "top-level" project. But someone else might want to build something bigger on top of your work. You can never assume that "this package is top-level and no one will ever depend on it." Another obvious problem with using CMake for everything is that not all packages
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
On 16-Aug-16 16:37, Florent Castelli wrote:

Well, I tried upstreaming the new build scripts to some projects and it didn’t go well. Some of the reasons I’ve heard:

- Windows developers don’t use CMake, they have project files in the repository. The CMake files for Windows will never be updated.

They can coexist; it's easier than maintaining forks. If only the C++ code changed, you get the new version "for free".

- I installed CMake 2.8.6 five years ago and I don’t want to update yet again! People relying on old versions is quite common and any attempt to raise the min version will be frowned upon (see the discussion on the LLVM mailing lists, for example).

You can add an `if(CMAKE_VERSION VERSION_LESS ...)` condition. It's hard to support such a hairy configuration, but anyway.

- We prefer to use autotools and don’t want to have to learn CMake. That’s fair. But also, no one likes to build an autotools-backed project for Android or iOS.

Just for your info, Hunter uses a build scheme for autotools projects: https://github.com/ruslo/hunter/blob/b4c370e32798cc3da74c37e4156c3bfc77add379/cmake/modules/hunter_autotools_project.cmake It can create universal iOS libraries and works for Android. A lot of effort by Alexandre Pretyman made this possible, and it has some peculiarities. I guess he can clarify anything if you need details.

Ruslo -- Powered by www.kitware.com Please keep messages on-topic and check the CMake FAQ at: http://www.cmake.org/Wiki/CMake_FAQ Kitware offers various services to support the CMake community. For more information on each offering, please visit: CMake Support: http://cmake.org/cmake/help/support.html CMake Consulting: http://cmake.org/cmake/help/consulting.html CMake Training Courses: http://cmake.org/cmake/help/training.html Visit other Kitware open-source projects at http://www.kitware.com/opensource/opensource.html Follow this link to subscribe/unsubscribe: http://public.kitware.com/mailman/listinfo/cmake
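The `if(CMAKE_VERSION VERSION_LESS ...)` approach mentioned in the reply above can be sketched like this; the guarded feature is just an illustrative example:

```cmake
# Illustrative: stay compatible with old CMake installs while using
# newer per-target features when they are available.
cmake_minimum_required(VERSION 2.8.6)
project(Foo)

add_library(foo foo.cpp)

if(CMAKE_VERSION VERSION_LESS 2.8.11)
  # old style: include paths leak to every target in the directory
  include_directories("${CMAKE_CURRENT_SOURCE_DIR}/include")
else()
  # modern style: usage requirements are attached to the target
  target_include_directories(foo PUBLIC "${CMAKE_CURRENT_SOURCE_DIR}/include")
endif()
```

This is exactly the "hairy configuration" trade-off: the project keeps working on the old minimum version, at the cost of maintaining two code paths.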
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
On 16-Aug-16 17:04, Florent Castelli wrote:

On 16 Aug 2016, at 15:29, Ruslan Baratov wrote:

On 16-Aug-16 13:52, Florent Castelli wrote:

For example, Boost is used by 5 platforms: Windows, OSX, Linux, Android and iOS. Each platform has a different CPU target (or many: 32/64-bit, x86/ARM). Each platform has many compilers. Some platforms have instrumentation options (Debug/Release, ASan, MSan…) and really need to be compiled properly, otherwise you’ll end up with false positives. The matrix of builds is REALLY hard to track. Each time we update Boost, we had to update a lot of things.

Not a problem for Hunter. Linux, OSX, Windows, iOS 9.3, iOS 8.2, Android, GCC, Clang, ASan, LeakSan, ThreadSan, Static Analyzer, libstdc++, libc++, MinGW, Visual Studio 2008-2015:

* https://travis-ci.org/ingenue/hunter/builds/140317830
* https://ci.appveyor.com/project/ingenue/hunter/build/1.0.665

This list is not even full; I guess I can add more toolchains in future (GCC variations and C++ standards). To test the whole matrix I need to push one commit to the pkg.boost branch; to upload binaries to the server I need to push one commit to the upload.boost branch (ALL toolchains uploaded in one shot). To reuse all updates, users just need to set the new URL/SHA1 of the HunterGate module: https://github.com/ruslo/hunter/releases

Overall, building boost takes 10s on our developers’ machines. The sources aren’t changed often, so the cost is pretty low.

What kind of hardware do they have? And what libraries do you mean? It takes about 20 seconds on my Linux machine just to unpack the 80 MB Boost release archive. It's even worse on Windows, where it takes several minutes for some strange reason, even on an SSD + Core i7. Using binaries in such cases is a huge time saver: there is no need to compile anything, and you get none of the junk they put into the release archive (if you remove docs and tests, 80 MB becomes 15 MB).

I consider the time to download Boost isn’t part of the build.

I'm not counting that.
It takes 20 seconds just to unpack an archive that has already been downloaded. Anyway, I'm just wondering what is possible to do with Boost in 10 seconds. Install header-only libraries? Build 1-2 libraries? All of them? :)

The Boost source download is cached outside of the build directory in a unique folder. So it’s effectively only done once for all platforms and then reused.

This is true for local machines and for custom build servers like your personal Jenkins. For Travis/AppVeyor you have to create the root folder with third parties from scratch for each build (at least for free public accounts).

You’ll also have symbols and the sources available for debugging properly, and they’re not always available with a binary distribution.

Just to clarify: with Hunter you're creating binaries from sources, so everything you install with `cmake --build _builds --target install` (or `./b2 install` for Boost) is available.

Of course building from source is not an option for monsters like Qt or OpenCV. Ready-to-use binaries are critical for real-life applications. There is no way to test everything on Travis/AppVeyor without this feature.

Well, you don’t have to use Travis or AppVeyor.

They are the most powerful and easiest-to-set-up services I know. If you know any better solutions, please share.

Spotify isn’t at the same scale as most projects hosted there and we have different requirements and resources. Admittedly, Spotify doesn’t use Qt anymore, so this isn’t a problem for us.

It's not about Qt, it's about scalability. Use 20 smaller libraries and you will have much the same issues.

Note that by integrating everything in the same project, you also have proper dependencies and you will only build what you need. You may save some time by doing that. And caching is important, when done in the right way!
With Hunter you're installing only what you need too: it respects options like FOO_WITH_OPENSSL or FOO_WITH_QT, it downloads only binaries for the toolchain you're currently working on, etc. I don't want to make the discussion too broad. You said that it's hard to manage binaries for a lot of configurations; I'm saying that it's possible and is very handy.

Ruslo
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
On 16-Aug-16 13:52, Florent Castelli wrote:

For example, Boost is used by 5 platforms: Windows, OSX, Linux, Android and iOS. Each platform has a different CPU target (or many: 32/64-bit, x86/ARM). Each platform has many compilers. Some platforms have instrumentation options (Debug/Release, ASan, MSan…) and really need to be compiled properly, otherwise you’ll end up with false positives. The matrix of builds is REALLY hard to track. Each time we update Boost, we had to update a lot of things.

Not a problem for Hunter. Linux, OSX, Windows, iOS 9.3, iOS 8.2, Android, GCC, Clang, ASan, LeakSan, ThreadSan, Static Analyzer, libstdc++, libc++, MinGW, Visual Studio 2008-2015:

* https://travis-ci.org/ingenue/hunter/builds/140317830
* https://ci.appveyor.com/project/ingenue/hunter/build/1.0.665

This list is not even full; I guess I can add more toolchains in future (GCC variations and C++ standards). To test the whole matrix I need to push one commit to the pkg.boost branch; to upload binaries to the server I need to push one commit to the upload.boost branch (ALL toolchains uploaded in one shot). To reuse all updates, users just need to set the new URL/SHA1 of the HunterGate module: https://github.com/ruslo/hunter/releases

Overall, building boost takes 10s on our developers’ machines. The sources aren’t changed often, so the cost is pretty low.

What kind of hardware do they have? And what libraries do you mean? It takes about 20 seconds on my Linux machine just to unpack the 80 MB Boost release archive. It's even worse on Windows, where it takes several minutes for some strange reason, even on an SSD + Core i7. Using binaries in such cases is a huge time saver: there is no need to compile anything, and you get none of the junk they put into the release archive (if you remove docs and tests, 80 MB becomes 15 MB).

Of course building from source is not an option for monsters like Qt or OpenCV. Ready-to-use binaries are critical for real-life applications.
There is no way to test everything on Travis/AppVeyor without this feature.

Ruslo
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
CMake builds for existing libraries are certainly an interesting and useful thing, and deserve to be posted in a GitHub repo somewhere. They should also serve as the basis of a campaign to get the library authors to incorporate the CMake build directly in their repos. But any approach that requires every build to be ported to CMake will be difficult and labor-prone to scale. Writing a meta-build recipe is usually much easier. Spack handles the combinatorial dependencies you mention in a sophisticated, graceful way that most meta-builders do not. Its only problem is it does not (yet) run on Windows. There's no fundamental reason why not; we just need someone to get involved and start trying it on Windows. -- Elizabeth On Tue, Aug 16, 2016 at 6:52 AM, Florent Castelli < florent.caste...@gmail.com> wrote: > At Spotify, we use CMake a lot for our large C++ library shared by all the > clients. > After trying to build libraries for each platform and variant, we > basically gave up and we now > use a super-build approach. > > For example, Boost is used by 5 platforms: Windows, OSX, Linux, Android > and iOS. > Each platform has a different CPU target (or many 32/64bit, x86/ARM). > Each platform has many compilers. > Some platforms have instrumentation options (Debug / Release, ASan, MSan…) > and really need > to be compiled properly, otherwise you’ll end up with false positives. > The matrix of builds is REALLY hard to track. Each time we update Boost, > we had to update > a lot of things. > I tried using ExternalProject and use b2 (build tool from Boost) to build > it and instead of having > lots of build jobs with a mirror of the flags, you end up mirroring the > flags in your CMake files > instead, which is still not good enough. > > In the end, I looked at how Boost is actually built. And for most > libraries, it’s plain simple. > A static library with a few files, some define, sometimes a platform > specific source file. 
> What if instead of having an external build tool, I built it from CMake > instead? > It would propagate all the build flags, target, instrumentation and > compiler information from the main > build to it and just work. > I tried it and it worked in no time! We replaced our Boost 1.59 binary > distribution with the source > distribution and it’s much easier. When people build our library for a > different target, they don’t have > to download new binaries, they just reuse the same sources. > Later on, we found a bug in Boost 1.59 (fixed in later versions) and > patched it. We updated our source > bundle and everything was smooth. > Much later on, we wanted to use 1.61. We just updated the source bundle > again, the list of source > files or compilation flags for the libraries we use didn’t change. It was > again effortless. > > Overall, building boost takes 10s on our developers’ machines. The sources > aren’t changed often, > so the cost is pretty low. It needs attention when we upgrade it, but > that’s quite rare. > > We try now to use the same approach for other libraries when we add them. > Some of them are > already using CMake and it’s somewhat easier, but since most people still > target version 2.8 (or 2.6...), > we find it better to rewrite the build scripts ourselves and use modern > features (as in, everything is > a target that propagates requirements, we don’t propagate variables). > It makes it also much easier to build a library for another platform that > wasn’t targeted by the original > project. > > If people are interested, I could share the CMakeLists.txt file we use for > Boost. It doesn’t build all > the libraries (some are hard like Context) and uses some internal macros, > but it should be plain > simple to tweak for your use. > > /Florent > > > On 12 Aug 2016, at 21:59, Robert Dailey> wrote: > > > > Hello, > > > > There is an internal C++ product at the company I work for which I > > have written a series of CMake scripts for. 
This project actually has > > dependencies on several open source libraries, such as boost, > > freetype, openssl, etc. > > > > Right now what we do is build each of these third party libraries *by > > hand*, once for every platform we support (Windows, Linux x86, Android > > NDK). Then we stuff the includes (headers) and libraries > > (static/shared) in a submodule and the primary code base's CMake > > scripts pull them in as interface targets. > > > > This works well and is light-weight but is a pain when upgrading or > > changing libraries. It's a pain because if I want to upgrade boost, I > > have to build it up to 6 times (once for each platform and once for > > each configuration). > > > > I've been thinking of a different approach for a while. I've done some > > toying around with the "Super Build" concept, where I have a separate > > CMake project that does nothing but use the ExternalProject module to > > build libraries in real time along with our project. So the order of > > operations would be
Re: [CMake] [cmake-developers] Need ideas/opinions on third party library management
Very interesting discussion, we have the same issues here. Florent Castelli, how many third-party libraries do you use? I think a super-build can be a very good solution, but I'm wondering how much third-party code you have to build. Here we use OpenCV, boost, poco, and other things... so it may take too long. I was personally considering a hybrid solution: include small libraries (like jsoncpp) and pre-build the others for each platform. 2016-08-16 12:52 GMT+02:00 Florent Castelli: > At Spotify, we use CMake a lot for our large C++ library shared by all the > clients. > After trying to build libraries for each platform and variant, we > basically gave up and we now > use a super-build approach. > > For example, Boost is used by 5 platforms: Windows, OSX, Linux, Android > and iOS. > Each platform has a different CPU target (or many 32/64bit, x86/ARM). > Each platform has many compilers. > Some platforms have instrumentation options (Debug / Release, ASan, MSan…) > and really need > to be compiled properly, otherwise you’ll end up with false positives. > The matrix of builds is REALLY hard to track. Each time we update Boost, > we had to update > a lot of things. > I tried using ExternalProject and use b2 (build tool from Boost) to build > it and instead of having > lots of build jobs with a mirror of the flags, you end up mirroring the > flags in your CMake files > instead, which is still not good enough. > > In the end, I looked at how Boost is actually built. And for most > libraries, it’s plain simple. > A static library with a few files, some define, sometimes a platform > specific source file. > What if instead of having an external build tool, I built it from CMake > instead? > It would propagate all the build flags, target, instrumentation and > compiler information from the main > build to it and just work. > I tried it and it worked in no time!
We replaced our Boost 1.59 binary > distribution with the source > distribution and it’s much easier. When people build our library for a > different target, they don’t have > to download new binaries, they just reuse the same sources. > Later on, we found a bug in Boost 1.59 (fixed in later versions) and > patched it. We updated our source > bundle and everything was smooth. > Much later on, we wanted to use 1.61. We just updated the source bundle > again, the list of source > files or compilation flags for the libraries we use didn’t change. It was > again effortless. > > Overall, building boost takes 10s on our developers’ machines. The sources > aren’t changed often, > so the cost is pretty low. It needs attention when we upgrade it, but > that’s quite rare. > > We try now to use the same approach for other libraries when we add them. > Some of them are > already using CMake and it’s somewhat easier, but since most people still > target version 2.8 (or 2.6...), > we find it better to rewrite the build scripts ourselves and use modern > features (as in, everything is > a target that propagates requirements, we don’t propagate variables). > It makes it also much easier to build a library for another platform that > wasn’t targeted by the original > project. > > If people are interested, I could share the CMakeLists.txt file we use for > Boost. It doesn’t build all > the libraries (some are hard like Context) and uses some internal macros, > but it should be plain > simple to tweak for your use. > > /Florent > > > On 12 Aug 2016, at 21:59, Robert Dailey > wrote: > > > > Hello, > > > > There is an internal C++ product at the company I work for which I > > have written a series of CMake scripts for. This project actually has > > dependencies on several open source libraries, such as boost, > > freetype, openssl, etc. 
> > > > Right now what we do is build each of these third party libraries *by > > hand*, once for every platform we support (Windows, Linux x86, Android > > NDK). Then we stuff the includes (headers) and libraries > > (static/shared) in a submodule and the primary code base's CMake > > scripts pull them in as interface targets. > > > > This works well and is light-weight but is a pain when upgrading or > > changing libraries. It's a pain because if I want to upgrade boost, I > > have to build it up to 6 times (once for each platform and once for > > each configuration). > > > > I've been thinking of a different approach for a while. I've done some > > toying around with the "Super Build" concept, where I have a separate > > CMake project that does nothing but use the ExternalProject module to > > build libraries in real time along with our project. So the order of > > operations would be as follows (for our automated build server): > > > > 1. Clone our "Third Party" repository > > 2. Use CMake to generate & build the "Super Build" project (this > > builds boost, openssl, freetype, etc for the current platform). > > 3. Clone the main code base's repository > > 4. Use