Last Updated: February 25, 2016
daniperez

Use CMake-enabled libraries in your CMake project (II)


Summary

In this protip I analyze the drawbacks of using external projects to implement multi-library projects.

Note: A previous article (I) and a follow-up to this one (III) exist.


In the previous article on this subject I showed a way of implementing multi-library projects by means of ExternalProject_Add and the CMake export registry. The solution allowed us to include third-party library sources as if they were installed in the system (i.e. with find_package). The solution does not come free of drawbacks, though. After using this implementation for a while, I found some disappointing effects.

Drawbacks

The B and C sub-libraries (see the previous article) are built as external projects. External projects are configured at execution time. Indeed, the CMake workflow consists roughly of two steps: configuration time and execution time. Configuration time is when cmake processes our CMakeLists.txt. Execution time is when we run make or whatever build tool we chose. As a result, what is explained in my previous article won't work as expected:

  • Cannot reference targets in the sub-libraries: this is something we already knew and it's a benefit/drawback of using external projects. We are obliged to use configuration files, which leads me to my next point.
  • Configuration files (*-config.cmake) of B and C are not written while A is being configured: indeed, since external projects are configured at execution time, the configuration files of B and C won't exist at A's configuration time, resulting in find_package(B) and find_package(C) not working.
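To make the second point concrete, here is a minimal sketch of what A's CMakeLists.txt would attempt (the project names follow the article; everything else is an assumption):

```cmake
# Hypothetical fragment of A's CMakeLists.txt. At A's configuration time,
# the external projects B and C have not been built yet, so their
# B-config.cmake / C-config.cmake files do not exist and both calls fail.
find_package(B REQUIRED)
find_package(C REQUIRED)
```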

Solution

Assuming the first problem cannot be solved, in order to solve the second we can do two things (both discussed in this post). One solution is to implement A as yet another external project, sometimes referred to as the "superbuild" approach, since our top-most CMakeLists.txt does nothing but orchestrate several external projects, including our A root project:

CMakeLists.txt // the "superbuild" CMakeLists
B/
  CMakeLists.txt
C/
  CMakeLists.txt  // depends on B
A/
  CMakeLists.txt  // depends on B and C

We have only to ensure, in the root CMakeLists.txt, that A is compiled after C, and C in turn after B:

ExternalProject_Add( B ... )
ExternalProject_Add( C ... DEPENDS B )
ExternalProject_Add( A ... DEPENDS B C )
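Fleshed out, the superbuild CMakeLists.txt could look like the following sketch. The source-directory layout matches the listing above, but the specific ExternalProject_Add options and the minimum CMake version are assumptions:

```cmake
cmake_minimum_required(VERSION 3.0)
project(superbuild NONE)

include(ExternalProject)

# B has no dependencies.
ExternalProject_Add(B
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/B
  INSTALL_COMMAND "")

# C is configured and built only after B has been built.
ExternalProject_Add(C
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/C
  DEPENDS         B
  INSTALL_COMMAND "")

# A comes last, once both B and C are available.
ExternalProject_Add(A
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/A
  DEPENDS         B C
  INSTALL_COMMAND "")
```

The key property is that A is now configured at the superbuild's execution time, i.e. after B and C have been built and their configuration files written, so find_package works inside A.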

Another solution is to keep A as the root project and make use of imported libraries, provided we know where B and C are created. ExternalProject_Get_Property gives us some clues, but eventually we'll have to hard-code the path to them. This solution is also depicted in the previous post and here. I don't like it that much since it's a bit hacky, but I must admit it's simple.
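A sketch of that imported-library approach, assuming B is built as a static library by ExternalProject_Add into its default prefix (the hard-coded path and the target names B_imported and A_exe are assumptions for illustration):

```cmake
# B was declared earlier in this CMakeLists.txt with ExternalProject_Add(B ...).
# Create an IMPORTED target pointing at the artifact B will produce; the
# path is hard-coded and must match ExternalProject's actual layout.
add_library(B_imported STATIC IMPORTED)
set_target_properties(B_imported PROPERTIES
  IMPORTED_LOCATION ${CMAKE_BINARY_DIR}/B-prefix/src/B-build/libB.a)

add_executable(A_exe main.cpp)
# Ensure the external project B is built before A_exe links against it.
add_dependencies(A_exe B)
target_link_libraries(A_exe B_imported)
```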

Yeah... a bit disappointing. How do you deal with multi-library projects? Let me know in the comments! I hope this protip was useful!

5 Responses

I've come to quite the same solution, using hard-coded paths given by ExternalProject_Get_Property, but I agree, it's not very nice. I'll check the imported libraries stuff you mentioned.

The problem when doing it this way with Git is the large amount of "useless" configuration/orchestration/super-build repositories including the others as submodules, and the fact that child repositories won't be standalone anymore…
Here, for instance, you'd need A.git, B.git, C.git, CBuild.git, ABuild.git, and the presented use case is quite simple!

When you do not control the creation of your server repositories, it can be a pain.
For your users/newcomers, it requires extra wiki documentation to explain this fact.

For GTest, which is built using CMake, I use ExternalProject since I do not modify its sources when working on my depending code, but for projects where both sides need modifying, I proceed with submodules and a top-level configuration repository.

The ideal solution here is to deal with already-built artifacts stored in an artifact repository. This is the preferred and effective way used in the Java world with Ivy or Maven.
The difficulty in the C/C++ world is keeping track of compilation options, which are very important for depending code.

over 1 year ago ·

I totally agree with you. Too much boilerplate for something that should be easier. The thing I don't understand is the export(TARGETS) and install(EXPORT) asymmetry. I would prefer a single way of doing things. I wouldn't even mind if it weren't flexible and didn't cope with 100% of scenarios, as long as it copes with a good majority. And using other libraries not installed in the system is one of them, in my honest opinion.

I understand your pain if, in addition to that mess, you have to manage the git submodules as well :-) In my case, I let ExternalProject_Add download them for me; I found that easier than having to deal with git submodules.

About your last paragraph, I think every C++ programmer (including myself) has at least once wanted to have something like Maven for dependencies. I understand it's C++'s unicorn, but I think it can be feasible under some assumptions.

I also find it very interesting how development will change as Docker evolves and improves. With Docker you set the Linux distro beforehand, so no more configuration step or multi-platform madness. And the dependencies are managed by the distribution (e.g. yum in Fedora). Looking forward to it :-)

I'll check GTest's build! Thanks!

over 1 year ago ·

I'm currently improving my understanding of CMake and C++ (I come from the Java/Ivy world!). I had bootstrapped a toy project to test and learn good practices for CMake and C++ library definition. You will find how I deal with Google Test there: https://github.com/opatry/cpp-cmake-template/blob/master/test

For the artifact-management flow in C++, there is certainly something to do, but, according to my searches, it's not straightforward. As mentioned previously, the compilation options and defines can be a pain to maintain.
Moreover, the split between library binaries and library headers is not as simple as Java's single JAR files with extra META-INF available if needed.

There is some tooling around that, like NuGet, but it seems very Microsoft-oriented (combined with CoApp?).

I found another very interesting one but can't remember its name (something like rydle?), and it seems to be a dead project, if I remember correctly.

over 1 year ago ·

Thanks for the tips! I've been using CMake for a while; only now was I trying to push it a bit further than usual. You can find my latest experiments in the 'cmake-way' branch of my 'ortul' package. It's a project I'm working on which depends on Google's OR-TOOL and lazylpsolver.

The project is in its very early stages; I was only playing with packaging.

over 1 year ago ·

(Hopefully) the last article in this series has been written:
https://coderwall.com/p/qej45g

over 1 year ago ·