I have a large project, roughly 150,000 lines of C++ code, with a full build time of about 15 minutes. The project consists of many sub-projects of different sizes.
I have built a separate precompiled header for each sub-project, but using them barely changes the build time: it seems to be only 5-10% lower, not more.
The precompiled headers are definitely being used: I compile with `-Winvalid-pch`, and when I pass the `-H` compiler option my precompiled headers appear in the output marked with a bang (`!`), which means the compiler is able to use them.
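For reference, this is roughly how I build and use them (`pch.h` and `foo.cpp` are placeholder names, and `$CXXFLAGS` stands for my real flags; GCC only accepts a `.gch` that was built with the same flags as the translation unit):

```
# generate the precompiled header for one sub-project
g++ -x c++-header $CXXFLAGS pch.h -o pch.h.gch

# compile a translation unit against it: -Winvalid-pch warns if the
# .gch is rejected, -H lists included headers ('!' marks a PCH hit)
g++ $CXXFLAGS -Winvalid-pch -H -include pch.h -c foo.cpp -o foo.o
```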
None of my precompiled headers is very large; each file is around 50 MB. I use a Python script, found here, to generate a list of the most frequently included headers, so my list of precompilation candidates should be quite good.
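The idea the script implements is essentially this (a rough shell equivalent; the file extensions are my assumption):

```
# count how often each header is #include'd across the tree;
# the most frequent ones are the best precompilation candidates
grep -rhE '^[[:space:]]*#include' --include='*.cpp' --include='*.h' --include='*.hpp' . \
    | sort | uniq -c | sort -rn | head -20
```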
Are there any free/open-source tools for build-time profiling? The standard `make` utility does not seem to be able to measure the build times of individual targets, and I can't find any way to get per-target statistics out of it. I'm not talking about dependency analysis or anything advanced; I just want to know which targets most of the time is spent on.
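The closest I can get is something crude like this (directory names are just examples, and the second variant assumes the Makefiles invoke the compiler via `$(CXX)`; `/usr/bin/time -f` is GNU time):

```
# time each sub-project separately from the top level
for d in core gui net; do
    printf '%s: ' "$d"
    ( cd "$d" && /usr/bin/time -f '%e s' make -s )
done

# or wrap the compiler so every object file gets timed individually
make CXX='/usr/bin/time -f "%es  %C" g++'
```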
Also, GCC seems to be quite inefficient at dealing with precompiled headers. I was unable to make any sub-project build notably faster; the maximum speedup I got was 20%, on a sub-project that takes three minutes to build. It seems it would be easier and cheaper to buy a faster machine with a solid-state drive than to optimize build times on Linux with GCC.