I have a large GPR-based project that can take over 30 minutes to compile.

Having analyzed the build process, I noticed many obvious inefficiencies (multiple calls to gprbuild rather than aggregate projects, excessive use of alternative files rather than configurations, etc.). I am wondering whether there is some way to 'profile' the build process to see what takes so long.
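For reference, the kind of aggregate project I mean looks roughly like this (the file names are made up):

    -- build_all.gpr: one gprbuild call builds every listed root project
    aggregate project Build_All is
       for Project_Files use ("server/server.gpr", "client/client.gpr");
    end Build_All;

Building with "gprbuild -P build_all.gpr" then parses the project closure once and drives all the compilations from a single invocation, instead of one gprbuild call per project.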

In particular, it takes about 5 minutes to recompile when even a single file changes and contains an error. In theory it should be quick to realize that only that file needs to be recompiled (it's the only one that does) and to start compiling it, rapidly discovering the error.

From the verbose output, it looks like gprbuild spends quite a while just parsing the massive web of .gpr files that define the build, but I would like to know where most of the time actually goes.

Thus my question is: Is it possible to profile a build done by gprbuild? If so, how?

It is my experience that gprbuild has a hard time managing lots of source files with individually defined switches. I never cared enough to check whether the problem was specific to defining switches on a per-source-file basis, or whether plain project-file size was the problem. – Jacob Sparre Andersen

1 Answer

From low to high complexity (all three options are sketched after the list):

  • Ask gprbuild to report more detail about what it is doing by passing the -vh (high verbosity) flag.
  • Run gprbuild through strace to see where the wall-clock time goes at the system-call level.
  • Rebuild gprbuild with the flags required to profile it using gprof (but be aware that gprof doesn't always tell the truth).
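A minimal sketch of all three, assuming a Unix-like system and a root project called main.gpr (the project name is a placeholder for your own):

    # 1. High verbosity: gprbuild logs project parsing and each step it
    #    performs; time(1) around it gives the overall wall-clock cost
    #    so runs can be compared.
    time gprbuild -vh -P main.gpr

    # 2. System-call summary: -f follows the child compiler processes,
    #    -c tallies time per syscall (use -T -tt instead of -c for a
    #    full timestamped trace).
    strace -f -c gprbuild -P main.gpr

    # 3. Profile gprbuild itself: rebuild it from source with GCC's
    #    standard -pg profiling switch (the exact build recipe depends
    #    on your checkout), run the slow build once, then read the
    #    gmon.out profile it leaves behind:
    gprof $(which gprbuild) gmon.out

If most of the time turns out to be spent before the first compiler process is even spawned, the bottleneck is likely the parsing of the web of .gpr files rather than the compilation itself.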