
Background

I am using Jenkins with the Build Pipeline plugin to build some fairly complicated projects that require multiple compilation steps:

  1. Build source RPM.
  2. Build binary RPM (this is performed twice, once for each platform).
  3. Deploy to YUM repository.

My strategy for meeting these build requirements is to split the common work into parameterized jobs that can be reused across projects and branches, with each job representing one stage in the pipeline. Each stage is triggered with parameters, and build artifacts are passed along to the next job in the pipeline. However, I'm having some trouble with this strategy, and could really use some tips on how to solve this problem in the most elegant and flexible way possible.
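Here is a rough sketch of what I had in mind for one such stage, using the Job DSL plugin's Groovy syntax (the job names, parameters, paths and commands are just placeholders, not my real configuration):

    // Hypothetical stage job (Job DSL plugin, Groovy); names and commands
    // are illustrative only.
    job('build-srpm') {
        parameters {
            stringParam('GIT_URL', '', 'Repository to build')
            stringParam('BRANCH', 'develop', 'Branch to build')
        }
        scm {
            git('$GIT_URL', '$BRANCH')
        }
        steps {
            // Stage 1: build the source RPM.
            shell('rpmbuild -bs package.spec')
        }
        publishers {
            // Archive the source RPM so downstream stages can copy it in.
            archiveArtifacts('SRPMS/*.src.rpm')
            // Trigger the next stage, passing the same parameters along.
            downstreamParameterized {
                trigger('build-binary-rpm') {
                    parameters {
                        currentBuild()
                        // A second trigger with PLATFORM=el7 would be analogous.
                        predefinedProp('PLATFORM', 'el6')
                    }
                }
            }
        }
    }

The downstream job would then use the Copy Artifact plugin to pull in SRPMS/*.src.rpm from the upstream build.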

To be more specific, there are two common libraries, which are shared by some (but not all) of the other projects. The libraries are built differently from the projects that depend on them, but it should not be necessary to specify in Jenkins what the dependent projects are.

There are multiple branches: the master branch (rebuilt nightly), the develop branch (polled for changes), feature branches (also polled), and release branches (polled, but built for release). These branches are built the same way across multiple projects.
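As far as I can tell, the per-branch differences mostly come down to the trigger, so something along these lines might cover it with the Job DSL plugin (branch names and schedules are placeholders):

    // Hypothetical: generate the same stage job per branch, varying only the trigger.
    ['master', 'develop'].each { branch ->
        job("myproject-${branch}-build-srpm") {
            triggers {
                if (branch == 'master') {
                    cron('H 2 * * *')      // master: rebuilt nightly
                } else {
                    scm('H/15 * * * *')    // other branches: polled for changes
                }
            }
            // ...rest of the stage definition as above...
        }
    }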

We create multiple repositories every month, and whilst it is reasonable to expect a little setup for a new project, generally we want this to be as simple and automated as possible.

The Problems

  1. I have many projects with multiple branches, and I do not wish to build all branches, or even all projects, in the same way. Because most of the build steps are similar, I can turn these common steps into parameterized build jobs and get each job to trigger the next in the chain, passing parameters and build artifacts along the chain. However, this falls apart if one of the steps needs to be skipped, because I don't know of a way to conditionally skip a build step. This implies I would need to copy the build jobs so that I can customise them for each pipeline, resulting in a very large number of build jobs. I could use a combination of plugins to create a job generator (e.g. the Build Flow and Job DSL plugins) and hide as much as possible from the users (see the first sketch after this list), but what's the most elegant Jenkins solution to this? Are there any plugins or examples that I might have missed? What's your experience of doing this?

  2. Because step 2 can be split into two jobs that run in parallel, this introduces a complexity that is causing me problems with my pipeline. My first attempt was to trigger a parameterized build job twice with different parameters and then join the jobs afterwards using the Join plugin, but it was beginning to look complicated to copy in the build artifacts from the two upstream jobs. This matters because I need the build artifacts from both jobs for stage 3 (see the second sketch after this list). What's the most elegant way to join parallel jobs and copy artifacts from all of them? Are there any examples that I might have missed?

  3. I need to combine the test results generated by both of the jobs in stage 2 and copy them to the job that triggers the build (see the third sketch after this list). What's the best way to handle this?
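To make these problems more concrete, here are some rough sketches of what I have been experimenting with; all job names, parameters, paths and commands below are placeholders. For problem 1, a seed job using the Job DSL plugin could generate only the stages each project actually needs, instead of conditionally skipping steps at run time:

    // Hypothetical seed job (Job DSL plugin): generate per-project pipelines
    // and simply omit the stages a project does not need.
    def projects = [
        [name: 'libcommon',   deploy: true],
        [name: 'feature-app', deploy: false],
    ]
    projects.each { p ->
        job("${p.name}-build-srpm") {
            steps { shell('rpmbuild -bs package.spec') }
        }
        job("${p.name}-build-binary-rpm") {
            steps { shell('rpmbuild --rebuild *.src.rpm') }
        }
        if (p.deploy) {
            // Only projects flagged for deployment get the YUM stage.
            job("${p.name}-deploy-yum") {
                steps { shell('./deploy-to-yum.sh') }
            }
        }
    }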
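For problem 2, the Build Flow plugin's DSL seems able to express the fork/join directly and capture the build numbers of the parallel builds, so that the deploy job can copy artifacts from those specific builds:

    // Hypothetical Build Flow DSL: run the two binary-RPM builds in parallel,
    // then pass both build numbers to the deploy stage.
    def el6, el7
    parallel(
        { el6 = build('build-binary-rpm', PLATFORM: 'el6') },
        { el7 = build('build-binary-rpm', PLATFORM: 'el7') }
    )
    build('deploy-yum',
          EL6_BUILD: el6.build.number,
          EL7_BUILD: el7.build.number)

The deploy job would then use the Copy Artifact plugin's specific-build selector, pointed at $EL6_BUILD and $EL7_BUILD, to pull in the artifacts from both parallel builds.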
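For problem 3, the same build numbers could be used by the joining job to copy the test reports from both stage-2 builds (assuming those jobs archive the reports as artifacts) and publish them together, e.g. with the Job DSL plugin:

    // Hypothetical joining job: copy test reports from both upstream builds
    // with the Copy Artifact plugin and publish them as one result set.
    job('aggregate-test-results') {
        parameters {
            stringParam('EL6_BUILD', '', 'Build number of the el6 build')
            stringParam('EL7_BUILD', '', 'Build number of the el7 build')
        }
        steps {
            copyArtifacts('build-binary-rpm') {
                includePatterns('reports/**/*.xml')
                targetDirectory('el6')
                buildSelector { buildNumber('$EL6_BUILD') }
            }
            copyArtifacts('build-binary-rpm') {
                includePatterns('reports/**/*.xml')
                targetDirectory('el7')
                buildSelector { buildNumber('$EL7_BUILD') }
            }
        }
        publishers {
            // Publish the combined results from both platforms.
            archiveJunit('**/reports/**/*.xml')
        }
    }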

I'm happy to read articles, presentations, reference documentation, write scripts, and do whatever else is necessary to make this work nicely, but I'm not a Jenkins expert. If anyone can give me some advice on these three problems, that would be helpful. Additionally, I would appreciate any constructive advice on how to get the best out of pipeline CI builds in Jenkins, if relevant.


1 Answer


For the first point, I wrote the Job Generator plugin specifically to address this use case. You can find more information on the Job Generator wiki page. There is also a plugin of the same type with a different approach (the job generator runs as a build step), called Jobcopy Builder. The other approaches you mentioned require some kind of DSL and can be a good choice too.