0
votes

I have a pipeline job and two Maven jobs as shown below.

    node {
        def res
        stage('Build') {
            node('rhel6') {
                res = build job: 'Build', parameters: [
                        string(name: 'jobname', value: 'master'),
                        string(name: 'val1', value: 'MyValue')
                ]
            }
        }
        stage('Deploy') {
            node('rhel6') {
                build job: 'Deploy', parameters: [
                        string(name: 'resName', value: "${res.buildVariables.filename}"),
                        string(name: 'firstVal', value: 'First_Argument')
                ]
            }
        }
    }

As you can see from my pipeline definition, I have a pipeline job and two jobs that run under its stages. The 'Build' job takes two string parameters and builds.

The 'Deploy' job takes one input from the 'Build' job and builds.

1) Is this the right way of passing parameters between stages in a Jenkins pipeline? I'm using an approach similar to this.

2) How are the parameters in the pipeline job mapped to the parameters in the actual job? For example: in the 'Build' stage above, I pass jobname and val1 as parameters. How are these mapped to the actual parameters of the Build job?

3) How do I automate the generation of this pipeline job from Job DSL scripts? I mean, how can I generate the above pipeline itself in Jenkins?

Please ask one question at a time. Could you edit this question, please? – Jayan

The questions are connected. A person able to answer one should be able to give some idea for the remaining two. Otherwise I would have to duplicate the same code and questions three times. – SalmanKhan

1 Answer

2
votes

To answer your first two questions:

  1. Yes, this is (one of) the right way(s) to pass parameters.
  2. The parameters you pass are simply exposed as variables in the downstream job. In your Build Maven job you can use $jobname and $val1 directly, just as you would any other variables. You probably want to make sure the variable names are unambiguous so there is no confusion with other variables used in your Maven job.
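
For instance, if the downstream 'Build' job were itself written as a pipeline (a sketch only; the echo step is a placeholder, not your actual job configuration), it would read those values through `params`:

    node('rhel6') {
        stage('Build') {
            // 'jobname' and 'val1' are the parameters passed by the
            // upstream build step; a Maven job sees the same values
            // as $jobname and $val1 in its goals or shell steps.
            echo "Building branch ${params.jobname} with ${params.val1}"
        }
    }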

As for your third question, I'm not sure what you're asking. Could you please provide more information, a little context, etc.?

More generally, I don't see the point of creating a pipeline that is just a wrapper around other jobs (here, Build and Deploy). You should probably consider either:

  • Converting your Maven jobs to pipeline Groovy code so you can add it directly to your current pipeline job.

Something like this :

    node('rhel6') {
        stage('Build') {
            // Your build step here
            sh "${env.mvnHome}/bin/mvn install"
        }
        stage('Deploy') {
            // Your deploy step here
            sh "${env.mvnHome}/bin/mvn deploy"
        }
    }

Of course this is just an example, but you see the point...

  • Chaining your two Maven jobs without creating a third job. You can do that by adding a "Trigger parameterized build" post-build action at the end of your Build Maven job.
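
If you later want to script that chaining, it can be expressed with Job DSL as well (a hedged sketch: the goals, the trigger condition, and the `filename` variable are assumptions, not taken from your actual jobs):

    // Job DSL sketch: a Maven 'Build' job that triggers 'Deploy'
    // with predefined parameters after a successful build.
    mavenJob('Build') {
        parameters {
            stringParam('jobname', 'master')
            stringParam('val1', 'MyValue')
        }
        goals('clean install') // placeholder goals
        publishers {
            downstreamParameterized {
                trigger('Deploy') {
                    condition('SUCCESS')
                    parameters {
                        // assumes the Build job exports 'filename'
                        predefinedProp('resName', '$filename')
                        predefinedProp('firstVal', 'First_Argument')
                    }
                }
            }
        }
    }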

Unless I'm missing something here, it seems pointless to just create a wrapper that adds nothing valuable to your Build/Deploy jobs...