1
votes

I'm trying to generate Jenkins pipelines using the pipelineJob function in the Job DSL plugin, but cannot pass parameters from the DSL to the pipeline script. I have several projects that use what is essentially the same Jenkinsfile, with differences only in a few steps. I'm trying to use the Job DSL plugin to generate these pipelines on the fly, with the values I want changed in them interpolated to match the parameters to the DSL.

I've tried just about every combination of string interpolation that I can in the pipeline script, as well as in the DSL, but cannot get Jenkins/Groovy to interpolate variables in the pipeline script.

I'm calling the job DSL in a pipeline step:

def projectName = "myProject"
def envs = ['DEV','QA','UAT']
def repositoryURL = 'myrepo.com'

jobDsl targets: ['jobs/*.groovy'].join('\n'), 
    additionalParameters: [
        project: projectName, 
        environments: envs, 
        repository: repositoryURL
    ],
    removedJobAction: 'DELETE',
    removedViewAction: 'DELETE'

The DSL is as follows:

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

pipeline.groovy:

pipeline {
    agent any

    environment {
        REPO = repository
    }

    parameters {
        choice name: "ENVIRONMENT", choices: environments
    }

    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}

The variables that I pass in additionalParameters are interpolated in the Job DSL script; a pipeline with the correct name does get generated. The problem is that the variables are not passed on to the pipeline script read from the workspace - the Jenkins configuration for the generated pipeline looks exactly the same as the file, without any of the variables interpolated.

I've made a number of attempts at getting the string to interpolate, including a lot of variations of "${environments}", ${environments}, $environments, \$environments...I can't find any that work. I've also tried reading the file as a GStringImpl:

script("${readFileFromWorkspace('pipeline.groovy')}")

Does anyone have any ideas as to how I can make variables propagate down to the pipeline script? I know that I could just use a for loop to do string.replaceAll() on the script text, but that seems cumbersome; there's got to be a better way.
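For reference, the replaceAll workaround I have in mind would look roughly like this (just a sketch - the map keys and values here are illustrative, matching the additionalParameters above):

```groovy
// Sketch of the replaceAll fallback: substitute each DSL parameter
// into the raw script text before handing it to the pipeline job.
def scriptText = readFileFromWorkspace('pipeline.groovy')
def substitutions = [
    repository  : repositoryURL,
    environments: envs.inspect()  // renders the list as ['DEV', 'QA', 'UAT']
]
substitutions.each { name, value ->
    scriptText = scriptText.replaceAll(name, value.toString())
}
```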

3
The first thing I notice is you are attempting to access parameter values outside of the params map. Try changing to params.ENVIRONMENT and see how far that gets you. Additionally you are implicitly accessing repo outside of the env map, which will be inconsistent for you. - Matt Schuchard
You were right about accessing parameters outside of the params map - making that change does improve things. The big issue is still there, though; I want the line choice name: 'ENVIRONMENT', choices: environments to be interpolated as choice name: 'ENVIRONMENT', choices: ['DEV', 'QA', 'UAT'] based on the value I'm passing to the job DSL. - Scott Miller
I'm having a similar issue with parameters too. I had to create the parameters at the DSL job description and then again on the pipeline. I think that cps { script() } is not interpreted, I think Jenkins just creates the job with the script string as steps and that's it. - Spidey

3 Answers

2
votes

I've come up with a way to make this work. It's not what I'd prefer, which is having the string contents of the file implicitly interpolated during job creation, but it does work; it just adds an extra step.

import groovy.text.SimpleTemplateEngine

def fileContents = readFileFromWorkspace "pipeline.groovy"

def engine = new SimpleTemplateEngine()
def template = engine.createTemplate(fileContents).make(binding.getVariables()).toString()

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps { 
            script(template)
        }
    }
}

This reads a file from your workspace, then uses it as a template with the binding variables. The other change needed to make this work is escaping any variables used in your Jenkinsfile script, like \${VARIABLE}, so that they are expanded at runtime rather than at the time you build the job. Any variables you want expanded at job creation should be referenced as ${VARIABLE}.
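As an example, a pipeline.groovy written as a template for this approach might look like the following (a sketch assuming the `repository` and `environments` binding variables from the question; the backslash-escaped references are left for Jenkins to resolve at runtime):

```groovy
// pipeline.groovy, used as a SimpleTemplateEngine template
pipeline {
    agent any

    environment {
        // ${repository} is expanded by the template engine at job creation
        REPO = '${repository}'
    }

    parameters {
        // ${environments.inspect()} renders the list literal, e.g. ['DEV', 'QA', 'UAT']
        choice name: 'ENVIRONMENT', choices: ${environments.inspect()}
    }

    stages {
        stage('Deploy') {
            steps {
                // escaped with a backslash, so these expand at pipeline runtime
                echo "Deploying \${env.REPO} to \${params.ENVIRONMENT}..."
            }
        }
    }
}
```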

1
votes

You need to use the complete job name as a variable without the quotes. E.g., if JOBNAME is a parameter containing the entire job name:

pipelineJob(JOBNAME) {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

0
votes

You could achieve what you're trying to do by defining environment variables in the pipelineJob and then using those variables in your pipeline.

They are a bit limited because environment variables are strings, but it should work for basic use cases.

Ex.:

//job-dsl
pipelineJob('example') {
    environmentVariables {
        // these vars could be specified by parameters of this job
        env('repository', 'blah')
        env('environments', 'a,b,c') // comma-separated string
    }
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

And then in the pipeline:

//pipeline.groovy
pipeline {
    agent any

    environment {
        REPO = env.repository
    }

    parameters {
        choice name: 'ENVIRONMENT', choices: env.environments.split(',')
        // note the need to split the comma-separated string above
    }
}