2
votes

I have a pipeline job which loads Jenkinsfile from git repository. My Jenkinsfile looks like this:

#!groovy
@Library('global-utils-lib') _
node("mvn") {
    stage('build') {
        checkout scm
    }

    stage('merge-request'){
        mergeRequest()
    }
}

global-utils-lib is a shared library configured under Global Pipeline Libraries, loaded from another git repository with the following structure:

vars/mergeRequest.groovy

mergeRequest.groovy:

def call() {
    sh "ip addr"
    def workspacePath = env.WORKSPACE
    new File(workspacePath + "/file.txt").text
}

The job runs inside a docker container (Docker plugin).

When I run this job, the docker container is provisioned correctly and the scm is checked out, but I get a FileNotFoundException. It looks like the code from the shared library is executed on the Jenkins master, not on the slave:

  • the presented IP comes from the master
  • the file is loaded correctly when I pass the correct path to the scm checkout on the master

How can I run the library code against the slave? What am I missing?

Looks good to me, should work if you ask me, might be a bug in gpl... - Jon S

1 Answer

3
votes

It's generally not a good idea to try to do things like new File() instead of using existing Pipeline steps.

Your Pipeline script is interpreted and executed by the Jenkins master, so, as you're seeing, an attempt to use the File API doesn't work as you might expect.

Sticking to Pipeline steps helps ensure that your pipeline is durable (i.e. it survives restarts), is pausable, and doesn't block the execution thread, which would prevent parallel steps from working, for example.

In this case, the existing readFile step can be used.
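For example, the library step from the question could be rewritten to use readFile, which — being a Pipeline step — executes against the workspace of the node allocated by the enclosing node block rather than the master's filesystem (a sketch; the echo of the contents is added for illustration):

```groovy
// vars/mergeRequest.groovy
def call() {
    sh "ip addr"
    // readFile is a Pipeline step, so it reads from the agent's workspace;
    // new File() would instead run in the master's JVM
    def contents = readFile "file.txt"  // path is relative to the workspace
    echo contents
}
```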

I don't know how well the Docker Plugin interacts with Pipeline (though I imagine it should be transparent). Without knowing which agents have the "mvn" label, or whether you can reproduce this outside of a shared library, it's unclear why your sh step would appear to run on the master.

The Docker Pipeline Plugin is explicitly designed for Pipeline, so it might give better results.
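With the Docker Pipeline Plugin, the container lifecycle is managed from the Jenkinsfile itself via the docker global variable (a sketch; the image name and node label are assumptions, not taken from the question):

```groovy
// Jenkinsfile
node {
    checkout scm
    // Runs the enclosed steps inside a container started from the given
    // image; the workspace is mounted into the container automatically,
    // so steps like sh and readFile see the checked-out sources.
    docker.image('maven:3-jdk-8').inside {
        sh 'mvn -version'
    }
}
```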