
I'm trying to run a Job DSL script from within a pipeline step. In general this should be possible, as described here. The following code snippet was added to the pipeline:

stage('Add new jobs') {
  steps {
    echo 'Scanning...'
    jobDsl(additionalClasspath: 'src/breuer/jenkins/utils', removedJobAction: 'DELETE', removedViewAction: 'DELETE',
       targets: 'src/breuer/jenkins/utils/DotNetJob.groovy', unstableOnDeprecation: true)
  }
}

When running this pipeline, Jenkins complains with:

ERROR: no Job DSL script(s) found at src/breuer/jenkins/utils/DotNetJob.groovy
Finished: FAILURE

For testing purposes, the content of DotNetJob.groovy looks as follows:

#!/usr/bin/env groovy
package breuer.jenkins.utils

import javaposse.jobdsl.dsl.Job

def solutions = findFiles glob: '**/*.sln'
echo "Solution count: ${solutions.size()}"

job("TestDotNet") {
  steps {
    shell 'echo Hello from new DotNet job'
  }
}

I think the problem is that the pipeline job and the script containing the Job DSL are located in different workspaces. The setup is as follows:

  • 1 GitHub Organization
  • 2 Repositories within that organization
  • 1 Repo contains the shared libraries / job builders in Groovy code
  • 1 Repo contains several .NET solutions and has the Jenkinsfile in the root

The shared library repo has been added as a global pipeline library in Manage Jenkins -> Configure System and is implicitly loaded for every pipeline (e.g. the Jenkinsfile).

Now the pipeline in the actual code repo is very small. It just forwards to a pipeline definition within the shared libraries:

#!/usr/bin/env groovy

dotNetStandardPipeline {
  message = "Hello World!"
}

This works like a charm, as the global pipeline library is imported implicitly. This dotNetStandardPipeline now contains the step noted above, which calls the jobDsl pipeline step with a target set to the DotNetJob.groovy script, located in the same repo as the dotNetStandardPipeline itself.
The problem now seems to be that the pipeline is executed in the workspace of the code repository, and therefore the path src/breuer/jenkins/utils does not exist there.
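
For context, this is roughly what a vars/dotNetStandardPipeline.groovy can look like when it follows the usual configuration-closure pattern for global variables (a sketch in scripted style, not my actual library code):

// vars/dotNetStandardPipeline.groovy - illustrative sketch only
def call(Closure body) {
    // Evaluate the configuration closure from the Jenkinsfile,
    // e.g. message = "Hello World!"
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    node {
        stage('Add new jobs') {
            echo "Scanning... (${config.message})"
            jobDsl(targets: 'src/breuer/jenkins/utils/DotNetJob.groovy',
                   removedJobAction: 'DELETE', removedViewAction: 'DELETE',
                   unstableOnDeprecation: true)
        }
    }
}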

How can I determine the real location of the script, and how can I point the jobDsl step at a target that lives in a different repo? Or am I on the completely wrong track here?

EDIT

After some further investigation, it appears that the shared library repository is checked out to a directory next to the "real" workspace, with an @libs suffix. So I thought it would be a good idea to use the following approach:

script {
    def wsName = "${WORKSPACE}".split("\\\\")[ -1 ]
    echo "wsName: ${wsName}"
    echo "RelDir: ../${wsName}@libs/breuer-jenkins-lib/src/breuer/jenkins/utils/DotNetJob.groovy"
    jobDsl(removedJobAction: 'DELETE', removedViewAction: 'DELETE',
                    targets: "../${wsName}@libs/breuer-jenkins-lib/src/breuer/jenkins/utils/DotNetJob.groovy", unstableOnDeprecation: true)
}

Unfortunately, this seems to break something entirely, because Jenkins now complains with the following message in the build output:

java.nio.file.AccessDeniedException: D:\Road to Git\Jenkins\JenkinsGit\workspace\t_TestCIIntegration_develop-RKLAJXSET2S232SE6RNISESVW75KUNU4E3CPSAAP42MHZAGO6Z2A\.git
at sun.nio.fs.WindowsException.translateToIOException(Unknown Source)
at sun.nio.fs.WindowsException.rethrowAsIOException(Unknown Source)
at sun.nio.fs.WindowsException.rethrowAsIOException(Unknown Source)
at sun.nio.fs.WindowsFileSystemProvider.newByteChannel(Unknown Source)
at java.nio.file.Files.newByteChannel(Unknown Source)
at java.nio.file.Files.newByteChannel(Unknown Source)
at java.nio.file.spi.FileSystemProvider.newInputStream(Unknown Source)
at java.nio.file.Files.newInputStream(Unknown Source)
at hudson.FilePath.read(FilePath.java:1771)
at hudson.FilePath$read$8.call(Unknown Source)
at javaposse.jobdsl.plugin.ScriptRequestGenerator.readFile(ScriptRequestGenerator.groovy:103)

So it seems that even if I can determine the location of the Groovy file, it is still impossible to invoke it?!

Note: when copying the src directory into the workspace directly and setting the target parameter to src/breuer/jenkins... and so on, it works.
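
One way to spell out that workaround, instead of copying files by hand, would be to check the library repo out into a subfolder of the job's workspace before the jobDsl step runs. A rough sketch; the repository URL and folder name are placeholders:

stage('Prepare Job DSL scripts') {
  steps {
    // Clone the shared-library repo into the job's own workspace so that
    // the Job DSL files become visible to the jobDsl step.
    dir('jenkins-lib') {
      git url: 'https://github.com/<your-org>/breuer-jenkins-lib.git', branch: 'master'
    }
    jobDsl(removedJobAction: 'DELETE', removedViewAction: 'DELETE',
           targets: 'jenkins-lib/src/breuer/jenkins/utils/DotNetJob.groovy',
           unstableOnDeprecation: true)
  }
}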

Does this mean the Groovy script has to be in the same repo as the Jenkinsfile?

Edit 2

As it is quite tricky to explain the structure and the idea behind my plan in words, I've created a small demonstrator organization with two demo repositories on GitHub. Here you can find the source code repo containing two C# solutions and the Jenkinsfile; its readme describes the plan for the CI integration.
The CI library containing the Groovy scripts is located here.

Edit 3 and Conclusion

For most people coming here, please check the accepted answer given by mkobit (thanks for the effort!). The approach it suggests for solving the actual problem is really helpful; putting the Job DSL script into the resources directory is definitely an option.

In the meantime I followed another approach, which I would like to describe here.
I was already using a "GitHub Organization" job on Jenkins. The aim was for this to be the only manually created job and to have all other required jobs created via code (i.e. via the Jenkinsfile).
One of the real repositories I have to take care of is a largely grown repo, moved from SVN to Git, which contains about 300 .NET solutions. Each of these solutions shall be built by an individual job on Jenkins. We could do this within the pipeline itself, but that would mean either having very many stages in the pipeline or not seeing individually failing solutions at first glance. So I have to dynamically create individual jobs for each solution.

The code repo itself should not be polluted with a lot of Jenkins-related stuff, so I wanted to separate the two things strictly.
Instead of having the pipeline call the Job DSL script, I decided to manually create one additional freestyle job in Jenkins. It serves as the seed job and has some parameters (workspace, project, branch etc.).
The pipeline triggers a build of the seed job, which then runs the Job DSL with the required information.
After this stage is finished, the pipeline triggers builds of the required jobs.
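
The trigger from the pipeline to the seed job looks roughly like this (job name and parameter names are placeholders, not the exact ones from my setup):

stage('Seed jobs') {
  steps {
    // Hand the required information over to the manually created
    // freestyle seed job, which runs the Job DSL script.
    build job: 'SeedJob',
          parameters: [
            string(name: 'PROJECT_WORKSPACE', value: env.WORKSPACE),
            string(name: 'PROJECT_NAME', value: 'MyProject'),
            string(name: 'BRANCH_NAME', value: env.BRANCH_NAME)
          ],
          wait: true
  }
}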

This might not be the most elegant solution, but with this approach I get a fully automated, defined-in-code Jenkins environment with only two manually created jobs.

You said you added the shared library in the GUI and are loading it implicitly. Why are you targeting it with a relative path with jobDsl instead of using it directly with the Jenkins pipeline DSL? – Matt Schuchard
As far as I understood, Job DSL and pipeline are somewhat different and cannot be mixed. For example, within a pipeline I cannot do def myJob = job('jobName'). You are only allowed to perform the jobDsl step, which takes either the script text or a target to be executed... – Tobias
You are correct about them being different "DSLs" - they operate in entirely different contexts, with the Job DSL plugin providing an entrypoint to run it through a pipeline. I am having a tough time figuring out where you want to run the Job DSL (jobDsl step), and with what content. Also, are you saying your Job DSL scripts are located in the same source code as the global pipeline library? – mkobit
@mkobit please see my edit 2. I've linked a demo organization on GitHub to hopefully clarify my plan... I'm really in doubt whether I'm following the right track here... – Tobias

1 Answer


The GitHub Branch Source plugin accomplishes a few things:

  • Scans one or multiple GitHub organizations
  • Generates a folder job for each repository
  • Each folder scans for notable things (pull requests, branches, etc.) that have a Jenkinsfile (with default configuration)
  • Each notable thing has a pipeline job generated for it
  • Each pipeline job can automatically notify GitHub of the build status (say, on a pull request)
  • I am sure I am missing other notable features

It can operate by polling or by listening to events, for example pull request creation, pull request updates, branch pushes, and other SCM events.

I think the idea of having the Jenkinsfile for each repository generate jobs with a jobDsl step can be an over-complication (it depends, of course, on your desired end goal). One of the benefits of Jenkins Pipelines is the ability to specify the build definition as code. In this example, you are defining additional jobs to build the project. Why not have the Jenkinsfile itself build the repositories, with help from the global libraries to define common paths? You already have job generation and scanning provided by the GitHub Branch Source plugin, so let the Jenkinsfile do the hard work of the build process.
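
As a rough illustration of that suggestion (a sketch only; the MSBuild invocation and solution layout are assumptions on my part), the Jenkinsfile could discover the solutions itself and build each one as a parallel branch, so individual failures stay visible:

stage('Build solutions') {
  steps {
    script {
      // Find every solution in the repo and build each one in its own
      // parallel branch so a single failing solution is easy to spot.
      def solutions = findFiles(glob: '**/*.sln')
      def branches = [:]
      for (int i = 0; i < solutions.size(); i++) {
        def sln = solutions[i]
        branches[sln.name] = {
          bat "msbuild \"${sln.path}\" /p:Configuration=Release"
        }
      }
      parallel branches
    }
  }
}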

Ok, if that wasn't convincing or I don't fully understand your use case, let's try and solve the problem you are running into.


There are a few different considerations and limitations to keep in mind with your approach.

The jobDsl step can provide job scripts in a couple different ways:

  1. jobDsl(targets: 'ant/pattern/for/job/files/*.groovy') - files are provided from the workspace and the target can be an Ant pattern
  2. jobDsl(scriptText: "folder('myFolder')") - provide the script text directly

The additionalClasspath option requires that the files also be in the workspace. It also requires that the entries be class files/JARs/other things that work on the JVM classpath. This means you would have to assemble those artifacts before using them with the Job DSL, which will be painful to get into the jobs that consume your library.

Shared Libraries are loaded from source code, and Jenkins Pipeline then uses its special compiler to prepare them for execution.

I can think of a few different options, with some details for each:

  1. Don't use additionalClasspath in your global library - because it essentially requires built classes, you would have to compile your helper classes first
  2. Use jobDsl(scriptText: '<JOB_DSL_TEXT_DIRECTLY>') - you lose some of the ability to test the Job DSL code on its own, but you can still do an integration-style test to see which jobs were created. (Plug incoming) I built a Gradle plugin that allows you to test shared libraries, so that may help with the testing aspect.
  3. If you want to keep your Job DSL Groovy script separate, you may have to put it in the resources directory of your shared library and still use the scriptText option. For example, if the script is at resources/breuer/jenkins/utils/DotNetJob.groovy and your shared library is loaded, you could do something like jobDsl(scriptText: libraryResource('breuer/jenkins/utils/DotNetJob.groovy')) - note that libraryResource takes the path relative to the resources directory. See the sketch below.
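
A sketch of how that last option could look inside a pipeline stage, assuming the script lives at resources/breuer/jenkins/utils/DotNetJob.groovy in the shared library:

stage('Add new jobs') {
  steps {
    // libraryResource loads the file from the shared library's resources/
    // directory; the resulting text is passed straight to the jobDsl step.
    jobDsl(scriptText: libraryResource('breuer/jenkins/utils/DotNetJob.groovy'),
           removedJobAction: 'DELETE', removedViewAction: 'DELETE',
           unstableOnDeprecation: true)
  }
}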