
I'm writing a groovy script with a job to deploy terraform. I'm using Job DSL and have the seed job being implemented by JCasC, all works fine. Then I have a repo with groovy files in containing the jobs.

If I keep the groovy file really simple as a single job, it works fine.

However, I want to be able to build a pipeline with build stages. I know I can write the pipeline in a Jenkinsfile and then reference that Jenkinsfile from Job DSL, but ideally I would like to keep the entire pipeline in the groovy file for simplicity.
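For context, the Jenkinsfile route I mention would look roughly like this in Job DSL, using a cpsScm definition instead of an inline script (a sketch; the scriptPath and repo details are assumptions based on my setup):

pipelineJob('Deploy-K8s-Cluster') {
  definition {
    cpsScm {
      scm {
        git {
          branch('master')
          remote {
            url('[email protected]:jjbbtt/aws-infrastructure.git')
            credentials('bitbucket-ssh')
          }
        }
      }
      // Path to the Jenkinsfile within the checked-out repo (assumed default)
      scriptPath('Jenkinsfile')
    }
  }
}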

I have this as a starter:

pipelineJob('Deploy-K8s-Cluster') {
  definition {
    cps {
      script('''
        pipeline {
          agent any
          stages {
            stage('Checkout') {
              steps {
                scm {
                  git {
                    branch('master')
                    remote {
                      url '[email protected]:jjbbtt/aws-infrastructure.git'
                      credentials('bitbucket-ssh')
                    }
                  }
                }
              }
            }
            stage('Terraform Initialize') {
              steps {
                'terraform init'
              }
            }
            stage('Terraform Plan') {
              steps {
                'terraform plan -out=create.tfplan'
              }
            }
            stage('Terraform Apply') {
              steps {
                'terraform apply -auto-approve create.tfplan'
              }
            }
            stage('Deploy Sealed Secrets Controller') {
              steps {
                'kubectl apply -f sealed-secrets/controller.yaml'
              }
            }
          }
        }
      '''.stripIndent())
      sandbox()
    }
  }
}

However, I'm seeing this error:

General error during semantic analysis: There's no @DataBoundConstructor on any constructor of class javaposse.jobdsl.plugin.casc.FromUrlScriptSource

I've tried various ways and read a bunch of docs, but the issue is I'm not super familiar with Job DSL.

Am I missing something simple? Or barking up the wrong tree entirely?

I've checked the pipeline on the destination job: it's present as per the above script. So perhaps this is not an issue with Job DSL at all, but a problem with the pipeline itself. – jonnybinthemix

I think I'm mixing Job DSL syntax with Pipeline syntax. Perhaps the issue is that I'm using Job DSL syntax for the git checkout in the actual pipeline. – jonnybinthemix

1 Answer


I figured it out. My misunderstanding was not seeing that anything inside the script() section of cps is not Job DSL at all; it's Pipeline syntax. It works once I change the syntax accordingly.
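For anyone hitting the same thing, here's a sketch of the corrected script, using the Pipeline git and sh steps inside script() instead of Job DSL's scm block (same repo URL and credentials ID as in the question):

pipelineJob('Deploy-K8s-Cluster') {
  definition {
    cps {
      script('''
        pipeline {
          agent any
          stages {
            stage('Checkout') {
              steps {
                // Pipeline git step, not the Job DSL scm block
                git branch: 'master',
                    credentialsId: 'bitbucket-ssh',
                    url: '[email protected]:jjbbtt/aws-infrastructure.git'
              }
            }
            stage('Terraform Initialize') {
              steps {
                // Shell commands must be wrapped in the sh step
                sh 'terraform init'
              }
            }
            stage('Terraform Plan') {
              steps {
                sh 'terraform plan -out=create.tfplan'
              }
            }
            stage('Terraform Apply') {
              steps {
                sh 'terraform apply -auto-approve create.tfplan'
              }
            }
            stage('Deploy Sealed Secrets Controller') {
              steps {
                sh 'kubectl apply -f sealed-secrets/controller.yaml'
              }
            }
          }
        }
      '''.stripIndent())
      sandbox()
    }
  }
}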