I'm writing a Groovy script with a job to deploy Terraform. I'm using Job DSL, with the seed job provisioned by JCasC, and that all works fine. I then have a repo containing the Groovy files that define the jobs.
If I keep the Groovy file really simple, as a single job, it works fine.
However, I want to be able to build a pipeline with build stages. I know I can write the pipeline in a Jenkinsfile and then reference that Jenkinsfile from Job DSL, but ideally I'd like to keep the entire pipeline in the Groovy file for simplicity.
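For reference, the Jenkinsfile-based alternative I'm trying to avoid would look roughly like this, using Job DSL's `cpsScm` definition. The repo URL and credentials ID are from my setup; the `scriptPath` assumes a Jenkinsfile at the repo root:

```groovy
pipelineJob('Deploy-K8s-Cluster') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('[email protected]:jjbbtt/aws-infrastructure.git')
                        credentials('bitbucket-ssh')
                    }
                    branch('master')
                }
            }
            // Path to the Jenkinsfile within the repo (assumed to be at the root)
            scriptPath('Jenkinsfile')
        }
    }
}
```

That works, but it splits the job definition across two files, which is what I'm trying to avoid.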
I have this as a starter:
```groovy
pipelineJob('Deploy-K8s-Cluster') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Checkout') {
                            steps {
                                // Clone the infrastructure repo
                                git branch: 'master',
                                    credentialsId: 'bitbucket-ssh',
                                    url: '[email protected]:jjbbtt/aws-infrastructure.git'
                            }
                        }
                        stage('Terraform Initialize') {
                            steps {
                                sh 'terraform init'
                            }
                        }
                        stage('Terraform Plan') {
                            steps {
                                sh 'terraform plan -out=create.tfplan'
                            }
                        }
                        stage('Terraform Apply') {
                            steps {
                                sh 'terraform apply -auto-approve create.tfplan'
                            }
                        }
                        stage('Deploy Sealed Secrets Controller') {
                            steps {
                                sh 'kubectl apply -f sealed-secrets/controller.yaml'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
```
However, I'm seeing this error:
General error during semantic analysis: There's no @DataBoundConstructor on any constructor of class javaposse.jobdsl.plugin.casc.FromUrlScriptSource
I've tried various approaches and read a bunch of docs, but the issue is I'm not super familiar with Job DSL.
Am I missing something simple? Or am I barking up the wrong tree entirely?