I'm converting a Jenkins job that has been configured in the UI to one that is configured with a declarative pipeline script.
It's a Maven-built Java project with a post-build action that deploys to Artifactory.
The build-and-test step is straightforward: the goals configured in the UI convert directly to a mvn command, sh "mvn clean install".
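So the build stage itself becomes something like this (the stage name and agent are my own placeholders):

```groovy
pipeline {
    agent any
    stages {
        stage('build and test') {
            steps {
                // same goals as the existing UI job
                sh "mvn clean install"
            }
        }
    }
}
```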
The config for the post-build step is simple in the UI. It has three checkboxes ticked:
- Deploy maven artifacts
- Filter excluded artifacts from build info (I don't think there are any exclusions, though)
- Capture and publish build info
This generates a rich buildInfo.json and uploads it, together with the correct artifacts, to our Artifactory server.
I want to replace this with a pipeline step. Reading this documentation https://www.jfrog.com/confluence/display/RTF/Declarative+Pipeline+Syntax suggests that Maven jobs should be built by having the Artifactory plugin run the Maven commands itself. I'd like to avoid this because we don't do that currently, and I'd eventually like to remove the need to store this project's artifacts on the server at all.
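If I followed that documentation, the build stage would look roughly like this (repository names and the deployer id here are my own guesses for illustration), with the plugin wrapping the Maven run and deploying as it goes:

```groovy
stage('build') {
    steps {
        // deployer tells the plugin where to deploy what Maven produces
        rtMavenDeployer(
            id: 'maven-deployer',
            serverId: 'Artifactory',
            releaseRepo: 'libs-release-local',
            snapshotRepo: 'libs-snapshot-local'
        )
        // the plugin, not a plain sh step, runs the Maven goals
        rtMavenRun(
            pom: 'applications/my-app/pom.xml',
            goals: 'clean install',
            deployerId: 'maven-deployer'
        )
    }
}
```

This is exactly the coupling I'm trying to avoid: the build would only deploy correctly when run through the plugin.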
In the pom we would see
<groupId>some.value</groupId>
<artifactId>my-app</artifactId>
<version>1.1.184</version>
I've got to a partially working script:
stage('publish common code to artifactory') {
    steps {
        rtUpload(
            serverId: 'Artifactory',
            spec: '''{
                "files": [
                    {
                        "pattern": "applications/my-app/pom.xml",
                        "target": "libs-release-local"
                    }
                ]
            }''',
            buildName: 'my-app',
            buildNumber: env.GIT_HASH_VERSION
        )
        rtPublishBuildInfo(
            serverId: 'Artifactory',
            buildName: 'my-app',
            buildNumber: env.GIT_HASH_VERSION
        )
    }
}
This uploads the pom to Artifactory, but where the UI job would publish it to some/value/my-app/my-app-1.1.184.pom, my pipeline version publishes it to the repository root as /pom.xml.
Artifactory looks capable of reading the groupId/artifactId/version from the pom and using them to place the artifacts correctly without me writing any path-building code, but how?
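For context, the fallback I'd rather not maintain is reading the coordinates out of the pom myself and assembling the target path in the spec, e.g. with readMavenPom from the Pipeline Utility Steps plugin (untested sketch; I believe the standard repository layout also expects a version directory in the path):

```groovy
stage('publish common code to artifactory') {
    steps {
        script {
            // readMavenPom comes from the Pipeline Utility Steps plugin
            def pom = readMavenPom file: 'applications/my-app/pom.xml'
            // Maven layout: groupId dots become path separators
            def path = "${pom.groupId.replace('.', '/')}/${pom.artifactId}/${pom.version}"
            rtUpload(
                serverId: 'Artifactory',
                spec: """{
                    "files": [
                        {
                            "pattern": "applications/my-app/pom.xml",
                            "target": "libs-release-local/${path}/${pom.artifactId}-${pom.version}.pom"
                        }
                    ]
                }""",
                buildName: 'my-app',
                buildNumber: env.GIT_HASH_VERSION
            )
        }
    }
}
```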