4
votes

I was able to set up an integration between GitHub and AWS CodePipeline, so now my code is uploaded to S3 by a Lambda function after a push event. That works very well.

A new ZIP with source code on S3 triggers a pipeline, which builds the code. That's fine. Now I'd also like to build a Docker image for the project.

The first problem is that you can't mix a project (Node.js) build and a Docker build. That's fine, it makes sense. The next issue is that you can't have a separate buildspec.yml for the Docker build. You have to specify the build commands manually; OK, that works as a workaround.

The biggest problem though, or a lack in my understanding, is how to put the Docker build into the pipeline. The first build step builds the project, then the next build step builds the Docker image. Two standalone AWS CodeBuild projects.

The thing is that a pipeline build step has to produce an artifact as its output. But a Docker build doesn't produce any files, and it looks like the final docker push after docker build doesn't qualify as an artifact for the pipeline service.

Is there a way to do this?

Thanks

2
Did you see this article? I'm going to be testing this out this week, but from the article you use the final step to push your docker image. Hope this helps. docs.aws.amazon.com/codebuild/latest/userguide/sample-docker.html - Greg Fennell
It works as a standalone build, that's OK. But if you want it as part of a CodePipeline, you can't, because a Docker build doesn't produce any output artifacts. I guess I'll just put something there as the output; an ugly workaround, but I have not found any other way. - stibi
You could always just save a zip of the final package as the artifact, good way to have an offline build of the docker image. That was my plan. - Greg Fennell
You mean "docker save theimage:latest > the-image-latest.tar" ? Yeah, could be useful. - stibi
Yup, that's my plan. Will help keep revisions offline as well. Happy coding - Greg Fennell
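The docker save idea from the comments above could be wired into a buildspec so the tarball becomes the pipeline's output artifact. A minimal sketch, assuming an $IMAGE environment variable is configured on the CodeBuild project (the file name is illustrative):

```yaml
version: 0.1

phases:
  build:
    commands:
      - docker build -t $IMAGE .
  post_build:
    commands:
      # Export the built image to a tarball so the pipeline has a file to pass on
      - docker save $IMAGE > image.tar
artifacts:
  files:
    - image.tar
```

The tarball can later be restored on another machine with `docker load < image.tar`, which also gives you the offline copy of each revision mentioned above.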

2 Answers

4
votes

A bit late, but hopefully this will be helpful for someone. You should publish the Docker image as part of your post_build phase commands. Here's an example of a buildspec.yml:

version: 0.1

phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --region $AWS_REGION)
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t $IMAGE .
      - "docker tag $IMAGE $REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}"
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - "docker push $REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}"
      - "echo {\\\"image\\\":\\\"$REPO/$IMAGE:${CODEBUILD_BUILD_ID##*:}\\\"} > image.json"
artifacts:
  files:
    - 'image.json'

As you can see, the CodeBuild project expects a few parameters - AWS_REGION, REPO and IMAGE - and publishes the image to Amazon ECR (but you can use a registry of your choice). It also uses the built-in CODEBUILD_BUILD_ID environment variable to derive a dynamic value for the image tag. After the image is pushed, it creates a JSON file with the full path to the image and publishes it as an artifact for CodePipeline to use.

For this to work, the CodeBuild project's "environment image" should be a Docker build image with the "privileged" flag activated. When creating the CodeBuild project in your pipeline, you can also specify the environment variables that are used in the buildspec file above.
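If you define the CodeBuild project with CloudFormation, the privileged flag and the environment variables could be declared roughly like this. This is only a sketch: the resource name, role reference, image tag and variable values are illustrative, not taken from the answer above:

```yaml
DockerBuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    Name: docker-image-build
    ServiceRole: !GetAtt CodeBuildRole.Arn   # hypothetical IAM role defined elsewhere
    Source:
      Type: CODEPIPELINE                     # buildspec comes from the pipeline source
    Artifacts:
      Type: CODEPIPELINE
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/docker:1.12.1     # a Docker-capable CodeBuild image
      PrivilegedMode: true                   # required to run docker build in CodeBuild
      EnvironmentVariables:
        - Name: REPO
          Value: 123456789012.dkr.ecr.us-east-1.amazonaws.com   # placeholder account/region
        - Name: IMAGE
          Value: my-app
```

The same settings (privileged mode, environment variables) can of course be entered in the console instead when creating the project through the pipeline wizard.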

There is a good tutorial on this topic here:

http://queirozf.com/entries/using-aws-codepipeline-to-automatically-build-and-deploy-your-app-stored-on-github-as-a-docker-based-beanstalk-application

1
votes

Sorry about the inconvenience. Making this less restrictive is on our roadmap. Meanwhile, in order to use a CodeBuild action, you can use a dummy file as the output artifact.
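A minimal sketch of that dummy-file workaround in a buildspec, assuming the image is built and pushed in the build phase (the $IMAGE variable and file name are arbitrary):

```yaml
version: 0.1

phases:
  build:
    commands:
      - docker build -t $IMAGE .
      - docker push $IMAGE
  post_build:
    commands:
      # CodePipeline insists on an output artifact, so emit a throwaway file
      - echo done > dummy.txt
artifacts:
  files:
    - dummy.txt
```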