
I have a multi-container Docker application running on Elastic Beanstalk with CI/CD on CodePipeline. My application is split into multiple repositories and multiple images:

  • Frontend Repository/Image/Pipeline
  • Backend Repository/Image/Pipeline
  • Deployment Repository (Dockerrun.aws.json, nginx/conf.d/, etc.)

The only thing I deploy to Elastic Beanstalk is my deployment repository. The Frontend and Backend repositories know nothing about the Dockerrun.aws.json file, the environment variables, etc.
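For context, the Dockerrun.aws.json in the deployment repository follows the standard multicontainer (v2) format. A minimal sketch, assuming hypothetical ECR image URIs and container names:

{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "name": "backend",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/backend:latest",
      "essential": true,
      "memory": 256
    },
    {
      "name": "frontend",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/frontend:latest",
      "essential": true,
      "memory": 256,
      "portMappings": [
        { "hostPort": 80, "containerPort": 80 }
      ],
      "links": ["backend"]
    }
  ]
}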

My issue is that in the last step of the CodePipeline for both my Frontend and Backend repositories, the deploy stage tries to push the build output to Elastic Beanstalk and fails because no Dockerrun.aws.json file is included in the output artifacts.

So what I want is for the pipeline to build the image, push it to ECR (which it already does successfully), and then simply trigger Elastic Beanstalk to update and pull down the new images. How do I do this? I don't want to push the build artifact to Elastic Beanstalk.
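For reference, the kind of trigger I have in mind would be re-deploying the current application version after the images are pushed, so the environment pulls the :latest tags again. A rough sketch using the AWS CLI, where the environment name and version label are placeholders:

# after the docker push to ECR in the build stage...
aws elasticbeanstalk update-environment \
  --environment-name my-eb-env \
  --version-label current-version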


1 Answer


It seems like Elastic Beanstalk is meant to deploy from a single "mono repo" rather than run as a multi-repo application. Therefore, instead of using multiple repositories, I merged them into a single one.

My current repository now looks like this:

.git/
backend_app/
frontend_app/
Dockerrun.aws.json

Now whenever I push to my repository, AWS CodePipeline picks up the changes and successfully deploys them to my Elastic Beanstalk application.
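If it helps, with the merged repository the pipeline only needs one CodeBuild step that builds both images and hands the Dockerrun.aws.json to the deploy stage. A minimal buildspec.yml sketch, assuming placeholder ECR account, region, and repository names:

version: 0.2
phases:
  pre_build:
    commands:
      # log in to ECR (placeholder account/region)
      - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
  build:
    commands:
      # build both images from their subdirectories
      - docker build -t 123456789012.dkr.ecr.us-east-1.amazonaws.com/backend:latest backend_app/
      - docker build -t 123456789012.dkr.ecr.us-east-1.amazonaws.com/frontend:latest frontend_app/
  post_build:
    commands:
      - docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/backend:latest
      - docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/frontend:latest
artifacts:
  files:
    # the deploy stage only needs the Dockerrun.aws.json
    - Dockerrun.aws.json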

For anyone who wants to run an application using microservices, multiple repositories, or similar, I guess Elastic Beanstalk might not be the right approach.