We need to automate our deployment process. Let me outline the stack we use: we have our own GitLab CE instance and a private Docker registry. On the production server, the application runs in a container. After every commit to master, GitLab CI builds an image with the code in it and pushes it to the Docker registry, and this is where the automation currently ends.

Deployment on the production server comes down to a few steps: stopping the current application container, pulling the newer image, and running it.
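
Done by hand, those steps would look roughly like this (a minimal sketch; the container name app, the registry host, and the port mapping are placeholder assumptions):

# stop and remove the container currently serving the application
docker stop app && docker rm app
# pull the image GitLab CI just pushed
docker pull myprivateregistry.com/app:latest
# start a fresh container from the new image
docker run -d --name app -p 80:8080 myprivateregistry.com/app:latest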

What is the best way to automate this process?

I have read about a couple of solutions (but I believe there are many more):

  • the private Docker registry notifies the production server, which then performs all of the above steps itself (a script on the production machine, managed by e.g. supervisor or something similar); see the sketch after this list
  • using Docker Machine to manage the running containers remotely
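
Regarding the first option: the open-source registry (Docker Distribution) can send notifications on push events, so the registry itself could call the production server. A minimal sketch of the relevant section of the registry's config.yml, with a hypothetical listener endpoint:

notifications:
  endpoints:
    - name: deploy-listener
      # the registry POSTs a JSON event here on every push
      url: http://production.example.com:9000/deploy
      timeout: 500ms
      threshold: 5
      backoff: 1s

The listener on the production server would then run the stop/pull/run steps above.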

What is the preferred way? Or can you recommend something else?

No need for tools like Swarm, Kubernetes, etc.; it's quite a simple application. Thanks in advance.

2 Answers

How about installing a GitLab CI runner on your production machine? Then add a deploy job that runs after the image has been pushed to the registry on master, and pin it to that machine using GitLab CI tags.

The job simply pulls the image from the registry and restarts your service, or whatever restart mechanism you have in place.

Something like:

deploy-job:
  stage: deploy
  tags:
    # runs only on the runner registered on the production machine
    - production
  script:
    # $SECRET_USER / $SECRET_PASS are protected CI variables
    - docker login myprivateregistry.com -u $SECRET_USER -p $SECRET_PASS
    - docker pull $CI_REGISTRY_IMAGE:latest
    # recreate the service; docker-compose.yml is assumed to reference the same image
    - docker-compose down
    - docker-compose up -d
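
For this to work, a runner has to be registered on the production machine with the matching tag. A sketch of the registration, assuming the shell executor and a placeholder token:

gitlab-runner register --non-interactive \
  --url https://mygitlab.example.com/ \
  --registration-token <token> \
  --executor shell \
  --tag-list production

With the shell executor the job's docker commands run directly on the host, so the user the runner runs as needs access to the Docker daemon.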

I can think of four solutions:

  1. use Watchtower on the production server: https://github.com/v2tec/watchtower
  2. run a webhook server on the production machine that your CI requests after pushing the image to the registry (see the sketch after this list): https://github.com/adnanh/webhook
  3. as already mentioned, run a CI runner on the production machine as well, which finally triggers your update commands
  4. enable the Docker API and update the container by requesting it from the CI
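
For option 2, adnanh/webhook reads its hook definitions from a JSON or YAML file; a minimal sketch (the hook id, script path, and token are placeholders):

- id: redeploy-app
  execute-command: /opt/deploy.sh
  command-working-directory: /opt
  trigger-rule:
    match:
      type: value
      value: my-secret-token
      parameter:
        source: header
        name: X-Deploy-Token

The CI job would then trigger it after pushing the image, e.g. curl -H "X-Deploy-Token: my-secret-token" http://production.example.com:9000/hooks/redeploy-app, where /opt/deploy.sh contains the stop/pull/run commands.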