
I have been looking into using GitLab Runner to push my code to the staging and prod environments automatically when there is a commit to the staging and master branches, but I am confused as to how to achieve this with GitLab Runner.

Here is my situation:

  • Project: Drupal website. GitLab stores everything except the settings and the sites/default/files folder (where all the images and documents uploaded by users are stored).

  • GitLab CE and gitlab-runner are installed locally; they have access to my prod server, but not the other way around.

  • Since we are a very small web team, I am not planning on using automated testing for now; testing is manual. I plan to implement it in the future, when I have the time.

  • Here is the workflow that we have in place: develop in a feature branch locally > test the feature > merge the feature into the staging branch > deploy the staging branch to the staging server > test > merge into master > deploy to the production server.

Here are my main questions:

  1. All the examples that I have seen with GitLab Runner use a Docker executor. Since I am not running automated tests, do I really need to have the runner spin up a container and then deploy to prod? I was thinking that a shell or SSH executor would most likely do the trick, but I cannot find concrete examples of this scenario (I sketched what I had in mind after this list).

  2. What is the correct approach: have gitlab-runner use a "push to deploy" setup, where I would set my staging and prod environments as remote repositories and execute git push to those repos, or should the runner execute an rsync command to sync the files?

  3. Once the code is deployed, should I use git hooks to execute post-deploy scripts, such as compiling Sass files and running SQL scripts to update the DB when my site's structure or config changes? Or should these be executed by the runner over SSH?
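
Here is the kind of .gitlab-ci.yml I had in mind for questions 1–3, assuming a shell executor whose user already has SSH access to the servers. The hostnames, paths, the deploy user and the npm run build-sass script are placeholders for my actual setup; drush updb is the Drush command that runs pending database updates.

```yaml
# Tentative .gitlab-ci.yml -- hostnames, paths, the deploy user and the
# build-sass script are placeholders for my actual setup.
stages:
  - deploy

deploy_staging:
  stage: deploy
  only:
    - staging
  script:
    # Sync the checked-out tree, skipping what lives only on the server.
    - rsync -az --delete --exclude 'sites/default/files' --exclude 'sites/default/settings.php' ./ deploy@staging.example.com:/var/www/drupal/
    # Post-deploy steps over SSH: compile Sass, then run DB updates.
    - ssh deploy@staging.example.com 'cd /var/www/drupal && npm run build-sass && drush updb -y'

deploy_production:
  stage: deploy
  only:
    - master
  when: manual   # final push to prod stays a button click in the UI
  script:
    - rsync -az --delete --exclude 'sites/default/files' --exclude 'sites/default/settings.php' ./ deploy@prod.example.com:/var/www/drupal/
    - ssh deploy@prod.example.com 'cd /var/www/drupal && npm run build-sass && drush updb -y'
```

The when: manual line on the prod job would let me keep the final push to production as a deliberate click in the GitLab UI instead of firing on every merge to master.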

Other questions to help me understand the mechanics of all of this a bit better:

  1. Before pushing or using rsync to deploy, does the runner fetch the GitLab repo locally, or does it copy the files straight from GitLab to the target environment?

  2. Is the Docker executor mostly used to run automated tests in the CI portion of the DevOps workflow? Again, will the runner start a Docker container, pull the files from GitLab into that container, and then deploy the files to the target environment from within the container?

  3. In both scenarios, when pushing the code to the target environment, how can I ensure that files end up with the correct owner/group/permissions, especially if this is done from within a Docker container that does not have the same users and groups as my staging and prod servers? (The rsync flags in the sketch after this list are what I was considering.)
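
For reference, this is what I was considering for the permissions question: forcing modes and ownership at sync time rather than trusting whatever the runner or container uses. The deploy user and www-data group are placeholders for whatever exists on my servers.

```yaml
deploy_staging:
  stage: deploy
  only:
    - staging
  script:
    # --chmod forces directory (D) and file (F) modes on the receiving
    # side; --chown (rsync >= 3.1) forces owner:group, but needs enough
    # privileges on the remote end. deploy:www-data is a placeholder.
    - rsync -az --delete --chmod=D2775,F0664 --chown=deploy:www-data ./ deploy@staging.example.com:/var/www/drupal/
```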

Sorry for the long post; any help/advice would really be appreciated!


1 Answer


Just answering your questions:

1- The Docker executor is best practice: in theory you keep your environment clean between builds and do not need to install and remove files from the host machine. In practice, GitLab caches the images to accelerate build times. I prefer using Docker; you can pack all your build dependencies into the image.
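
For example, each job can just declare whatever image carries its build dependencies (the image name below is only an illustration, pick whatever fits your stack):

```yaml
build:
  # Any image that carries your build toolchain; node is only an example.
  image: node:latest
  script:
    - npm ci
    - npm run build-sass
```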

4- GitLab fetches the repo locally on the runner; you can see the details in the job log console.

5- Yes, mostly (I am not sure what you mean by deploy here). The runner will start a container with the code fetched into its local working folder. A deploy only happens if you add shell commands to do that task.

6- I guess it keeps the permissions, but it is better to check during the build execution and verify on the target server that the permissions are correct.

I did not fully understand your deploy process. Do you transfer files for execution? I would recommend packing everything into a Docker image and running that; then you would have the same environment on staging and production.
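
A minimal sketch of that approach, using the registry variables GitLab injects into every job (the Dockerfile contents are up to you):

```yaml
build_image:
  image: docker:latest
  services:
    - docker:dind
  script:
    # The $CI_REGISTRY* variables are provided by GitLab automatically.
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
```

Then staging and production just pull and run the same tag.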