I have been looking into using GitLab Runner to push my code to the staging and production environments automatically when there is a commit to the staging and master branches, but I am confused as to how to achieve this.
Here is my situation:
Project: Drupal website. GitLab is used to store everything except the settings and the sites/default/files folder (where all the images and documents uploaded by users are stored).
GitLab CE and GitLab Runner are installed locally; they have access to my prod server, but not the other way around.
Since we are a very small web team, I am not planning on using automated testing for now; testing is manual. I plan to implement it in the future, when I have the time.
Here is the workflow that we have in place: develop in a feature branch locally > test feature > merge feature to staging branch > deploy staging branch on staging server > test > merge to master > deploy to production server
Here are my main questions:
All the examples that I have seen with GitLab Runner use a Docker executor. Since I am not running automated testing, do I really need to have the runner spin up a container and then deploy to prod? I was thinking that using a shell or SSH executor would most likely do the trick, but I cannot find concrete examples of this scenario.
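To make the question concrete, here is the kind of .gitlab-ci.yml I imagine for a runner registered with the shell executor (the hostname, deploy user, and paths are placeholders; a deploy_prod job would be identical apart from the target host and `only: - master`):

```yaml
# .gitlab-ci.yml — sketch only; assumes a shell-executor runner whose
# user has SSH key access to the staging server
stages:
  - deploy

deploy_staging:
  stage: deploy
  script:
    # sync the checked-out repo to the web root, leaving user uploads
    # and the settings file on the server untouched
    - rsync -avz --delete --exclude='sites/default/files' --exclude='sites/default/settings.php' ./ deploy@staging.example.com:/var/www/drupal/
  only:
    - staging
```

Is this roughly what a no-test deployment pipeline is supposed to look like?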
What is the correct approach: have gitlab-runner use a "push to deploy" setup, where I would set my staging and prod environments as remote repositories and execute git push to those repos, or should the runner execute an rsync command to sync the files?
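For the "push to deploy" variant, what I picture is a bare repository on each server with a post-receive hook that checks the code out into the web root, roughly like this (all paths and the deploy user are made up for illustration):

```shell
# one-time setup on the staging/prod server (sketch; paths are hypothetical)
git init --bare /srv/git/drupal.git
cat > /srv/git/drupal.git/hooks/post-receive <<'EOF'
#!/bin/sh
# check the pushed commit out into the web root
GIT_WORK_TREE=/var/www/drupal git checkout -f master
EOF
chmod +x /srv/git/drupal.git/hooks/post-receive

# then the CI job on the runner would simply push to that repo:
git push ssh://deploy@staging.example.com/srv/git/drupal.git staging:master
```

Is one of these two approaches considered better practice for a setup like mine?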
Once the code is deployed, should I use Git hooks to execute post-deploy scripts, such as compiling Sass files and running SQL scripts to update the database if my site's structure or configuration needs to be updated? Or should these be executed by the runner over SSH?
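If the runner drives these steps, I assume the job would just chain them over SSH after the sync, something like the sketch below (the host, paths, and the availability of sass and drush on the server are assumptions about my own setup):

```yaml
deploy_staging:
  stage: deploy
  script:
    - rsync -avz --delete ./ deploy@staging.example.com:/var/www/drupal/
    # post-deploy steps run on the server itself over SSH
    - ssh deploy@staging.example.com 'cd /var/www/drupal && sass sass/style.scss css/style.css && drush updatedb -y && drush cache:rebuild'
  only:
    - staging
```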
Other questions to help me understand the mechanics of all of this a bit better:
Before pushing or using rsync to deploy, does the runner fetch the GitLab repo locally, or does it copy the files straight from GitLab to the target environment?
Is the Docker executor mostly used for running automated tests in the CI portion of the DevOps workflow? Again, will the runner start a Docker container, pull the files from GitLab into that container, and then deploy the files to the target environment from within the container?
In both scenarios, when pushing the code to the target environment, how can I ensure that files have the correct owner/group/permissions, especially if this is done from within a Docker container that doesn't have the same users and groups as what I have on staging and prod?
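The closest thing I have found is telling rsync what ownership and mode the files should end up with on the receiving side, e.g. (assuming rsync >= 3.1 on both ends, a receiving user privileged enough to change ownership, and placeholder user/group/host values):

```shell
# sketch: force ownership and typical Drupal modes on the target;
# --chown needs rsync >= 3.1 and sufficient privileges on the receiving end
rsync -avz --chown=www-data:www-data --chmod=D2775,F0664 ./ deploy@prod.example.com:/var/www/drupal/
```

But I do not know whether this is the recommended way to handle it.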
Sorry for the long post, any help/advice would really be appreciated!