2 votes

I am trying to use gsutil to copy a file from GCS into a Cloud Run container during the Docker build step.

The steps I have tried:

RUN pip install gsutil
RUN gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts

The error:

ServiceException: 401 Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
CommandException: 1 file/object could not be transferred.
The command '/bin/sh -c gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts' returned a non-zero code: 1
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: step exited with non-zero status: 1

The service accounts (the default Compute Engine and Cloud Build service accounts) do have access to GCS, and I have also tried gsutil config -a and various other flags, with no success!

I am not sure exactly how I should authenticate to successfully access the bucket.

In your previous question, you talked about GitHub Actions. Is that still the case? If not, where are you building your container? – guillaume blaquiere
Yes, this needs to work locally and on GitHub Actions. – dendog

2 Answers

5 votes

Here is my GitHub Actions job:

jobs:
  build:
    name: Build image
    runs-on: ubuntu-latest

    env:
      BRANCH: ${{ github.ref_name }}
      SERVICE_NAME: ${{ secrets.SERVICE_NAME }}
      PROJECT_ID: ${{ secrets.PROJECT_ID }}

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      # Setup gcloud CLI
      - uses: google-github-actions/setup-gcloud@v0
        with:
          service_account_key: ${{ secrets.SERVICE_ACCOUNT_KEY }}
          project_id: ${{ secrets.PROJECT_ID }}
          export_default_credentials: true

      # Download the file locally
      - name: Get_file
        run: |-
          gsutil cp gs://BUCKET_NAME/path/to/file .


      # Build docker image
      - name: Image_build
        run: |-
          docker build -t gcr.io/$PROJECT_ID/$SERVICE_NAME .

      # Configure docker to use the gcloud command-line tool as a credential helper
      - run: |
          gcloud auth configure-docker -q

      # Push image to Google Container Registry
      - name: Image_push
        run: |-
          docker push gcr.io/$PROJECT_ID/$SERVICE_NAME

You have to set 3 secrets:

  • SERVICE_ACCOUNT_KEY: the JSON content of your service account key file
  • SERVICE_NAME: the name of your container image
  • PROJECT_ID: the project to deploy your image to
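
If you manage repository secrets from the command line, they can be created with the GitHub CLI. A minimal sketch, assuming your key file is my_key.json and the values shown are placeholders:

gh secret set SERVICE_ACCOUNT_KEY < my_key.json
gh secret set SERVICE_NAME --body "my-service"
gh secret set PROJECT_ID --body "my-project"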

Because you download the file before the build, it is present in the Docker build context. Then, simply COPY it in the Dockerfile and do what you want with it.
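
For example, a minimal Dockerfile sketch for that flow, assuming the Get_file step downloaded a file simply named file into the repository root (the base image and paths are placeholders):

# Base image is an assumption; use whatever your service needs
FROM python:3.10-slim
ENV APP_HOME /app
WORKDIR $APP_HOME
# The file fetched by the Get_file step is already in the build context,
# so a plain COPY is enough; no gsutil or credentials are needed here
COPY file $APP_HOME/artefacts/file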


UPDATE

If you want to do this inside the Docker build itself, you can achieve it with a multi-stage build:

Dockerfile

FROM google/cloud-sdk:alpine as gcloud
WORKDIR /app
ARG KEY_FILE_CONTENT
RUN echo "$KEY_FILE_CONTENT" | gcloud auth activate-service-account --key-file=- \
  && gsutil cp gs://BUCKET_NAME/path/to/file .

....
FROM <FINAL LAYER>
COPY --from=gcloud /app/<myFile> .
....

The Docker build command

docker build --build-arg KEY_FILE_CONTENT="YOUR_KEY_FILE_CONTENT" \
  -t gcr.io/$PROJECT_ID/$SERVICE_NAME .

YOUR_KEY_FILE_CONTENT depends on your environment. Here are some ways to inject it:

  • On Github Action: ${{ secrets.SERVICE_ACCOUNT_KEY }}
  • On your local environment: $(cat my_key.json)
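
For instance, on GitHub Actions the Image_build step from the workflow above would become something like this (note the quotes around the expression, so the JSON key is passed as a single argument):

      - name: Image_build
        run: |-
          docker build \
            --build-arg KEY_FILE_CONTENT="${{ secrets.SERVICE_ACCOUNT_KEY }}" \
            -t gcr.io/$PROJECT_ID/$SERVICE_NAME .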
2 votes

I see you tagged Cloud Build.

You can use a gsutil step like this in your cloudbuild.yaml:

steps:
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'gs://mybucket/results.zip', 'previous_results.zip']
# operations that use previous_results.zip and produce new_results.zip
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'new_results.zip', 'gs://mybucket/results.zip']
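
To tie this back to the question: a sketch of a complete cloudbuild.yaml that fetches the file first and then builds the image, so the file is already in the build context when the Dockerfile COPYs it (the bucket, path, and image name are placeholders; $PROJECT_ID is a built-in Cloud Build substitution):

steps:
# Fetch the artefact into the workspace using Cloud Build's own service account
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'gs://BUCKET_NAME/path/to/file', '.']
# Build the image; the file is now in the context, so no gsutil call is needed inside the Dockerfile
- name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/IMAGE_NAME', '.']
images:
- 'gcr.io/$PROJECT_ID/IMAGE_NAME'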