0 votes

I'm trying to create a Docker container that will execute a BigQuery query. I started with the Google-provided image that already includes gcloud, and I added a bash script that runs my query. I'm passing my service account key in as a mounted file.

Dockerfile

FROM gcr.io/google.com/cloudsdktool/cloud-sdk:latest
COPY main.sh main.sh

main.sh

#!/usr/bin/env bash
gcloud auth activate-service-account [email protected] --key-file=/etc/secrets/service_account_key.json
bq query --use_legacy_sql=false

The gcloud command authenticates successfully but can't save to /.config/gcloud, reporting that the location is read-only. I've tried modifying that folder's permissions during the build but am struggling to get it right.

Is this the right approach, or is there a better way? If this is the right approach, how can I ensure gcloud can write to the necessary folder?


1 Answer

1 vote

See the example at the bottom of the Usage section of the cloud-sdk image's documentation.

You ought to be able to combine this into a single docker run command:

KEY="service_account_key.json"

# gcloud reads its properties from ${CLOUDSDK_CONFIG}/configurations/config_default,
# so write the override there rather than to a bare file
mkdir -p ${PWD}/config/configurations
echo "
[auth]
credential_file_override = /certs/${KEY}
" > ${PWD}/config/configurations/config_default

docker run \
--detach \
--env=CLOUDSDK_CONFIG=/config \
--volume=${PWD}/config:/config \
--volume=/etc/secrets/${KEY}:/certs/${KEY} \
gcr.io/google.com/cloudsdktool/cloud-sdk:latest \
  bq query \
  --use_legacy_sql=false

Where:

  • --env sets the container's CLOUDSDK_CONFIG to /config; this works together with the first --volume flag, which maps the config we created in ${PWD} to the container's /config.
  • The second --volume flag maps the host's /etc/secrets/${KEY} (per your question) to the container's /certs/${KEY}. Change as you wish.
  • Suitably configured (🤞), you can then run bq.
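Before wiring this into docker run, you can sanity-check that the generated config has the shape gcloud expects on the host (a sketch; the key name and the temp directory are assumptions, and gcloud looks for properties in ${CLOUDSDK_CONFIG}/configurations/config_default):

```shell
#!/usr/bin/env bash
set -eu

KEY="service_account_key.json"
CONFIG_DIR="$(mktemp -d)"

# gcloud resolves properties from ${CLOUDSDK_CONFIG}/configurations/config_default
mkdir -p "${CONFIG_DIR}/configurations"
cat > "${CONFIG_DIR}/configurations/config_default" <<EOF
[auth]
credential_file_override = /certs/${KEY}
EOF

# Verify the file exists and contains the override before mounting it
grep -q '^\[auth\]' "${CONFIG_DIR}/configurations/config_default" && echo "config OK"
```

You would then pass `--env=CLOUDSDK_CONFIG=/config --volume=${CONFIG_DIR}:/config` to docker run.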

I've not tried this but that should work :-)
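Alternatively, if you'd rather keep the activate-service-account flow from your main.sh, you can point gcloud at a writable path via the CLOUDSDK_CONFIG environment variable in the Dockerfile, which sidesteps the read-only /.config/gcloud entirely (a sketch; /tmp/gcloud is an assumed location that gcloud will create on first use):

```dockerfile
FROM gcr.io/google.com/cloudsdktool/cloud-sdk:latest
# Give gcloud a writable home for its config and cached credentials
ENV CLOUDSDK_CONFIG=/tmp/gcloud
COPY main.sh main.sh
ENTRYPOINT ["bash", "main.sh"]
```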