I have a Dataflow job written in Apache Beam. It looks similar to this template, except that it saves data from JDBC to Cloud Storage:
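In rough outline the pipeline does something like this (a simplified sketch with placeholder driver, connection string, query and bucket names, not my actual template code):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class JdbcToGcs {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    pipeline
        .apply("ReadFromJdbc", JdbcIO.<String>read()
            .withDataSourceConfiguration(
                JdbcIO.DataSourceConfiguration.create(
                        "org.postgresql.Driver", "jdbc:postgresql://HOST:5432/DB")
                    // These credentials are what shows up in plain text in the
                    // Dataflow UI when they are passed as ordinary template parameters.
                    .withUsername("USERNAME")
                    .withPassword("PASSWORD"))
            .withQuery("SELECT * FROM my_table")
            .withRowMapper((JdbcIO.RowMapper<String>) rs -> rs.getString(1))
            .withCoder(StringUtf8Coder.of()))
        .apply("WriteToGcs", TextIO.write().to("gs://BUCKET/output"));

    pipeline.run();
  }
}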
My problem was that everybody could see the database credentials in the Dataflow UI. So I found an article
where the community shows how to encrypt this data. I did everything exactly as described in the article, but my Dataflow job fails to decrypt the credentials with the given KMS key (when I run it using a Cloud Function).
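For context, the decryption inside the template is essentially a single KMS decrypt call, roughly like this (a simplified sketch using the Cloud KMS Java client; the helper name and the Base64 handling are placeholders, not the exact code from the article):

import com.google.cloud.kms.v1.DecryptResponse;
import com.google.cloud.kms.v1.KeyManagementServiceClient;
import com.google.protobuf.ByteString;
import java.io.IOException;
import java.util.Base64;

public class KmsDecryptHelper {

  // kmsKey has the form projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY_NAME,
  // i.e. the value passed in the KMSEncryptionKey parameter.
  static String decryptWithKms(String base64Ciphertext, String kmsKey) throws IOException {
    byte[] ciphertext = Base64.getDecoder().decode(base64Ciphertext);
    try (KeyManagementServiceClient client = KeyManagementServiceClient.create()) {
      // This is the call that fails for me with PERMISSION_DENIED: it needs
      // cloudkms.cryptoKeyVersions.useToDecrypt on the key for the account running the job.
      DecryptResponse response = client.decrypt(kmsKey, ByteString.copyFrom(ciphertext));
      return response.getPlaintext().toStringUtf8();
    }
  }
}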
So I tried running it from Cloud Shell instead:
gcloud dataflow jobs run JOB_NAME \
--region=us-west1 \
--gcs-location=TEMPLATE_LOCATION \
--dataflow-kms-key=projects/PROJECT_ID/locations/us-west1/keyRings/KEY_RING/cryptoKeys/KEY_NAME \
--parameters=...,KMSEncryptionKey=projects/PROJECT_ID/locations/us-west1/keyRings/KEY_RING/cryptoKeys/KEY_NAME,...
But I get this error:
Error message from worker: java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: com.google.api.gax.rpc.PermissionDeniedException: io.grpc.StatusRuntimeException: PERMISSION_DENIED: Permission 'cloudkms.cryptoKeyVersions.useToDecrypt' denied on resource 'projects/PROJECT_ID/locations/us-west1/keyRings/KEY_RING/cryptoKeys/KEY_NAME' (or it may not exist).
I am completely stuck. Has anyone run into the same problem and could help?