I have created a DAG that exports MySQL data to a Google Cloud Storage bucket.
When it runs, I get the following error saying that the account Airflow uses does not have access to create the .json file in the bucket:
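For context, the export task looks roughly like the following. This is a minimal sketch, not my exact code: the DAG id, task id, SQL, and connection ids are placeholders; only the bucket name and `phones_schema.json` come from the error below.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.mysql_to_gcs import (
    MySqlToGoogleCloudStorageOperator,
)

dag = DAG(
    dag_id='mysql_to_gcs_export',       # placeholder name
    start_date=datetime(2018, 1, 1),
    schedule_interval=None,
)

# Dumps the query results to the bucket as JSON and also writes a
# BigQuery-style schema file (phones_schema.json) alongside the data.
export_phones = MySqlToGoogleCloudStorageOperator(
    task_id='export_phones',            # placeholder name
    sql='SELECT * FROM phones',         # placeholder query
    bucket='podio-reader-storage',
    filename='phones.json',
    schema_filename='phones_schema.json',
    mysql_conn_id='mysql_default',
    google_cloud_storage_conn_id='google_cloud_default',
    dag=dag,
)
```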
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery_cache/__init__.py", line 41, in autodetect
    from . import file_cache
  File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery_cache/file_cache.py", line 41, in <module>
    'file_cache is unavailable when using oauth2client >= 4.0.0')
ImportError: file_cache is unavailable when using oauth2client >= 4.0.0
[2018-01-10 16:18:52,584] {discovery.py:274} INFO - URL being requested: GET https://www.googleapis.com/discovery/v1/apis/storage/v1/rest
[2018-01-10 16:18:52,585] {transport.py:157} INFO - Attempting refresh to obtain initial access_token
[2018-01-10 16:18:52,661] {client.py:777} INFO - Refreshing access_token
[2018-01-10 16:18:53,698] {_helpers.py:132} WARNING - __init__() takes at most 2 positional arguments (3 given)
[2018-01-10 16:18:53,712] {discovery.py:872} INFO - URL being requested: POST https://www.googleapis.com/upload/storage/v1/b/podio-reader-storage/o?uploadType=media&alt=json&name=phones_schema.json
[2018-01-10 16:18:54,936] {http.py:120} WARNING - Encountered 403 Forbidden with reason "forbidden"
[2018-01-10 16:18:54,936] {models.py:1417} ERROR - https://www.googleapis.com/upload/storage/v1/b/podio-reader-storage/o?uploadType=media&alt=json&name=phones_schema.json returned "[email protected] does not have storage.objects.create access to podio-reader-storage/phones_schema.json."
My Airflow connection to Google Cloud Platform is: [screenshot of the connection settings]
Does anyone know how to get around this?
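Based on the error text, the service account behind the connection lacks the `storage.objects.create` permission on the bucket. If that is the cause, I assume granting it a role that carries that permission (e.g. `roles/storage.objectCreator`) would look something like this; the service-account address is a placeholder, since the real one is redacted above.

```shell
# Hypothetical fix sketch: grant the Airflow service account permission
# to create objects in the target bucket. SA_NAME and PROJECT_ID are
# placeholders for the redacted account in the error message.
gsutil iam ch \
  serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com:roles/storage.objectCreator \
  gs://podio-reader-storage
```

Is this the right direction, or is the problem in the Airflow connection configuration itself?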