UPDATE:
I've just discovered (this may be a recent addition) that you can access the bucket through an environment variable that Composer defines automatically:
COMPOSER_BUCKET = os.environ["GCS_BUCKET"]
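For reference, a minimal sketch of how that variable could be read inside a DAG file (the DAG id and the fallback bucket name are made-up examples; this assumes the GCS_BUCKET variable mentioned above is present in the Composer workers):

```python
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Composer is expected to set GCS_BUCKET automatically (see the note above);
# the fallback value is only for running this file outside of Composer.
COMPOSER_BUCKET = os.environ.get("GCS_BUCKET", "some-local-test-bucket")


def print_bucket():
    # Log the bucket so you can verify the variable is actually there.
    print("Composer bucket: {}".format(COMPOSER_BUCKET))


with DAG("print_composer_bucket",
         start_date=datetime(2019, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(task_id="print_bucket", python_callable=print_bucket)
```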
ORIGINAL:
I'm not 100% sure whether you want to do this dynamically (i.e., the same DAG would work in other Composer environments without any modification). Either way, this is what I thought of:
(Not dynamically) You can check which bucket Composer uses by clicking on the environment; it should appear under "DAGs folder" (that is actually the folder where the DAGs live, so just drop the trailing /dags).
(Dynamically) Since what you want is to copy files from Composer to GCS, you could use the FileToGoogleCloudStorageOperator with a file path that is mapped to the Composer bucket. Note that the workers' local storage and the Composer bucket are mapped to each other, so accessing the path /home/airflow/gcs/data/file1 is "the same" as accessing gs://<bucket>/data/file1. See the sketch right below.
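A minimal sketch of that operator, assuming the Airflow 1.x contrib module bundled with Composer; the DAG id, destination bucket and file names are made-up examples:

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_gcs import FileToGoogleCloudStorageOperator

with DAG("copy_data_file_to_gcs",
         start_date=datetime(2019, 1, 1),
         schedule_interval=None) as dag:
    copy_file = FileToGoogleCloudStorageOperator(
        task_id="copy_file",
        # Local path on the workers; /home/airflow/gcs/data is mapped to
        # gs://<composer-bucket>/data, so no bucket name is needed here.
        src="/home/airflow/gcs/data/file1",
        dst="data/file1",
        bucket="my-destination-bucket",  # bucket you want the file copied to
    )
```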
(Semi-dynamically) You can use the Composer API to get the environment details and parse the bucket out of them. Of course, you will need to know the environment's name, location and project beforehand; a sketch follows.
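For example, a sketch using the google-api-python-client discovery client (the project, location and environment names are placeholders); the bucket is parsed out of the dagGcsPrefix field of the response:

```python
from googleapiclient.discovery import build

# Placeholders: replace with your own project, location and environment name.
PROJECT = "my-project"
LOCATION = "us-central1"
ENVIRONMENT = "my-composer-env"

# Uses Application Default Credentials to call the Cloud Composer API.
service = build("composer", "v1")
name = "projects/{}/locations/{}/environments/{}".format(
    PROJECT, LOCATION, ENVIRONMENT)
env = service.projects().locations().environments().get(name=name).execute()

# dagGcsPrefix looks like gs://<bucket>/dags; strip the scheme and the
# /dags suffix to obtain the bucket name.
dag_gcs_prefix = env["config"]["dagGcsPrefix"]
bucket = dag_gcs_prefix[len("gs://"):].split("/")[0]
print(bucket)
```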
Out of these three, I'd say the one that uses the FileToGoogleCloudStorageOperator is the cleanest and easiest.