I am currently running my Dataproc cluster in a Europe region, and I run my Spark application on that same cluster. When the job writes to a bucket through the Google Cloud Storage connector in Spark, buckets that do not exist yet are created automatically with the Multi-Regional storage class and with "Multiple Regions in US" as the location.
I am writing the output with something like:

dataframe.write.mode(...).save("gs://location")

This creates a new bucket at that location with the properties mentioned above.
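For context, a minimal self-contained version of what I am doing looks roughly like this (the bucket name, format, and mode are just placeholders, not my real values):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gcs-write-example").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# The target bucket does not exist yet; the connector creates it
# automatically, and it ends up Multi-Regional in the US.
df.write.mode("overwrite").parquet("gs://some-nonexistent-bucket/output")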
I tried to find a configuration property in the connector to set the storage class and location of the bucket it creates, but with no success. How can we resolve this?
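The only workaround I can think of is to create the bucket myself before the Spark job runs, with the storage class and location I actually want, for example with the google-cloud-storage Python client (the bucket name and region below are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("some-output-bucket")            # placeholder bucket name
bucket.storage_class = "REGIONAL"                       # class I want instead of Multi-Regional
client.create_bucket(bucket, location="europe-west1")   # example Europe region

But I would prefer to configure this in the connector (or via a Spark/Dataproc property), if such a setting exists.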