When you start a Hive session in Dataproc you can add jars that live in a GCS bucket:

add jar gs://my-bucket/serde.jar;
I don't want to have to add all the jars I need each time I start a Hive session, so I tried adding the jar paths to hive-site.xml via the hive.aux.jars.path property:
<property>
<name>hive.aux.jars.path</name>
<value>gs://my-bucket/serde.jar</value>
</property>
Then, when I try to start a Hive session, I get this error:

Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: file://gs, expected: file:///
Is there a way to add custom jars that live in a GCS bucket to the Hive classpath, or do I have to copy the jars out of my bucket and update hive.aux.jars.path each time I create a Dataproc cluster?
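For context, the workaround I'm trying to avoid would look roughly like this: a Dataproc initialization action that copies the jar onto each node at cluster creation, so hive.aux.jars.path can point at a local path instead of gs://. This is only a sketch; the bucket, jar, and directory names are placeholders.

```shell
#!/bin/bash
# Hypothetical initialization action (runs on every node at cluster creation).
# Bucket, jar, and target directory are placeholders, not my real setup.
set -euo pipefail

# Copy the serde jar from GCS onto the local filesystem of the node.
mkdir -p /usr/lib/hive/auxlib
gsutil cp gs://my-bucket/serde.jar /usr/lib/hive/auxlib/

# hive.aux.jars.path in hive-site.xml would then point at the local copy:
#   <value>file:///usr/lib/hive/auxlib/serde.jar</value>
```

The cluster would then be created with something like gcloud dataproc clusters create my-cluster --initialization-actions gs://my-bucket/init.sh, and that script would have to be maintained alongside every new cluster, which is what I'd like to avoid.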
Edit: even after adding the property below and restarting Hive, I still get the same error.
<property>
<name>hive.exim.uri.scheme.whitelist</name>
<value>hdfs,pfile,gs</value>
<final>false</final>
</property>