What is the correct/recommended way to go about unloading data from Snowflake that is hosted on GCP to an AWS S3 bucket?
Is it the same process as unloading from Snowflake that is already hosted on AWS, to S3, as highlighted here?
Are there any additional security concerns or permissions that need to be granted?
Yes, the linked documentation covers most of the steps required to set up an integration between any Snowflake deployment and your own AWS account for performing an unload operation.
Snowflake supports cross-cloud unloading seamlessly as a standard use case, though data egress charges will apply.
You will need to grant Snowflake a supported mode of write access to the target cloud storage. Access to S3 from GCP occurs over the internet (HTTPS) from GCP's data-center IPs, so if your bucket policy restricts access beyond authentication requirements (for example, by source IP), it must be adjusted to allow that traffic.
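As a sketch, write access is typically granted through a storage integration tied to an IAM role in your AWS account; the integration name, role ARN, and bucket path below are placeholders, not values from your environment:

```sql
-- Hypothetical example: a storage integration that lets Snowflake
-- (even when hosted on GCP) assume an IAM role in your AWS account
-- to write to a specific S3 location.
CREATE STORAGE INTEGRATION s3_unload_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-unload-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-unload-bucket/exports/');

-- DESC INTEGRATION reveals the AWS IAM user and external ID that you
-- then add to the IAM role's trust policy in your AWS account.
DESC INTEGRATION s3_unload_int;
```

The trust-policy handshake (allowing Snowflake's IAM user to assume your role) is the main additional permission step; it is the same flow whether Snowflake runs on AWS or GCP.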
Can you simply connect and transfer data across clouds using JDBC/ODBC drivers for snowflake?
Yes. Statement execution is performed by Snowflake's virtual warehouses (i.e. server-side, not client-side), and COPY INTO <location> is a regular SQL statement that is also supported by its JDBC/ODBC drivers.
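For example, the unload statement below could be submitted through a JDBC/ODBC session like any other query; the bucket, table, and integration names are illustrative:

```sql
-- Runs on the warehouse regardless of which driver submits it;
-- the client never touches the data being transferred to S3.
COPY INTO 's3://my-unload-bucket/exports/orders/'
  FROM my_db.public.orders
  STORAGE_INTEGRATION = s3_unload_int
  FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
  HEADER = TRUE;
```

Because the transfer happens entirely inside Snowflake's service, driver choice only affects how the statement is submitted, not how the data moves between clouds.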