My use case is creating an external table in BigQuery from PySpark code. The data source is a Google Cloud Storage bucket containing JSON files. I am reading the JSON data into a DataFrame and want to create an external BigQuery table over it. As of now, the table is getting created, but it is a native table rather than an external one.
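For context, this is roughly how the DataFrame is built (a minimal sketch; the app name is arbitrary and the bucket path is the same placeholder used below):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("json-to-bigquery").getOrCreate()

# Read the JSON files from the GCS bucket into a DataFrame
df_view = spark.read.json("gs://xxxxxxxxx/")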
df_view.write \
    .format("com.google.cloud.spark.bigquery") \
    .option("table", "xyz-abc-abc:xyz_zone.test_table_yyyy") \
    .option("temporaryGcsBucket", "abcd-xml-abc-warehouse") \
    .mode("append") \
    .save(path="gs://xxxxxxxxx/")
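To make the goal concrete, what I expect to end up with is the equivalent of an external table definition like the sketch below, written with the google-cloud-bigquery Python client (the newline-delimited JSON format and the source URI pattern are my assumptions; the table and bucket names are the same placeholders as above):

from google.cloud import bigquery

client = bigquery.Client()

# External data configuration pointing at the JSON files in GCS
# (assuming newline-delimited JSON; the URI is a placeholder)
external_config = bigquery.ExternalConfig("NEWLINE_DELIMITED_JSON")
external_config.source_uris = ["gs://xxxxxxxxx/*"]
external_config.autodetect = True

# The Python client uses dots (project.dataset.table) rather than a colon
table = bigquery.Table("xyz-abc-abc.xyz_zone.test_table_yyyy")
table.external_data_configuration = external_config
client.create_table(table)

This is only to illustrate the result I am after; I would like the Spark write above to produce an external table like this.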
P.S. - I am using the spark-bigquery connector to achieve my goal.
Please let me know if anyone has faced the same issue and knows how to make the write create an external table.