We have a pipeline that copies tables from BigQuery to Cloud SQL.
Until now, the Cloud SQL tables were created outside Dataflow; we now need to move the table creation into the Dataflow pipeline itself.
The CREATE TABLE statements for the target tables are stored as .sql files in a GCS bucket.
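For context, the table creation currently happens outside Dataflow: the DDL file is downloaded from GCS and run over plain JDBC, roughly like the sketch below (the bucket, object path, class name and JDBC URL are placeholders, not the real values):

// Rough sketch of the current, out-of-Dataflow table creation.
// Bucket name, object path and JDBC URL below are placeholders.
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class CreateTablesManually {
    public static void main(String[] args) throws Exception {
        // Download the CREATE TABLE statement from GCS.
        Storage storage = StorageOptions.getDefaultInstance().getService();
        Blob blob = storage.get("my-bucket", "ddl/target_table.sql");
        String ddl = new String(blob.getContent(), StandardCharsets.UTF_8);

        // Execute the DDL against Cloud SQL over JDBC.
        String jdbcUrl = "jdbc:postgresql://<host>:5432/<database>?user=<user>&password=<password>";
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             Statement stmt = conn.createStatement()) {
            stmt.execute(ddl);
        }
    }
}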
Below is the code snippet that copies a table from BigQuery to Cloud SQL.
p.apply(BigQueryIO.readTableRows()
        .from(source_table)
        .withTemplateCompatibility()
        .withoutValidation())
 .apply(JdbcIO.<TableRow>write()
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create(
                "org.postgresql.Driver",
                base_url))
        .withStatement("INSERT INTO " + target_table.split("\\.")[1] + " VALUES " + insert_query)
        .withPreparedStatementSetter(new StatementSetter(some_map)));
// p.run().waitUntilFinish(); // this would process the tables one after the other, i.e. sequential execution
p.run();
Is there a way I can execute the .sql file using JdbcIO?
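What I have in mind is something like the sketch below: read each .sql file from GCS inside the pipeline and execute the DDL over a plain JDBC connection, since JdbcIO.write() seems to be built around a per-element prepared statement rather than a DDL script. The file pattern is a placeholder and I have not verified this approach:

// Rough sketch (unverified): read the DDL files from GCS and execute them as part of the pipeline.
// Assumes org.apache.beam.sdk.io.FileIO, org.apache.beam.sdk.transforms.MapElements,
// org.apache.beam.sdk.transforms.ParDo, org.apache.beam.sdk.values.TypeDescriptors,
// java.io.IOException and java.sql.* are imported; "gs://my-bucket/ddl/*.sql" is a placeholder.
PCollection<String> ddlStatements =
    p.apply(FileIO.match().filepattern("gs://my-bucket/ddl/*.sql"))
     .apply(FileIO.readMatches())
     .apply(MapElements.into(TypeDescriptors.strings())
         .via((FileIO.ReadableFile file) -> {
             try {
                 // Read the whole .sql file as a single string.
                 return file.readFullyAsUTF8String();
             } catch (IOException e) {
                 throw new RuntimeException(e);
             }
         }));

PCollection<Void> tablesCreated =
    ddlStatements.apply(ParDo.of(new DoFn<String, Void>() {
        @ProcessElement
        public void processElement(@Element String ddl) throws Exception {
            // Execute the CREATE TABLE statement directly over JDBC.
            try (Connection conn = DriverManager.getConnection(base_url);
                 Statement stmt = conn.createStatement()) {
                stmt.execute(ddl);
            }
        }
    }));

// The BigQuery -> Cloud SQL copy would then presumably need Wait.on(tablesCreated)
// so that the inserts only start after the tables exist.

But I would prefer to do this with JdbcIO directly, if it supports running a script like this.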