Is there a way to set the expiration time on a BigQuery table when using Dataflow's BigQueryIO.Write sink?
For example, I'd like something like this (see last line):
PCollection<TableRow> mainResults...
mainResults.apply(BigQueryIO.Write
    .named("my-bq-table")
    .to("PROJECT:dataset.table")
    .withSchema(getBigQueryTableSchema())
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
    .withExpiration(1454198400000L)); // **this table should expire on 31st Jan
I can't see anything in the Dataflow API that would facilitate this. Of course, I could just use the BigQuery API directly, but it would be much better to be able to do this via Dataflow when specifying the sink.
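For reference, the fallback mentioned above could look roughly like the sketch below: after the Dataflow job finishes, patch the table's `expirationTime` through the BigQuery client library. This is only an illustration, not part of the Dataflow sink; the `bigquery` client, the `PROJECT`/`dataset`/`table` identifiers, and the chosen date are placeholders. Note that BigQuery's `expirationTime` is expressed in epoch milliseconds, not seconds.

```java
import java.io.IOException;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.Table;

public class TableExpirationPatch {

  // BigQuery's expirationTime field is epoch *milliseconds*.
  static long expirationMillis(int year, int month, int day) {
    return ZonedDateTime.of(year, month, day, 0, 0, 0, 0, ZoneOffset.UTC)
        .toInstant()
        .toEpochMilli();
  }

  // Patch only the expirationTime of an existing table; 'bigquery' must be an
  // authorized client, and the project/dataset/table names are placeholders.
  static void setExpiration(Bigquery bigquery) throws IOException {
    Table patch = new Table()
        .setExpirationTime(expirationMillis(2016, 1, 31)); // expire 31 Jan 2016 UTC
    bigquery.tables()
        .patch("PROJECT", "dataset", "table", patch)
        .execute();
  }
}
```

Using `tables().patch(...)` rather than `update(...)` leaves the rest of the table metadata (schema, description, etc.) untouched.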