When I write to a partitioned table in BigQuery from Dataflow, I'm getting the following error - could anyone help me with this?
Invalid table ID "test$20181126". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.
This is the Python snippet I'm using for writing:
import apache_beam as beam

class BQWriter(beam.PTransform):
    def __init__(self, table, schema):
        super(BQWriter, self).__init__()
        self.table = table    # fully qualified table spec, e.g. project:dataset.table
        self.schema = schema  # BigQuery table schema

    def expand(self, pcoll):
        # Write the incoming PCollection to BigQuery, creating the table
        # if needed and truncating any existing contents.
        return pcoll | beam.io.Write(beam.io.BigQuerySink(
            self.table,
            schema=self.schema,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE
        ))
I'm creating the table like below:
a | 'BQWrite' >> BQWriter("test-123:test.test$20181126", table_schema)
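For context, here is a minimal sketch of how that call sits in the pipeline. The schema string, sample row, and project/dataset names are placeholders for illustration, not my real values; my actual pipeline builds the PCollection `a` from a real source.

import apache_beam as beam

# Placeholder schema and data, only to show how BQWriter is wired in.
table_schema = 'name:STRING,value:INTEGER'

with beam.Pipeline() as p:
    a = p | 'Create' >> beam.Create([{'name': 'foo', 'value': 1}])
    a | 'BQWrite' >> BQWriter("test-123:test.test$20181126", table_schema)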