I have a Dataflow pipeline written in Python that I am trying to run on GCP. It keeps terminating with:
Workflow failed. Causes: Unknown message code.
The main code in my dataflow pipeline is:
import apache_beam as beam

schema = ('Member_ID:INTEGER,First_Name:STRING,Last_Name:STRING,Gender:STRING,'
          'Age:INTEGER,Height:STRING,Weight:INTEGER,Hours_Sleep:INTEGER,'
          'Calories_Consumed:INTEGER,Calories_Burned:INTEGER,Evt_Date:DATE,'
          'Height_Inches:INTEGER,Min_Sleep_Hours:INTEGER,Max_Sleep_Hours:INTEGER,'
          'Enough_Sleep:BOOL')
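# For reference, my understanding is that the 'name:TYPE,...' shorthand
# above is equivalent to a TableSchema built by hand, roughly like this
# (a sketch only; parse_schema is my own helper, not a Beam API, and the
# pipeline below still uses the plain string):
from apache_beam.io.gcp.internal.clients import bigquery

def parse_schema(schema_str):
    table_schema = bigquery.TableSchema()
    for spec in schema_str.split(','):
        field = bigquery.TableFieldSchema()
        field.name, field.type = spec.split(':')
        field.mode = 'NULLABLE'  # the shorthand defaults every field to NULLABLE
        table_schema.fields.append(field)
    return table_schema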
# read, transform and load the source data
p = beam.Pipeline(options=options)
# Read from PubSub into a PCollection.
events = (p
          | 'Read PubSub' >> beam.io.ReadFromPubSub(topic='projects/prefab-envoy-220213/topics/health_event')
          | 'Parse CSV' >> beam.ParDo(getCSVFields())
          | 'Convert Types' >> beam.ParDo(ConvDataTypes())
          | 'Convert Height' >> beam.ParDo(ConvHeight())
          | 'Join CDC Sleep' >> beam.ParDo(CDCSleepJoin(), cdcsleep)
          | 'Create Row' >> beam.ParDo(CreateRow())
          | 'Write to BQ' >> beam.io.Write(beam.io.BigQuerySink(
              'prefab-envoy-220213:nhcdata.nhcevents', schema=schema,
              write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
              create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
          )
results = p.run()
results.wait_until_finish()
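
For context, options is constructed roughly like this (a sketch from memory; the bucket paths are placeholders for my real values):

from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions(
    runner='DataflowRunner',
    project='prefab-envoy-220213',
    temp_location='gs://my-bucket/tmp',         # placeholder bucket
    staging_location='gs://my-bucket/staging')  # placeholder bucket
options.view_as(StandardOptions).streaming = True  # Pub/Sub reads need a streaming job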
If I remove the 'Write to BQ' step, i.e.

| 'Write to BQ' >> beam.io.Write(beam.io.BigQuerySink(
    'prefab-envoy-220213:nhcdata.nhcevents', schema=schema,
    write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

then the dataflow starts OK, so the failure seems tied to the BigQuery write.
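
Since ReadFromPubSub makes this a streaming pipeline, my suspicion is that the native BigQuerySink (batch-only, as far as I know) is what Dataflow is rejecting. A sketch of the same step using the newer beam.io.WriteToBigQuery transform instead (untested; WRITE_APPEND because I don't believe WRITE_TRUNCATE is supported for streaming inserts):

| 'Write to BQ' >> beam.io.WriteToBigQuery(
    'prefab-envoy-220213:nhcdata.nhcevents', schema=schema,
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)

Is that the actual cause of the "Unknown message code" failure, or is something else going on?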