I am reading some customer records from a lookup and writing them into a BigQuery table. Then, from that same table, I read the required data fields and try to push that data (JSON) as messages to Pub/Sub, using a Dataflow pipeline in batch mode. But I am getting this error: "ValueError: Cloud Pub/Sub is currently available for use only in streaming pipelines".
delete_rows = p | 'reading data to be deleted' >> beam.io.Read(
    beam.io.BigQuerySource(
        query=delete_query,
        use_standard_sql=True))

required_data = delete_rows | 'Retrieving only required data' >> beam.ParDo(RequiredData())

push_to_pubsub = required_data | 'Pushing data to pubsub' >> beam.io.WriteToPubSub(
    topic='my topic name',
    with_attributes=False,
    id_label=None,
    timestamp_attribute=None)
I would like to use Pub/Sub in the batch mode of a Dataflow pipeline. Is there a way to do this, or a workaround?
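One workaround I am considering, since beam.io.WriteToPubSub only works in streaming pipelines, is to publish directly with the google-cloud-pubsub client library inside a ParDo. Below is a minimal sketch of that idea; PublishToPubSub, 'my-project-id', and 'my-topic-name' are placeholders of mine, and it assumes the google-cloud-pubsub package is installed on the workers and that each element is a JSON string:

import apache_beam as beam
from google.cloud import pubsub_v1

class PublishToPubSub(beam.DoFn):
    """Publishes each element to Pub/Sub via the client library."""

    def __init__(self, project_id, topic_name):
        self.project_id = project_id
        self.topic_name = topic_name
        self.publisher = None
        self.topic_path = None

    def start_bundle(self):
        # The client is not pickleable, so create it per bundle on the
        # worker rather than in __init__ (which Beam pickles).
        self.publisher = pubsub_v1.PublisherClient()
        self.topic_path = self.publisher.topic_path(
            self.project_id, self.topic_name)

    def process(self, element):
        # Pub/Sub message payloads must be bytes; the element is assumed
        # to be a JSON string here.
        future = self.publisher.publish(self.topic_path, element.encode('utf-8'))
        future.result()  # block until the publish completes; raises on failure

push_to_pubsub = required_data | 'Publishing via client library' >> beam.ParDo(
    PublishToPubSub('my-project-id', 'my-topic-name'))

Would this kind of approach be acceptable in a batch pipeline, or is there a supported way to use beam.io.WriteToPubSub in batch mode?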