0 votes

In a streaming Dataflow pipeline, how can I dynamically change the bucket or the prefix of the data I write to cloud storage?

For example, I would like to write data as text or Avro files on GCS, but with a prefix that includes the processing hour, along the lines of the sketch below.
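For concreteness, here is what I can express today: a fixed output prefix, baked in at pipeline construction time (the bucket and prefix names are made up):

```java
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.TextIO;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.transforms.Create;

public class FixedPrefixWrite {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of("event-1", "event-2"))
     // The output prefix is fixed when the pipeline is built; I see no way
     // to fold the processing hour into it at runtime.
     .apply(TextIO.Write.to("gs://my-bucket/output/part"));  // bucket name made up

    p.run();
  }
}
```

What I am after is the equivalent where the `gs://` path varies with the processing hour.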

Update: The question is moot, since there is simply no sink in streaming Dataflow that writes to Google Cloud Storage.

Streaming Dataflow doesn't have a built-in GCS sink; what are you currently using to do the write? – danielm
You are right, there is no way of doing a streaming GCS write. – Erik Forsberg

1 Answer

1 vote

Google Cloud Dataflow currently does not allow GCS sinks in streaming mode.
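Lacking a built-in sink, one workaround is to perform the writes yourself inside a DoFn. Below is a minimal sketch of that idea, assuming the google-cloud-storage client library; the bucket and object prefix are placeholders, and a real implementation would batch elements per bundle and handle retries rather than writing one object per element:

```java
import java.nio.charset.StandardCharsets;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.UUID;

import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

// Sketch of a hand-rolled "sink": each element becomes one GCS object under a
// prefix derived from the current processing hour. Names are illustrative.
public class HourlyGcsWriteFn extends DoFn<String, Void> {
  private transient Storage storage;

  @Override
  public void startBundle(Context c) {
    // The client is not serializable, so create it per bundle on the worker.
    storage = StorageOptions.getDefaultInstance().getService();
  }

  @Override
  public void processElement(ProcessContext c) {
    String hour = new SimpleDateFormat("yyyy/MM/dd/HH").format(new Date());
    String object = "events/" + hour + "/" + UUID.randomUUID() + ".txt";
    storage.create(
        BlobInfo.newBuilder("my-bucket", object).build(),
        c.element().getBytes(StandardCharsets.UTF_8));
  }
}
```

Usage would be something like `events.apply(ParDo.of(new HourlyGcsWriteFn()))`. Note this gives you none of the atomicity or file-naming guarantees of a proper sink; it is only a stopgap until streaming GCS writes are supported.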