I have a Dataflow job which I am trying to 'drain'. The explanation of the drain option says that
Dataflow will cease all data ingestion, but will attempt to finish processing any remaining buffered data. Pipeline resources will be maintained until buffered data has finished processing and any pending output has finished writing.
But data ingestion does not seem to stop: the "Elements added" count is still increasing, and the job hasn't stopped for over an hour now. Is this expected behavior? I am using a Pub/Sub source, if that helps.
EDIT:
Here is the job ID - 2017-10-30_19_59_30-14251132252018661885
"Elements added" keeps on increasing in the very first PubsubIO.Read step even after I press drain. That step does not contain any code I wrote; it's a simple PubsubIO.readStrings().fromSubscription() call. Thanks! - Kakaji