I have an issue regarding Google Dataflow.
I'm writing a Dataflow pipeline that reads data from Pub/Sub and writes to BigQuery, and it works.
Now I have to handle late data. I was following some examples on the internet, but it is not working properly. Here is my code:
pipeline.apply(PubsubIO.readStrings()
        // Use the "timestamp" message attribute as the element's event time.
        .withTimestampAttribute("timestamp")
        .fromSubscription(Constants.SUBSCRIBER))
    .apply(ParDo.of(new ParseEventFn()))
    .apply(Window.<Entity>into(FixedWindows.of(WINDOW_SIZE))
        // Processing of late data.
        .triggering(AfterWatermark.pastEndOfWindow()
            // Speculative (early) results while the window is still open.
            .withEarlyFirings(AfterProcessingTime
                .pastFirstElementInPane()
                .plusDelayOf(DELAY_SIZE))
            // Fire again for each late element within the allowed lateness.
            .withLateFirings(AfterPane.elementCountAtLeast(1)))
        .withAllowedLateness(ALLOW_LATE_SIZE)
        .accumulatingFiredPanes())
    .apply(ParDo.of(new ParseTableRow()))
    .apply("Write to BQ", BigQueryIO.<TableRow>write()...
Here is my Pub/Sub message:
{
...,
"timestamp" : "2015-08-31T09:52:25.005Z"
}
When I manually push some messages (by going to the Pub/Sub topic in the console and publishing there) with a timestamp far older than ALLOW_LATE_SIZE allows, these messages are still passed through instead of being dropped.
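For completeness, publishing such a test message programmatically instead of through the console would look roughly like this (a sketch using the google-cloud-pubsub Java client; the project, topic, and payload are placeholders):

import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class PublishLateMessage {
  public static void main(String[] args) throws Exception {
    TopicName topic = TopicName.of("my-project", "my-topic"); // placeholders
    Publisher publisher = Publisher.newBuilder(topic).build();
    try {
      PubsubMessage message = PubsubMessage.newBuilder()
          .setData(ByteString.copyFromUtf8("{\"timestamp\" : \"2015-08-31T09:52:25.005Z\"}"))
          // withTimestampAttribute("timestamp") reads this attribute, not the JSON body.
          .putAttributes("timestamp", "2015-08-31T09:52:25.005Z")
          .build();
      publisher.publish(message).get(); // block until the publish completes
    } finally {
      publisher.shutdown();
    }
  }
}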