
I want to set the checkpoint interval for my Python Spark Streaming script. Based on the official documentation:

For stateful transformations that require RDD checkpointing, the default interval is a multiple of the batch interval that is at least 10 seconds. It can be set by using dstream.checkpoint(checkpointInterval). Typically, a checkpoint interval of 5 - 10 sliding intervals of a DStream is a good setting to try.

My script:

import sys

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

def functionToCreateContext():
    sc = SparkContext(appName="PythonStreamingDirectKafkaWordCount")
    ssc = StreamingContext(sc, 6)
    ssc.checkpoint("./checkpoint")
    kvs = KafkaUtils.createDirectStream(ssc, ['test123'], {"metadata.broker.list": "localhost:9092"})

    kvs = kvs.checkpoint(60)  # set the checkpoint interval

    lines = kvs.map(lambda x: x[1])
    counts = lines.flatMap(lambda line: line.split(" ")) \
        .map(lambda word: (word, 1)) \
        .reduceByKey(lambda a, b: a+b)
    counts.pprint()
    return ssc

if __name__ == "__main__":
    ssc = StreamingContext.getOrCreate("./checkpoint", functionToCreateContext)

    ssc.start()
    ssc.awaitTermination()

The output after running the script:

16/05/25 17:49:03 INFO DirectKafkaInputDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Remember duration = 120000 ms
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.DirectKafkaInputDStream@1be80174
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, true, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = 60000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 120000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@69f9a089
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@d97386a
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@16c474ad
16/05/25 17:49:03 INFO ForEachDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO ForEachDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO ForEachDStream: Remember duration = 6000 ms
..........

The checkpoint interval of the DirectKafkaInputDStream is still null. Any ideas why?


1 Answer


Try moving this line a few lines down, to after you've created the stream: ssc.checkpoint("./checkpoint")

Basically, set the checkpoint directory only after you've fully prepared your stream.
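For reference, a minimal sketch of that reordering, reusing the code from the question (same topic, broker address, batch interval, and 60-second checkpoint interval, so treat those values as the question's own assumptions rather than recommendations):

import sys

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

def functionToCreateContext():
    sc = SparkContext(appName="PythonStreamingDirectKafkaWordCount")
    ssc = StreamingContext(sc, 6)

    # Build the stream and the whole processing lineage first.
    kvs = KafkaUtils.createDirectStream(ssc, ['test123'], {"metadata.broker.list": "localhost:9092"})
    kvs.checkpoint(60)  # per-DStream checkpoint interval

    lines = kvs.map(lambda x: x[1])
    counts = lines.flatMap(lambda line: line.split(" ")) \
        .map(lambda word: (word, 1)) \
        .reduceByKey(lambda a, b: a + b)
    counts.pprint()

    # Set the checkpoint directory only once the stream is fully prepared.
    ssc.checkpoint("./checkpoint")
    return ssc

if __name__ == "__main__":
    ssc = StreamingContext.getOrCreate("./checkpoint", functionToCreateContext)
    ssc.start()
    ssc.awaitTermination()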