I keep seeing Apache Spark schedule a series of stages, each with a fixed 200 tasks. Since this happens across a number of different jobs, I'm guessing it's related to one of Spark's configuration settings. Any suggestion as to what that configuration might be?