Hi all,

I have a streaming application with a batch interval of 10 seconds.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sparkConf = new SparkConf().setAppName("RMQWordCount")
      .set("spark.streaming.stopGracefullyOnShutdown", "true")
    val ssc = new StreamingContext(sparkConf, Seconds(10))

I also use the reduceByKeyAndWindow() API for aggregation over a window
interval of 5 minutes, roughly as sketched below.
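
For reference, the windowed count is wired up roughly like this; the
socket source and the word splitting here are simplified placeholders
for the actual RabbitMQ receiver and parsing in my app:

    val lines = ssc.socketTextStream("localhost", 9999)  // placeholder source
    val wordCounts = lines
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      // aggregate counts over a 5-minute window (a multiple of the
      // 10-second batch interval)
      .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Seconds(300))
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()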

But when I send a SIGTERM to the streaming process at around the 4th
minute, I don't see the reduceByKeyAndWindow() action taking place, even
though data has already been received for those 4 minutes. I thought a
graceful shutdown would trigger the action on the messages received so far.

Am I missing something?

Thanks and regards
Noorul
