You can add a shutdown hook to your JVM and request that the Spark Streaming context stop gracefully.
/**
 * Shutdown hook to stop the StreamingContext gracefully on JVM shutdown.
 * @param ssCtx the StreamingContext to stop
 */
def addShutdownHook(ssCtx: StreamingContext): Unit = {
  Runtime.getRuntime.addShutdownHook(new Thread() {
    override def run(): Unit = {
      // Finish processing the data already received before shutting down
      ssCtx.stop(stopSparkContext = true, stopGracefully = true)
    }
  })
}
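As an alternative to registering the hook by hand, Spark (1.4 and later) can install an equivalent graceful-shutdown hook for you via configuration. A minimal sketch, with an illustrative app name and batch interval:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// With this setting, Spark itself registers a shutdown hook that calls
// stop(stopGracefully = true) when the JVM receives a termination signal.
val conf = new SparkConf()
  .setAppName("GracefulShutdownExample") // illustrative name
  .set("spark.streaming.stopGracefullyOnShutdown", "true")

val ssc = new StreamingContext(conf, Seconds(10)) // illustrative batch interval
```

Either approach works; the config-based one avoids a custom hook competing with Spark's own shutdown sequence.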
Please make sure that you have enough memory available on the driver node. If
there is not enough free memory on the driver node, then your application won't
start.
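For reference, driver memory is usually raised at submit time with the standard `--driver-memory` flag. A sketch; the class name, jar, and 4g value are illustrative, not from the thread:

```shell
# Give the driver JVM a larger heap; 4g is an example value.
spark-submit \
  --class com.example.MyStreamingApp \
  --master yarn \
  --driver-memory 4g \
  my-streaming-app.jar
```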
Pankaj
From: vaquar khan <vaquar.k...@gmail.com>
Date: Saturday, June 10, 2017 at 5:02 PM
To: Abdulfattah Safa <fatta
Hi,
I have been trying to distribute Kafka topics among different instances of the
same consumer group. I am using the KafkaDirectStream API to create DStreams.
After the second consumer in the group comes up, Kafka does a partition rebalance,
and then the Spark driver of the first consumer dies with the following
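For context, creating such a direct stream with an explicit `group.id` looks roughly like this under the spark-streaming-kafka-0-10 integration; the broker address, group id, and topic name below are placeholders. Note that the direct approach assigns Kafka partitions to Spark executors itself, so sharing one `group.id` across separate streaming applications can conflict with Kafka's own rebalancing, which may be related to the failure described above:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

val ssc = new StreamingContext(
  new SparkConf().setAppName("KafkaDirectExample"), Seconds(10))

// Placeholder connection and group settings
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"          -> "my-consumer-group",
  "auto.offset.reset" -> "latest"
)

// Direct stream: Spark, not Kafka, decides which executor reads which partition
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("my-topic"), kafkaParams)
)
```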